MCP Snowflake Server NSP
A Model Context Protocol (MCP) server that connects AI assistants to Snowflake — enabling SQL queries, schema exploration, and data insights directly from your LLM client.
Highlights:
- Multiple authentication methods: password, key-pair, external browser, OAuth 2.0 (client credentials & bearer token), TOML connection files
- TOML multi-connection config — manage production, staging, and development environments in one file
- Write-safety guard — write operations are disabled by default and must be explicitly enabled
- Exclusion patterns — filter out databases, schemas, or tables from discovery
- --exclude-json-results flag — reduces LLM context window usage
- Selective tool exclusion via --exclude_tools
- Prefetch mode — pre-load table schemas as MCP resources
- Docker support
Quick Start
The fastest way to try it — using uvx with a TOML connection file:
# 1. Create a connections file
cat > ~/snowflake_connections.toml << 'EOF'
[myconn]
account = "your_account"
user = "your_user"
password = "your_password"
warehouse = "COMPUTE_WH"
database = "MY_DB"
schema = "PUBLIC"
role = "MYROLE"
EOF
# 2. Run the server
uvx --python=3.13 --from mcp-snowflake-server-nsp mcp_snowflake_server \
--connections-file ~/snowflake_connections.toml \
--connection-name myconn
Claude Code
Add to your MCP client config (e.g. claude_desktop_config.json) using snowflake_connections.toml:
"mcpServers": {
"snowflake": {
"command": "uvx",
"args": [
"--python=3.13",
"--from", "mcp-snowflake-server-nsp",
"mcp_snowflake_server",
"--connections-file", "/absolute/path/to/snowflake_connections.toml",
"--connection-name", "myconn"
]
}
}
Visual Studio Code (VSCode)
Add to your MCP client config (e.g. .vscode/mcp.json) using a .env file (see Authentication):
"snowflake": {
// Snowflake MCP server
"type": "stdio",
"command": "uvx",
"args": [
"--from", "mcp-snowflake-server-nsp",
"--python=3.13",
"mcp_snowflake_server"
],
"envFile": "${workspaceFolder}/.env"
}
OpenCode
Add to your MCP client config (e.g. opencode.jsonc) with a .env file (see Authentication):
"snowflake": {
"type": "local",
"command": [
"uvx",
"--from",
"mcp-snowflake-server-nsp",
"--python=3.13",
"mcp_snowflake_server",
],
"enabled": true,
"timeout": 300000,
}
Components
Resources
| URI | Description |
|---|---|
| memo://insights | A continuously updated memo aggregating data insights appended via append_insight. |
| context://table/{table_name} | (Prefetch mode only) Per-table schema summaries including columns and comments. |
Tools
Query Tools
| Tool | Description | Requires |
|---|---|---|
| read_query | Execute SELECT queries. Input: query (string). | — |
| write_query | Execute INSERT, UPDATE, or DELETE queries. Input: query (string). | --allow_write |
| create_table | Execute CREATE TABLE statements. Input: query (string). | --allow_write |
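The write tools stay disabled unless the server is started with --allow_write. For example, reusing the Quick Start connection file:

uvx --python=3.13 --from mcp-snowflake-server-nsp mcp_snowflake_server \
  --connections-file ~/snowflake_connections.toml \
  --connection-name myconn \
  --allow_write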
Schema Tools
| Tool | Description | Input |
|---|---|---|
| list_databases | List all databases in the Snowflake instance. | — |
| list_schemas | List all schemas within a database. | database (string) |
| list_tables | List all tables within a database and schema. | database, schema (strings) |
| describe_table | Describe columns of a table (name, type, nullability, default, comment). | table_name as database.schema.table |
Analysis Tools
| Tool | Description | Input |
|---|---|---|
| append_insight | Add a data insight to the memo://insights resource. | insight (string) |
Authentication
Password
Set credentials via environment variables or CLI flags (see Configuration Reference):
SNOWFLAKE_USER="your_user@example.com"
SNOWFLAKE_ACCOUNT="myaccount"
SNOWFLAKE_AUTHENTICATOR="snowflake"
SNOWFLAKE_PASSWORD="secret"
SNOWFLAKE_WAREHOUSE="COMPUTE_WH"
SNOWFLAKE_DATABASE="MY_DB"
SNOWFLAKE_SCHEMA="PUBLIC"
SNOWFLAKE_ROLE="MYROLE"
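All of these are read from the environment (see Configuration Reference), so one simple way to launch from a plain shell — a minimal sketch, assuming a POSIX shell and a .env file in the current directory — is to export the file's contents before starting the server:

# Export every variable defined in .env, then launch the server
set -a
source .env
set +a
uvx --python=3.13 --from mcp-snowflake-server-nsp mcp_snowflake_server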
Key-Pair
SNOWFLAKE_USER="your_user@example.com"
SNOWFLAKE_ACCOUNT="myaccount"
SNOWFLAKE_AUTHENTICATOR="snowflake_jwt"
SNOWFLAKE_PRIVATE_KEY_FILE="/absolute/path/to/key.p8"
SNOWFLAKE_PRIVATE_KEY_FILE_PWD="passphrase" # Optional — only if key is encrypted
SNOWFLAKE_WAREHOUSE="COMPUTE_WH"
SNOWFLAKE_DATABASE="MY_DB"
SNOWFLAKE_SCHEMA="PUBLIC"
SNOWFLAKE_ROLE="MYROLE"
Or via CLI: --private_key_file /path/to/key.p8 --private_key_file_pwd passphrase
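If you do not yet have a key pair, Snowflake's documented key-pair setup is roughly the following (a sketch — see Snowflake's key-pair authentication docs for the authoritative steps; the ALTER USER statement is SQL run inside Snowflake, not in your shell):

# Generate an unencrypted PKCS#8 private key and the matching public key
openssl genrsa 2048 | openssl pkcs8 -topk8 -inform PEM -out rsa_key.p8 -nocrypt
openssl rsa -in rsa_key.p8 -pubout -out rsa_key.pub
# Register the public key with your Snowflake user (SQL, run in Snowflake):
#   ALTER USER your_user SET RSA_PUBLIC_KEY='<contents of rsa_key.pub without the BEGIN/END lines>';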
External Browser
SNOWFLAKE_AUTHENTICATOR="externalbrowser"
Or in a TOML connection entry: authenticator = "externalbrowser"
OAuth 2.0 Client Credentials
Use the OAuth 2.0 client credentials flow to authenticate with a client ID and secret (no user interaction required):
SNOWFLAKE_AUTHENTICATOR="oauth_client_credentials"
SNOWFLAKE_ACCOUNT="myaccount"
SNOWFLAKE_OAUTH_CLIENT_ID="your_client_id"
SNOWFLAKE_OAUTH_CLIENT_SECRET="your_client_secret"
SNOWFLAKE_OAUTH_TOKEN_REQUEST_URL="https://your-idp.example.com/oauth/token"
SNOWFLAKE_OAUTH_SCOPE="session:role:MY_ROLE" # Optional
SNOWFLAKE_WAREHOUSE="COMPUTE_WH"
SNOWFLAKE_DATABASE="MY_DB"
SNOWFLAKE_SCHEMA="PUBLIC"
SNOWFLAKE_ROLE="MYROLE"
OAuth Bearer Token
Use a pre-fetched OAuth bearer token:
SNOWFLAKE_AUTHENTICATOR="oauth"
SNOWFLAKE_ACCOUNT="myaccount"
SNOWFLAKE_TOKEN="eyJhbGciOiJSUzI1NiJ9..."
SNOWFLAKE_WAREHOUSE="COMPUTE_WH"
SNOWFLAKE_DATABASE="MY_DB"
SNOWFLAKE_SCHEMA="PUBLIC"
SNOWFLAKE_ROLE="MYROLE"
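How the token is obtained depends on your identity provider. As a rough sketch only — the endpoint, parameters, and response field below are hypothetical, and jq is assumed to be installed — a client-credentials token could be fetched and exported before launching the server:

# Hypothetical IdP endpoint and response shape — substitute your provider's values
export SNOWFLAKE_TOKEN="$(curl -s -X POST https://your-idp.example.com/oauth/token \
  -d grant_type=client_credentials \
  -d client_id=your_client_id \
  -d client_secret=your_client_secret \
  | jq -r '.access_token')"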
TOML Connection File (Recommended)
Manage multiple environments in a single file. See example_connections.toml for a full template.
[production]
account = "your_account"
user = "your_user"
password = "your_password"
authenticator = "snowflake"
warehouse = "COMPUTE_WH"
database = "PROD_DB"
schema = "PUBLIC"
role = "ACCOUNTADMIN"
[development]
account = "your_account"
user = "dev_user"
authenticator = "externalbrowser"
warehouse = "DEV_WH"
database = "DEV_DB"
schema = "PUBLIC"
role = "DEVELOPER"
[reporting]
account = "your_account"
user = "reporting_user"
authenticator = "snowflake_jwt"
private_key_file = "/path/to/private_key.pem"
private_key_file_pwd = "passphrase" # Optional
warehouse = "REPORTING_WH"
database = "REPORTING_DB"
schema = "REPORTS"
role = "REPORTING_ROLE"
[analytics_oauth]
account = "your_account"
authenticator = "oauth_client_credentials"
oauth_client_id = "your_client_id"
oauth_client_secret = "your_client_secret"
oauth_token_request_url = "https://your-idp.example.com/oauth/token"
oauth_scope = "session:role:ANALYTICS_ROLE" # Optional
warehouse = "ANALYTICS_WH"
database = "ANALYTICS_DB"
schema = "PUBLIC"
role = "ANALYTICS_ROLE"
Pass the file with --connections-file and select a profile with --connection-name. Both flags are required together.
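Switching environments is then just a matter of selecting a different profile from the same file:

uvx --python=3.13 --from mcp-snowflake-server-nsp mcp_snowflake_server \
  --connections-file ~/snowflake_connections.toml \
  --connection-name development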
Installation
The package is published on PyPI as mcp-snowflake-server-nsp.
Via UVX
TOML configuration (recommended)
"mcpServers": {
"snowflake_production": {
"command": "uvx",
"args": [
"--python=3.13",
"--from", "mcp-snowflake-server-nsp",
"mcp_snowflake_server",
"--connections-file", "/path/to/snowflake_connections.toml",
"--connection-name", "production"
// Optional flags — see Configuration Reference
]
},
"snowflake_staging": {
"command": "uvx",
"args": [
"--python=3.13",
"--from", "mcp-snowflake-server-nsp",
"mcp_snowflake_server",
"--connections-file", "/path/to/snowflake_connections.toml",
"--connection-name", "staging"
]
}
}
Individual parameters
"mcpServers": {
"snowflake": {
"command": "uvx",
"args": [
"--python=3.13",
"--from", "mcp-snowflake-server-nsp",
"mcp_snowflake_server",
"--account", "your_account",
"--warehouse", "your_warehouse",
"--user", "your_user",
"--password", "your_password",
"--role", "your_role",
"--database", "your_database",
"--schema", "your_schema"
// Optional: "--private_key_file", "/absolute/path/key.p8"
// Optional: "--private_key_file_pwd", "passphrase"
// Optional flags — see Configuration Reference
]
}
}
Locally from Source with VSCode
- Install Visual Studio Code
- Install uv: curl -LsSf https://astral.sh/uv/install.sh | sh
- Create a .env file with your Snowflake credentials (or use a TOML connection file — see Authentication):

  SNOWFLAKE_USER="your_user@example.com"
  SNOWFLAKE_ACCOUNT="myaccount"
  SNOWFLAKE_ROLE="MYROLE"
  SNOWFLAKE_DATABASE="MY_DB"
  SNOWFLAKE_SCHEMA="PUBLIC"
  SNOWFLAKE_WAREHOUSE="COMPUTE_WH"
  SNOWFLAKE_AUTHENTICATOR="snowflake"
  SNOWFLAKE_PASSWORD="secret"

  # Key-pair alternative:
  # SNOWFLAKE_AUTHENTICATOR="snowflake_jwt"
  # SNOWFLAKE_PRIVATE_KEY_FILE=/absolute/path/key.p8
  # SNOWFLAKE_PRIVATE_KEY_FILE_PWD="passphrase"

  # Browser SSO alternative:
  # SNOWFLAKE_AUTHENTICATOR="externalbrowser"

- (Optional) Edit runtime_config.json to exclude specific databases, schemas, or tables (see Exclusion Patterns).
- Test locally:

  uv --directory /absolute/path/to/mcp_snowflake_server run mcp_snowflake_server

- Add to .vscode/mcp.json:
TOML configuration (recommended)
"snowflake-local": {
"type": "stdio",
"command": "/absolute/path/to/uv",
"args": [
"--python=3.13",
"--directory", "/absolute/path/to/mcp_snowflake_server",
"run", "mcp_snowflake_server",
"--connections-file", "/absolute/path/to/snowflake_connections.toml",
"--connection-name", "development"
// Optional flags — see Configuration Reference
],
}
Environment variables
"snowflake-local": {
"type": "stdio",
"command": "/absolute/path/to/uv",
"args": [
"--python=3.13",
"--directory", "/absolute/path/to/mcp_snowflake_server",
"run", "mcp_snowflake_server",
// Optional flags — see Configuration Reference / .env.example file
],
"envFile": "/absolute/path/to/.env"
}
Locally from Source with Claude
- Install the Claude AI Desktop App
- Install uv: curl -LsSf https://astral.sh/uv/install.sh | sh
- Create a .env file with your Snowflake credentials (or use a TOML connection file — see Authentication):

  SNOWFLAKE_USER="your_user@example.com"
  SNOWFLAKE_ACCOUNT="myaccount"
  SNOWFLAKE_ROLE="MYROLE"
  SNOWFLAKE_DATABASE="MY_DB"
  SNOWFLAKE_SCHEMA="PUBLIC"
  SNOWFLAKE_WAREHOUSE="COMPUTE_WH"
  SNOWFLAKE_AUTHENTICATOR="snowflake"
  SNOWFLAKE_PASSWORD="secret"

  # Key-pair alternative:
  # SNOWFLAKE_AUTHENTICATOR="snowflake_jwt"
  # SNOWFLAKE_PRIVATE_KEY_FILE=/absolute/path/key.p8
  # SNOWFLAKE_PRIVATE_KEY_FILE_PWD="passphrase"

  # Browser SSO alternative:
  # SNOWFLAKE_AUTHENTICATOR="externalbrowser"

- (Optional) Edit runtime_config.json to exclude specific databases, schemas, or tables (see Exclusion Patterns).
- Test locally:

  uv --directory /absolute/path/to/mcp_snowflake_server run mcp_snowflake_server

- Add to claude_desktop_config.json:
TOML configuration (recommended)
"mcpServers": {
"snowflake_local": {
"command": "/absolute/path/to/uv",
"args": [
"--python=3.13",
"--directory", "/absolute/path/to/mcp_snowflake_server",
"run", "mcp_snowflake_server",
"--connections-file", "/absolute/path/to/snowflake_connections.toml",
"--connection-name", "development"
// Optional flags — see Configuration Reference
]
}
}
Environment variables
"mcpServers": {
"snowflake_local": {
"command": "/absolute/path/to/uv",
"args": [
"--python=3.13",
"--directory", "/absolute/path/to/mcp_snowflake_server",
"run", "mcp_snowflake_server"
// Optional flags — see Configuration Reference
]
}
}
Docker
A Dockerfile is included for containerised deployments:
# Build
docker build -t mcp-snowflake-server .
# Run (pass credentials as environment variables)
docker run --rm \
-e SNOWFLAKE_USER="your_user@example.com" \
-e SNOWFLAKE_ACCOUNT="myaccount" \
-e SNOWFLAKE_AUTHENTICATOR="snowflake" \
-e SNOWFLAKE_PASSWORD="secret" \
-e SNOWFLAKE_WAREHOUSE="COMPUTE_WH" \
-e SNOWFLAKE_DATABASE="MY_DB" \
-e SNOWFLAKE_SCHEMA="PUBLIC" \
-e SNOWFLAKE_ROLE="MYROLE" \
mcp-snowflake-server
# Or override the entrypoint arguments directly
docker run --rm mcp-snowflake-server \
--account your_account \
--user your_user \
--authenticator snowflake \
--password your_password \
--warehouse COMPUTE_WH \
--database MY_DB \
--schema PUBLIC \
--role MYROLE
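A TOML connections file can also be used with Docker by mounting it into the container — a sketch, assuming the entrypoint forwards arguments as in the example above and the mounted path is readable inside the container:

docker run --rm \
  -v /absolute/path/to/snowflake_connections.toml:/config/snowflake_connections.toml:ro \
  mcp-snowflake-server \
  --connections-file /config/snowflake_connections.toml \
  --connection-name production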
Configuration Reference
All connection parameters can also be set as environment variables (SNOWFLAKE_<PARAM_UPPER>).
| Flag | Env var | Default | Description |
|---|---|---|---|
| --account | SNOWFLAKE_ACCOUNT | — | Snowflake account identifier |
| --user | SNOWFLAKE_USER | — | Snowflake username |
| --password | SNOWFLAKE_PASSWORD | — | Password (not required for key-pair / SSO) |
| --warehouse | SNOWFLAKE_WAREHOUSE | — | Virtual warehouse to use |
| --database | SNOWFLAKE_DATABASE | (required) | Default database |
| --schema | SNOWFLAKE_SCHEMA | (required) | Default schema |
| --role | SNOWFLAKE_ROLE | — | Role to assume |
| --private_key_file | SNOWFLAKE_PRIVATE_KEY_FILE | — | Absolute path to .p8 private key file |
| --private_key_file_pwd | SNOWFLAKE_PRIVATE_KEY_FILE_PWD | — | Passphrase for encrypted private key |
| --connections-file | — | — | Path to TOML connections file |
| --connection-name | — | — | Connection profile name in TOML file (required with --connections-file) |
| --allow_write | — | false | Enable write_query and create_table tools |
| --prefetch / --no-prefetch | — | false | Pre-load table schema as context://table/* resources (disables list_tables / describe_table) |
| --exclude_tools | — | [] | Space-separated list of tool names to disable |
| --exclude-json-results | — | false | Omit embedded JSON resources from responses (reduces context window usage) |
| --log_dir | — | — | Directory for log file output |
| --log_level | — | INFO | Log verbosity: DEBUG, INFO, WARNING, ERROR, CRITICAL |
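Putting several of the optional flags together (all documented in the table above; --exclude_tools takes a space-separated list of tool names, and the log directory below is just a placeholder):

uvx --python=3.13 --from mcp-snowflake-server-nsp mcp_snowflake_server \
  --connections-file ~/snowflake_connections.toml \
  --connection-name production \
  --exclude_tools append_insight create_table \
  --exclude-json-results \
  --log_dir /tmp/mcp_snowflake_logs \
  --log_level DEBUG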
Exclusion Patterns
Edit runtime_config.json to exclude databases, schemas, or tables from all discovery tools. Patterns are matched case-insensitively as substrings, so a pattern of "temp" would also hide objects such as TEMP_DB or MY_TEMP_TABLES.
{
"exclude_patterns": {
"databases": ["temp"],
"schemas": ["temp", "information_schema"],
"tables": ["temp"]
}
}
The server loads this file automatically at startup from the working directory.
Development
# Install dependencies (including dev tools)
make install
# Lint & auto-fix with Ruff
make ruff
# Run tests
make test
# Run tests with terminal coverage report
make coverage
# Run tests and open HTML coverage report
make coverage-html
# Run the server locally
make run
Requires uv. Dev dependencies include ruff, mypy, pytest, pytest-asyncio, pytest-cov, and pre-commit.
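The Makefile targets wrap uv, so the dev tools can also be invoked directly — a sketch, assuming make install has already been run; the test selector below is hypothetical:

# Run a subset of tests by keyword
uv run pytest -k "read_query" -v
# Type-check the package
uv run mypy .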
License
This project is licensed under the GNU General Public License v3.0. See the LICENSE file for the full text.
Fork and Attribution
This repository is a fork of isaacwasserman/mcp-snowflake-server.
- Upstream authors and contributors retain copyright for their contributions.
- Fork-specific changes are maintained by nsphung.
- A summary of notable modifications is tracked in NOTICE.