MCP Snowflake Server NSP

A Snowflake MCP server — SQL queries, schema exploration, and data insights for AI assistants

PyPI PyPI Downloads MCP Compatible made-with-python python-3.13+ Ruff Checked with mypy codecov Ask DeepWiki

Snowflake MCP Server

A Model Context Protocol (MCP) server that connects AI assistants to Snowflake, enabling SQL queries, schema exploration, and data insights directly from your LLM client.

Highlights:

  • Multiple authentication methods: password, key-pair, external browser, OAuth 2.0 (client credentials & bearer token), TOML connection files
  • TOML multi-connection config — manage production, staging, and development environments in one file
  • Write-safety guard — write operations are disabled by default and must be explicitly enabled
  • Exclusion patterns — filter out databases, schemas, or tables from discovery
  • --exclude-json-results flag — reduces LLM context window usage
  • Selective tool exclusion via --exclude_tools
  • Prefetch mode — pre-load table schema as MCP resources
  • Docker support

Quick Start

The fastest way to try it — using uvx with a TOML connection file:

# 1. Create a connections file
cat > ~/snowflake_connections.toml << 'EOF'
[myconn]
account = "your_account"
user = "your_user"
password = "your_password"
warehouse = "COMPUTE_WH"
database = "MY_DB"
schema = "PUBLIC"
role = "MYROLE"
EOF

# 2. Run the server
uvx --python=3.13 --from mcp-snowflake-server-nsp mcp_snowflake_server \
  --connections-file ~/snowflake_connections.toml \
  --connection-name myconn

Claude Code

Add to your MCP client config (e.g. claude_desktop_config.json) using snowflake_connections.toml:

"mcpServers": {
  "snowflake": {
    "command": "uvx",
    "args": [
      "--python=3.13",
      "--from", "mcp-snowflake-server-nsp",
      "mcp_snowflake_server",
      "--connections-file", "/absolute/path/to/snowflake_connections.toml",
      "--connection-name", "myconn"
    ]
  }
}

Visual Studio Code (VSCode)

Add to your MCP client config (e.g. .vscode/mcp.json) using a .env file (see Authentication):

"snowflake": {
      // Snowflake MCP server
      "type": "stdio",
      "command": "uvx",
      "args": [
        "--from", "mcp-snowflake-server-nsp",
        "--python=3.13",
        "mcp_snowflake_server"
      ],
      "envFile": "${workspaceFolder}/.env"
    }

OpenCode

Add to your MCP client config (e.g. opencode.jsonc) using a .env file (see Authentication):

"snowflake": {
  "type": "local",
  "command": [
    "uvx",
    "--from",
    "mcp-snowflake-server-nsp",
    "--python=3.13",
    "mcp_snowflake_server",
  ],
  "enabled": true,
  "timeout": 300000,
}

Components

Resources

| URI | Description |
| --- | --- |
| memo://insights | A continuously updated memo aggregating data insights appended via append_insight. |
| context://table/{table_name} | (Prefetch mode only) Per-table schema summaries including columns and comments. |

Tools

Query Tools

| Tool | Description | Requires |
| --- | --- | --- |
| read_query | Execute SELECT queries. Input: query (string). | |
| write_query | Execute INSERT, UPDATE, or DELETE queries. Input: query (string). | --allow_write |
| create_table | Execute CREATE TABLE statements. Input: query (string). | --allow_write |

Schema Tools

| Tool | Description | Input |
| --- | --- | --- |
| list_databases | List all databases in the Snowflake instance. | |
| list_schemas | List all schemas within a database. | database (string) |
| list_tables | List all tables within a database and schema. | database, schema (strings) |
| describe_table | Describe columns of a table (name, type, nullability, default, comment). | table_name as database.schema.table |

Analysis Tools

| Tool | Description | Input |
| --- | --- | --- |
| append_insight | Add a data insight to the memo://insights resource. | insight (string) |
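For orientation, tool invocations arrive as standard MCP tools/call JSON-RPC requests; your MCP client builds and sends them for you. A minimal sketch of the request shape for read_query (the SQL text is illustrative):

```shell
# Shape of the MCP tools/call request a client sends for read_query.
# The SQL text is illustrative; an MCP client constructs this for you.
req=$(cat << 'EOF'
{
  "method": "tools/call",
  "params": {
    "name": "read_query",
    "arguments": { "query": "SELECT COUNT(*) FROM MY_DB.PUBLIC.MY_TABLE" }
  }
}
EOF
)
echo "$req"
```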

Authentication

Password

Set credentials via environment variables or CLI flags (see Configuration Reference):

SNOWFLAKE_USER="user@example.com"
SNOWFLAKE_ACCOUNT="myaccount"
SNOWFLAKE_AUTHENTICATOR="snowflake"
SNOWFLAKE_PASSWORD="secret"
SNOWFLAKE_WAREHOUSE="COMPUTE_WH"
SNOWFLAKE_DATABASE="MY_DB"
SNOWFLAKE_SCHEMA="PUBLIC"
SNOWFLAKE_ROLE="MYROLE"

Key-Pair

SNOWFLAKE_USER="user@example.com"
SNOWFLAKE_ACCOUNT="myaccount"
SNOWFLAKE_AUTHENTICATOR="snowflake_jwt"
SNOWFLAKE_PRIVATE_KEY_FILE="/absolute/path/to/key.p8"
SNOWFLAKE_PRIVATE_KEY_FILE_PWD="passphrase"  # Optional — only if key is encrypted
SNOWFLAKE_WAREHOUSE="COMPUTE_WH"
SNOWFLAKE_DATABASE="MY_DB"
SNOWFLAKE_SCHEMA="PUBLIC"
SNOWFLAKE_ROLE="MYROLE"

Or via CLI: --private_key_file /path/to/key.p8 --private_key_file_pwd passphrase
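The .p8 key can be generated with OpenSSL, following Snowflake's standard key-pair setup (file names and the passphrase below are examples), after which the public key is registered on the Snowflake user:

```shell
# Generate an encrypted PKCS#8 private key and the matching public key
# (example file names and passphrase; adjust to your setup).
openssl genrsa 2048 | openssl pkcs8 -topk8 -v2 aes-256-cbc -inform PEM \
  -out rsa_key.p8 -passout pass:passphrase
openssl rsa -in rsa_key.p8 -passin pass:passphrase -pubout -out rsa_key.pub
# Then register the public key (PEM body only, no header/footer lines)
# on your Snowflake user, e.g.:
#   ALTER USER your_user SET RSA_PUBLIC_KEY='MIIBIjANBgkq...';
```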

External Browser

SNOWFLAKE_AUTHENTICATOR="externalbrowser"

Or in a TOML connection entry: authenticator = "externalbrowser"

OAuth 2.0 Client Credentials

Use the OAuth 2.0 client credentials flow to authenticate with a client ID and secret (no user interaction required):

SNOWFLAKE_AUTHENTICATOR="oauth_client_credentials"
SNOWFLAKE_ACCOUNT="myaccount"
SNOWFLAKE_OAUTH_CLIENT_ID="your_client_id"
SNOWFLAKE_OAUTH_CLIENT_SECRET="your_client_secret"
SNOWFLAKE_OAUTH_TOKEN_REQUEST_URL="https://your-idp.example.com/oauth/token"
SNOWFLAKE_OAUTH_SCOPE="session:role:MY_ROLE"  # Optional
SNOWFLAKE_WAREHOUSE="COMPUTE_WH"
SNOWFLAKE_DATABASE="MY_DB"
SNOWFLAKE_SCHEMA="PUBLIC"
SNOWFLAKE_ROLE="MYROLE"

OAuth Bearer Token

Use a pre-fetched OAuth bearer token:

SNOWFLAKE_AUTHENTICATOR="oauth"
SNOWFLAKE_ACCOUNT="myaccount"
SNOWFLAKE_TOKEN="eyJhbGciOiJSUzI1NiJ9..."
SNOWFLAKE_WAREHOUSE="COMPUTE_WH"
SNOWFLAKE_DATABASE="MY_DB"
SNOWFLAKE_SCHEMA="PUBLIC"
SNOWFLAKE_ROLE="MYROLE"
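The token itself typically comes from your identity provider. A sketch of fetching one via the client_credentials grant and exporting it; the IdP URL, credentials, and response parsing are assumptions about your IdP, so the curl call is commented out and a canned response stands in for it:

```shell
# Sketch: obtain a bearer token from an IdP and export it as
# SNOWFLAKE_TOKEN. The URL and credentials are placeholders, so the real
# call is commented out and a canned response is used instead.
# response=$(curl -s -X POST "https://your-idp.example.com/oauth/token" \
#   -d grant_type=client_credentials \
#   -d client_id="your_client_id" -d client_secret="your_client_secret")
response='{"access_token":"eyJhbGciOiJSUzI1NiJ9.example","token_type":"Bearer"}'
export SNOWFLAKE_TOKEN=$(echo "$response" | sed -n 's/.*"access_token":"\([^"]*\)".*/\1/p')
echo "$SNOWFLAKE_TOKEN"
```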

TOML Connection File (Recommended)

Manage multiple environments in a single file. See example_connections.toml for a full template.

[production]
account = "your_account"
user = "your_user"
password = "your_password"
authenticator = "snowflake"
warehouse = "COMPUTE_WH"
database = "PROD_DB"
schema = "PUBLIC"
role = "ACCOUNTADMIN"

[development]
account = "your_account"
user = "dev_user"
authenticator = "externalbrowser"
warehouse = "DEV_WH"
database = "DEV_DB"
schema = "PUBLIC"
role = "DEVELOPER"

[reporting]
account = "your_account"
user = "reporting_user"
authenticator = "snowflake_jwt"
private_key_file = "/path/to/private_key.pem"
private_key_file_pwd = "passphrase"  # Optional
warehouse = "REPORTING_WH"
database = "REPORTING_DB"
schema = "REPORTS"
role = "REPORTING_ROLE"

[analytics_oauth]
account = "your_account"
authenticator = "oauth_client_credentials"
oauth_client_id = "your_client_id"
oauth_client_secret = "your_client_secret"
oauth_token_request_url = "https://your-idp.example.com/oauth/token"
oauth_scope = "session:role:ANALYTICS_ROLE"  # Optional
warehouse = "ANALYTICS_WH"
database = "ANALYTICS_DB"
schema = "PUBLIC"
role = "ANALYTICS_ROLE"

Pass the file with --connections-file and select a profile with --connection-name. Both flags are required together.


Installation

The package is published on PyPI as mcp-snowflake-server-nsp.


Via UVX

TOML configuration (recommended)
"mcpServers": {
  "snowflake_production": {
    "command": "uvx",
    "args": [
      "--python=3.13",
      "--from", "mcp-snowflake-server-nsp",
      "mcp_snowflake_server",
      "--connections-file", "/path/to/snowflake_connections.toml",
      "--connection-name", "production"
      // Optional flags — see Configuration Reference
    ]
  },
  "snowflake_staging": {
    "command": "uvx",
    "args": [
      "--python=3.13",
      "--from", "mcp-snowflake-server-nsp",
      "mcp_snowflake_server",
      "--connections-file", "/path/to/snowflake_connections.toml",
      "--connection-name", "staging"
    ]
  }
}
Individual parameters
"mcpServers": {
  "snowflake": {
    "command": "uvx",
    "args": [
      "--python=3.13",
      "--from", "mcp-snowflake-server-nsp",
      "mcp_snowflake_server",
      "--account", "your_account",
      "--warehouse", "your_warehouse",
      "--user", "your_user",
      "--password", "your_password",
      "--role", "your_role",
      "--database", "your_database",
      "--schema", "your_schema"
      // Optional: "--private_key_file", "/absolute/path/key.p8"
      // Optional: "--private_key_file_pwd", "passphrase"
      // Optional flags — see Configuration Reference
    ]
  }
}

Locally from Source with VSCode

  • Install Visual Studio Code

  • Install uv:

    curl -LsSf https://astral.sh/uv/install.sh | sh
    
  • Create a .env file with your Snowflake credentials (or use a TOML connection file — see Authentication):

    SNOWFLAKE_USER="user@example.com"
    SNOWFLAKE_ACCOUNT="myaccount"
    SNOWFLAKE_ROLE="MYROLE"
    SNOWFLAKE_DATABASE="MY_DB"
    SNOWFLAKE_SCHEMA="PUBLIC"
    SNOWFLAKE_WAREHOUSE="COMPUTE_WH"
    SNOWFLAKE_AUTHENTICATOR="snowflake"
    SNOWFLAKE_PASSWORD="secret"
    # Key-pair alternative:
    # SNOWFLAKE_AUTHENTICATOR="snowflake_jwt"
    # SNOWFLAKE_PRIVATE_KEY_FILE=/absolute/path/key.p8
    # SNOWFLAKE_PRIVATE_KEY_FILE_PWD="passphrase"
    # Browser SSO alternative:
    # SNOWFLAKE_AUTHENTICATOR="externalbrowser"
    
  • (Optional) Edit runtime_config.json to exclude specific databases, schemas, or tables (see Exclusion Patterns).

  • Test locally:

    uv --directory /absolute/path/to/mcp_snowflake_server run mcp_snowflake_server
    
  • Add to .vscode/mcp.json:

TOML configuration (recommended)
"snowflake-local": {
    "type": "stdio",
    "command": "/absolute/path/to/uv",
    "args": [
      "--python=3.13",
      "--directory", "/absolute/path/to/mcp_snowflake_server",
      "run", "mcp_snowflake_server",
      "--connections-file", "/absolute/path/to/snowflake_connections.toml",
      "--connection-name", "development"
      // Optional flags — see Configuration Reference
    ],
}
Environment variables
"snowflake-local": {
    "type": "stdio",
    "command": "/absolute/path/to/uv",
    "args": [
      "--python=3.13",
      "--directory", "/absolute/path/to/mcp_snowflake_server",
      "run", "mcp_snowflake_server",
      // Optional flags — see Configuration Reference / .env.example file
    ],
    "envFile": "/absolute/path/to/.env"
}

Locally from Source with Claude

  1. Install Claude AI Desktop App

  2. Install uv:

    curl -LsSf https://astral.sh/uv/install.sh | sh
    
  3. Create a .env file with your Snowflake credentials (or use a TOML connection file — see Authentication):

    SNOWFLAKE_USER="user@example.com"
    SNOWFLAKE_ACCOUNT="myaccount"
    SNOWFLAKE_ROLE="MYROLE"
    SNOWFLAKE_DATABASE="MY_DB"
    SNOWFLAKE_SCHEMA="PUBLIC"
    SNOWFLAKE_WAREHOUSE="COMPUTE_WH"
    SNOWFLAKE_AUTHENTICATOR="snowflake"
    SNOWFLAKE_PASSWORD="secret"
    # Key-pair alternative:
    # SNOWFLAKE_AUTHENTICATOR="snowflake_jwt"
    # SNOWFLAKE_PRIVATE_KEY_FILE=/absolute/path/key.p8
    # SNOWFLAKE_PRIVATE_KEY_FILE_PWD="passphrase"
    # Browser SSO alternative:
    # SNOWFLAKE_AUTHENTICATOR="externalbrowser"
    
  4. (Optional) Edit runtime_config.json to exclude specific databases, schemas, or tables (see Exclusion Patterns).

  5. Test locally:

    uv --directory /absolute/path/to/mcp_snowflake_server run mcp_snowflake_server
    
  6. Add to claude_desktop_config.json:

TOML configuration (recommended)
"mcpServers": {
  "snowflake_local": {
    "command": "/absolute/path/to/uv",
    "args": [
      "--python=3.13",
      "--directory", "/absolute/path/to/mcp_snowflake_server",
      "run", "mcp_snowflake_server",
      "--connections-file", "/absolute/path/to/snowflake_connections.toml",
      "--connection-name", "development"
      // Optional flags — see Configuration Reference
    ]
  }
}
Environment variables
"mcpServers": {
  "snowflake_local": {
    "command": "/absolute/path/to/uv",
    "args": [
      "--python=3.13",
      "--directory", "/absolute/path/to/mcp_snowflake_server",
      "run", "mcp_snowflake_server"
      // Optional flags — see Configuration Reference
    ]
  }
}

Docker

A Dockerfile is included for containerised deployments:

# Build
docker build -t mcp-snowflake-server .

# Run (pass credentials as environment variables)
docker run --rm \
  -e SNOWFLAKE_USER="user@example.com" \
  -e SNOWFLAKE_ACCOUNT="myaccount" \
  -e SNOWFLAKE_AUTHENTICATOR="snowflake" \
  -e SNOWFLAKE_PASSWORD="secret" \
  -e SNOWFLAKE_WAREHOUSE="COMPUTE_WH" \
  -e SNOWFLAKE_DATABASE="MY_DB" \
  -e SNOWFLAKE_SCHEMA="PUBLIC" \
  -e SNOWFLAKE_ROLE="MYROLE" \
  mcp-snowflake-server

# Or override the entrypoint arguments directly
docker run --rm mcp-snowflake-server \
  --account your_account \
  --user your_user \
  --authenticator snowflake \
  --password your_password \
  --warehouse COMPUTE_WH \
  --database MY_DB \
  --schema PUBLIC \
  --role MYROLE

Configuration Reference

All connection parameters can also be set as environment variables (SNOWFLAKE_<PARAM_UPPER>).

| Flag | Env var | Default | Description |
| --- | --- | --- | --- |
| --account | SNOWFLAKE_ACCOUNT | | Snowflake account identifier |
| --user | SNOWFLAKE_USER | | Snowflake username |
| --password | SNOWFLAKE_PASSWORD | | Password (not required for key-pair / SSO) |
| --warehouse | SNOWFLAKE_WAREHOUSE | | Virtual warehouse to use |
| --database | SNOWFLAKE_DATABASE | (required) | Default database |
| --schema | SNOWFLAKE_SCHEMA | (required) | Default schema |
| --role | SNOWFLAKE_ROLE | | Role to assume |
| --private_key_file | SNOWFLAKE_PRIVATE_KEY_FILE | | Absolute path to .p8 private key file |
| --private_key_file_pwd | SNOWFLAKE_PRIVATE_KEY_FILE_PWD | | Passphrase for encrypted private key |
| --connections-file | | | Path to TOML connections file |
| --connection-name | | | Connection profile name in TOML file (required with --connections-file) |
| --allow_write | | false | Enable write_query and create_table tools |
| --prefetch / --no-prefetch | | false | Pre-load table schema as context://table/* resources (disables list_tables / describe_table) |
| --exclude_tools | | [] | Space-separated list of tool names to disable |
| --exclude-json-results | | false | Omit embedded JSON resources from responses (reduces context window usage) |
| --log_dir | | | Directory for log file output |
| --log_level | | INFO | Log verbosity: DEBUG, INFO, WARNING, ERROR, CRITICAL |
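As an illustration of the SNOWFLAKE_<PARAM_UPPER> naming rule, the env var name for a connection flag can be derived mechanically (this is just a demonstration of the convention, not something the server requires you to script):

```shell
# Derive the env var name for a connection flag per SNOWFLAKE_<PARAM_UPPER>.
flag="--private_key_file"
envvar="SNOWFLAKE_$(echo "${flag#--}" | tr '[:lower:]' '[:upper:]')"
echo "$envvar"   # SNOWFLAKE_PRIVATE_KEY_FILE
```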

Exclusion Patterns

Edit runtime_config.json to exclude databases, schemas, or tables from all discovery tools. Patterns are matched case-insensitively as substrings.

{
  "exclude_patterns": {
    "databases": ["temp"],
    "schemas": ["temp", "information_schema"],
    "tables": ["temp"]
  }
}

The server loads this file automatically at startup from the working directory.
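To make the matching rule concrete, here is a small shell sketch of the documented semantics (the helper name is illustrative, not part of the server):

```shell
# Case-insensitive substring matching, as documented above.
# matches_exclusion is an illustrative helper, not a server function.
matches_exclusion() {
  name=$(printf '%s' "$1" | tr '[:upper:]' '[:lower:]')
  pattern=$(printf '%s' "$2" | tr '[:upper:]' '[:lower:]')
  case "$name" in *"$pattern"*) return 0 ;; *) return 1 ;; esac
}
matches_exclusion "INFORMATION_SCHEMA" "information_schema" && echo "excluded"
matches_exclusion "SALES_DB" "temp" || echo "kept"
```

So with the config above, a schema named INFORMATION_SCHEMA is hidden from discovery, while SALES_DB is kept.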


Development

# Install dependencies (including dev tools)
make install

# Lint & auto-fix with Ruff
make ruff

# Run tests
make test

# Run tests with terminal coverage report
make coverage

# Run tests and open HTML coverage report
make coverage-html

# Run the server locally
make run

Requires uv. Dev dependencies include ruff, mypy, pytest, pytest-asyncio, pytest-cov, and pre-commit.


Documentation & Coverage

  • Full AI-generated documentation: Ask DeepWiki

  • Test coverage sunburst: available as a graph on codecov.


License

This project is licensed under the GNU General Public License v3.0. See the LICENSE file for the full text.


Fork and Attribution

This repository is a fork of isaacwasserman/mcp-snowflake-server.

MseeP.ai Security Assessment Badge

  • Upstream authors and contributors retain copyright for their contributions.
  • Fork-specific changes are maintained by nsphung.
  • A summary of notable modifications is tracked in NOTICE.
