Snowflake Stored Procedure Integration with MCP Server
This project simplifies the integration of Snowflake stored procedures with an MCP (Model Context Protocol) server. It provides a framework to define, manage, and execute Snowflake procedures from the MCP server environment.
Requirements
- Python 3.8 or higher
- uv installed
Installation
- Clone the repository:
git clone https://github.com/Cjaimesg/snowflake-mcp-sp-integration.git
cd snowflake-mcp-sp-integration
- Install dependencies using uv:
uv venv
source .venv/bin/activate # On Windows: .venv\Scripts\activate
Configuration
- Create a .env file in the root directory (do not commit this file). Define all required Snowflake credentials:
SNOWFLAKE_ACCOUNT=<xxx>-<xxx>
SNOWFLAKE_USER=<xxx>
SNOWFLAKE_PASSWORD=<xxx>
SNOWFLAKE_ROLE=<xxx>
SNOWFLAKE_WAREHOUSE=<xxx>
SNOWFLAKE_DATABASE=<xxx>
SNOWFLAKE_SCHEMA=<xxx>
SNOWFLAKE_HOST=<xxx>-<xxx>.snowflakecomputing.com
All variables are mandatory and must be defined by the user.
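For reference, the sketch below shows one way these credentials could be consumed from Python. It is a minimal illustration, not the server's actual connection code: it assumes the python-dotenv and snowflake-connector-python packages are installed, and MY_PROC is a placeholder procedure name.

```python
import os

from dotenv import load_dotenv      # assumes python-dotenv is installed
import snowflake.connector          # assumes snowflake-connector-python is installed

# Load the variables defined in .env into the process environment.
load_dotenv()

# Open a Snowflake session using the same variables the server expects.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    role=os.environ["SNOWFLAKE_ROLE"],
    warehouse=os.environ["SNOWFLAKE_WAREHOUSE"],
    database=os.environ["SNOWFLAKE_DATABASE"],
    schema=os.environ["SNOWFLAKE_SCHEMA"],
)

cur = conn.cursor()
try:
    # MY_PROC is a placeholder; substitute one of your own stored procedures.
    cur.execute("CALL MY_PROC(%s)", ("example-argument",))
    print(cur.fetchone())
finally:
    cur.close()
    conn.close()
```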
- Configure the project as an MCP server. In your MCP configuration file, add an entry like the following, adjusting the paths to match your local environment. Both --schemas and --procedures are optional and default to empty lists (a minimal parsing sketch follows the example below).
- Schemas: All stored procedures in the specified schemas will be available to the MCP server.
- Procedures: Only the specified procedures will be available to the MCP server.
{
  "mcpServers": {
    "snowflake_sp_server": {
      "command": "absolute/path/to/your/uv/uv.exe",
      "args": [
        "--directory",
        "absolute/path/to/snowflake-mcp-sp-integration",
        "run",
        "main.py",
        "--schemas",
        "DB_NAME.SCHEMA_NAME",
        "--procedures",
        "DB_NAME.OTHER_SCHEMA_NAME.PROCEDURE_NAME"
      ]
    }
  }
}
Replace absolute/path/to/snowflake-mcp-sp-integration with the actual directory where you cloned this repository.
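The behavior described above (both flags optional, defaulting to empty lists, each taking fully qualified names) could be modeled with argparse roughly as follows. This is a hypothetical sketch for illustration, not the actual contents of main.py.

```python
import argparse

def parse_args(argv=None):
    parser = argparse.ArgumentParser(description="Snowflake MCP stored-procedure server")
    # Fully qualified schemas (DB_NAME.SCHEMA_NAME); every procedure in them is exposed.
    parser.add_argument("--schemas", nargs="*", default=[],
                        help="Schemas whose stored procedures are exposed to the MCP server")
    # Fully qualified procedures (DB_NAME.SCHEMA_NAME.PROCEDURE_NAME) exposed individually.
    parser.add_argument("--procedures", nargs="*", default=[],
                        help="Individual stored procedures exposed to the MCP server")
    return parser.parse_args(argv)

if __name__ == "__main__":
    args = parse_args()
    print("Schemas:", args.schemas)
    print("Procedures:", args.procedures)
```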
Quickstart - Setup Guide for Dev Environment
The quickstart/ folder contains a step-by-step guide for setting up and testing this module in the Dev Environment.
License
This project is licensed under the MIT License.