SingleStore MCP Server
Model Context Protocol (MCP) is a standardized protocol for managing context between large language models (LLMs) and external systems. This repository provides an installer and an MCP server for SingleStore, enabling seamless integration.
With MCP, you can use Claude Desktop, Claude Code, Cursor, or any compatible MCP client to interact with SingleStore using natural language, making it easier to perform complex operations effortlessly.
💡 Pro Tip: Not sure what the MCP server can do? Just call the /help prompt in your chat!
Requirements
- Python >= v3.10.0
- uvx available in your Python environment
- VS Code, Cursor, Windsurf, Claude Desktop, Claude Code, Goose or any other MCP client
Getting started
First, install the SingleStore MCP server with your client.
The standard config below works in most of these tools:
{
  "mcpServers": {
    "singlestore-mcp-server": {
      "command": "uvx",
      "args": [
        "singlestore-mcp-server",
        "start"
      ]
    }
  }
}
No API keys, tokens, or environment variables required! The server automatically handles authentication via browser OAuth when started.
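If the browser OAuth flow is not an option (for example, in a headless environment), the server also accepts an API key via the MCP_API_KEY environment variable, as described in the Docker section below. The sketch here assumes your MCP client supports the common env block for passing environment variables to the server process:

```json
{
  "mcpServers": {
    "singlestore-mcp-server": {
      "command": "uvx",
      "args": ["singlestore-mcp-server", "start"],
      "env": {
        "MCP_API_KEY": "your_api_key_here"
      }
    }
  }
}
```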
Claude Desktop
Automatic setup:
uvx singlestore-mcp-server init --client=claude-desktop
Manual setup: Follow the MCP install guide and use the standard config above.
Claude Code
Automatic setup:
uvx singlestore-mcp-server init --client=claude-code
This will automatically run the Claude CLI command for you.
Manual setup:
claude mcp add singlestore-mcp-server uvx singlestore-mcp-server start
Cursor
Automatic setup:
uvx singlestore-mcp-server init --client=cursor
Manual setup:
Go to Cursor Settings -> MCP -> Add new MCP Server. Name it as you like, choose the command type, and set the command to uvx singlestore-mcp-server start. You can verify the config or add command-line arguments by clicking Edit.
VS Code
Automatic setup:
uvx singlestore-mcp-server init --client=vscode
Manual setup: Follow the MCP install guide and use the standard config above. You can also install using the VS Code CLI:
code --add-mcp '{"name":"singlestore-mcp-server","command":"uvx","args":["singlestore-mcp-server","start"]}'
After installation, the SingleStore MCP server will be available for use with your GitHub Copilot agent in VS Code.
Windsurf
Automatic setup:
uvx singlestore-mcp-server init --client=windsurf
Manual setup: Follow Windsurf MCP documentation. Use the standard config above.
Gemini CLI
Automatic setup:
uvx singlestore-mcp-server init --client=gemini
Manual setup: Follow the MCP install guide and use the standard config above.
LM Studio
Automatic setup:
uvx singlestore-mcp-server init --client=lm-studio
Manual setup:
Go to Program in the right sidebar -> Install -> Edit mcp.json. Use the standard config above.
Goose
Manual setup only:
Go to Advanced settings -> Extensions -> Add custom extension. Name it as you like, use type STDIO, and set the command to uvx singlestore-mcp-server start. Click "Add Extension".
Qodo Gen
Manual setup only: Open the Qodo Gen chat panel in VS Code or IntelliJ -> Connect more tools -> + Add new MCP -> Paste the standard config above and click Save.
Using Docker
NOTE: An API key is required when using Docker because the OAuth flow isn't supported for servers running in Docker containers.
{
  "mcpServers": {
    "singlestore-mcp-server": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm", "--init", "--pull=always",
        "-e", "MCP_API_KEY=your_api_key_here",
        "singlestore/mcp-server-singlestore"
      ]
    }
  }
}
You can build the Docker image yourself:
docker build -t singlestore/mcp-server-singlestore .
For better security, we recommend using Docker Desktop to configure the SingleStore MCP server; see this blog post for details on Docker's MCP Catalog.
Components
Tools
The server implements the following tools:
- get_user_info: Retrieve details about the current user
  - No arguments required
  - Returns user information and details
- organization_info: Retrieve details about the user's current organization
  - No arguments required
  - Returns details of the organization
- choose_organization: Choose from available organizations (only available when the API key environment variable is not set)
  - No arguments required
  - Returns a list of available organizations to choose from
- set_organization: Set the active organization (only available when the API key environment variable is not set)
  - Arguments: organization_id (string)
  - Sets the specified organization as active
- workspace_groups_info: Retrieve details about the workspace groups accessible to the user
  - No arguments required
  - Returns details of the workspace groups
- workspaces_info: Retrieve details about the workspaces in a specific workspace group
  - Arguments: workspace_group_id (string)
  - Returns details of the workspaces
- resume_workspace: Resume a suspended workspace
  - Arguments: workspace_id (string)
  - Resumes the specified workspace
- list_starter_workspaces: List all starter workspaces accessible to the user
  - No arguments required
  - Returns details of available starter workspaces
- create_starter_workspace: Create a new starter workspace
  - Arguments: workspace configuration parameters
  - Returns details of the created starter workspace
- terminate_starter_workspace: Terminate an existing starter workspace
  - Arguments: workspace_id (string)
  - Terminates the specified starter workspace
- list_regions: Retrieve a list of all regions that support workspaces
  - No arguments required
  - Returns a list of available regions
- list_sharedtier_regions: Retrieve a list of shared tier regions
  - No arguments required
  - Returns a list of shared tier regions
- run_sql: Execute SQL operations on a connected workspace
  - Arguments: workspace_id, database, sql_query, and connection parameters
  - Returns the results of the SQL query in a structured format
- create_notebook_file: Create a new notebook file in SingleStore Spaces
  - Arguments: notebook_name, content (optional)
  - Returns details of the created notebook
- upload_notebook_file: Upload a notebook file to SingleStore Spaces
  - Arguments: file_path, notebook_name
  - Returns details of the uploaded notebook
- create_job_from_notebook: Create a scheduled job from a notebook
  - Arguments: job configuration including notebook_path, schedule_mode, etc.
  - Returns details of the created job
- get_job: Retrieve details of an existing job
  - Arguments: job_id (string)
  - Returns details of the specified job
- delete_job: Delete an existing job
  - Arguments: job_id (string)
  - Deletes the specified job
- stage_list_files: List files and folders in a Stage deployment's file system
  - Arguments: deployment_id (string), path (string, optional)
  - Returns folder contents including files and subfolders
- stage_get_file: Get a file from Stage by path
  - Arguments: deployment_id (string), path (string), return_type (string: 'metadata', 'url', or 'content')
  - Returns file metadata, a download URL, or text content
- stage_create_folder: Create a folder in Stage
  - Arguments: deployment_id (string), path (string)
  - Returns creation status
- stage_upload_file: Upload a file to Stage with text content
  - Arguments: deployment_id (string), path (string), content (string), local_path (string)
  - Returns upload status
- stage_move: Move or rename a file or folder in Stage
  - Arguments: deployment_id (string), source_path (string), destination_path (string)
  - Returns move status
- stage_delete: Delete a file or folder from Stage
  - Arguments: deployment_id (string), path (string)
  - Returns deletion status
Note: Organization management tools (choose_organization and set_organization) are only available when the API key environment variable is not set, allowing for interactive organization selection during OAuth authentication.
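As an illustration of what an MCP client sends under the hood when you ask it to run a query, a tools/call request for run_sql might look like the following sketch; the workspace_id, database, and sql_query values are placeholders, and in practice your client builds this JSON-RPC message for you from a natural-language request:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "run_sql",
    "arguments": {
      "workspace_id": "your-workspace-id",
      "database": "your-database",
      "sql_query": "SELECT 1"
    }
  }
}
```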
Development
Prerequisites
- Python >= 3.11
- uv for dependency management
Setup
- Clone the repository:
git clone https://github.com/singlestore-labs/mcp-server-singlestore.git
cd mcp-server-singlestore
- Install dependencies:
uv sync --dev
- Set up pre-commit hooks (optional but recommended):
uv run pre-commit install
Development Workflow
# Quick quality checks (fast feedback)
./scripts/check.sh
# Run tests independently
./scripts/test.sh
# Comprehensive validation (before PRs)
./scripts/check-all.sh
# Create and publish releases
./scripts/release.sh
Running Tests
# Run test suite with coverage
./scripts/test.sh
# Or use pytest directly
uv run pytest
uv run pytest --cov=src --cov-report=html
Code Quality
We use Ruff for both linting and formatting:
# Format code
uv run ruff format src/ tests/
# Lint code
uv run ruff check src/ tests/
# Lint and fix issues automatically
uv run ruff check --fix src/ tests/
Release Process
Releases are managed through git tags and automated PyPI publication:
- Create release: ./scripts/release.sh (interactive tool)
- Automatic publication: triggered by pushing version tags
- No manual PyPI uploads: the pipeline is fully automated
See scripts/dev-workflow.md for detailed workflow documentation.