SingleStore MCP Server
Model Context Protocol (MCP) is a standardized protocol for managing context between large language models (LLMs) and external systems. This repository provides an installer and an MCP server for SingleStore, enabling seamless integration.
With MCP, you can use Claude Desktop, Claude Code, Cursor, or any compatible MCP client to interact with SingleStore using natural language, making complex operations effortless.
💡 Pro Tip: Not sure what the MCP server can do? Just call the /help prompt in your chat!
Requirements
- Python >= v3.10.0
- uvx installed in your Python environment
- VS Code, Cursor, Windsurf, Claude Desktop, Claude Code, Goose, or any other MCP client
Getting started
First, install the SingleStore MCP server with your client.
The standard config below works in most of these tools:
{
  "mcpServers": {
    "singlestore-mcp-server": {
      "command": "uvx",
      "args": [
        "singlestore-mcp-server",
        "start"
      ]
    }
  }
}
No API keys, tokens, or environment variables required! The server automatically handles authentication via browser OAuth when started.
Claude Desktop
Automatic setup:
uvx singlestore-mcp-server init --client=claude-desktop
Manual setup: Follow the MCP install guide, use the standard config above.
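If you prefer editing the config file directly, Claude Desktop reads its MCP configuration from a JSON file. The paths below are the usual defaults and may vary by platform or version:

```shell
# Typical locations of claude_desktop_config.json (defaults; may vary):
#   macOS:   ~/Library/Application Support/Claude/claude_desktop_config.json
#   Windows: %APPDATA%\Claude\claude_desktop_config.json
# Add the standard "mcpServers" entry shown above, then restart Claude Desktop.
```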
Claude Code
Automatic setup:
uvx singlestore-mcp-server init --client=claude-code
This will automatically run the Claude CLI command for you.
Manual setup:
claude mcp add singlestore-mcp-server uvx singlestore-mcp-server start
Cursor
Automatic setup:
uvx singlestore-mcp-server init --client=cursor
Manual setup:
Go to Cursor Settings -> MCP -> Add new MCP Server. Name it as you like, select the command type, and set the command to uvx singlestore-mcp-server start. You can also verify the config or add command-line arguments by clicking Edit.
VS Code
Automatic setup:
uvx singlestore-mcp-server init --client=vscode
Manual setup: Follow the MCP install guide, use the standard config above. You can also install using the VS Code CLI:
code --add-mcp '{"name":"singlestore-mcp-server","command":"uvx","args":["singlestore-mcp-server","start"]}'
After installation, the SingleStore MCP server will be available for use with your GitHub Copilot agent in VS Code.
Windsurf
Automatic setup:
uvx singlestore-mcp-server init --client=windsurf
Manual setup: Follow the Windsurf MCP documentation and use the standard config above.
Gemini CLI
Automatic setup:
uvx singlestore-mcp-server init --client=gemini
Manual setup: Follow the MCP install guide, use the standard config above.
LM Studio
Automatic setup:
uvx singlestore-mcp-server init --client=lm-studio
Manual setup:
Go to Program in the right sidebar -> Install -> Edit mcp.json. Use the standard config above.
Goose
Manual setup only:
Go to Advanced settings -> Extensions -> Add custom extension. Name it as you like, use type STDIO, and set the command to uvx singlestore-mcp-server start. Click "Add Extension".
Qodo Gen
Manual setup only: Open the Qodo Gen chat panel in VS Code or IntelliJ → Connect more tools → + Add new MCP → paste the standard config above, then click Save.
Using Docker
NOTE: An API key is required when using Docker because the OAuth flow isn't supported for servers running in Docker containers.
{
  "mcpServers": {
    "singlestore-mcp-server": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm", "--init", "--pull=always",
        "-e", "MCP_API_KEY=your_api_key_here",
        "singlestore/mcp-server-singlestore"
      ]
    }
  }
}
You can build the Docker image yourself:
docker build -t singlestore/mcp-server-singlestore .
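To sanity-check the image outside an MCP client, you can run the same command line the Docker config above uses. MCP_API_KEY is a placeholder for a real SingleStore API key:

```shell
# Runs the server on stdio; it will wait for an MCP client on stdin.
# Replace your_api_key_here with a real SingleStore API key.
docker run -i --rm --init --pull=always \
  -e MCP_API_KEY=your_api_key_here \
  singlestore/mcp-server-singlestore
```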
For better security, we recommend using Docker Desktop to configure the SingleStore MCP server; see Docker's blog post on its MCP Catalog for details.
Components
Tools
The server implements the following tools:
- get_user_info: Retrieve details about the current user
  - No arguments required
  - Returns user information and details
- organization_info: Retrieve details about the user's current organization
  - No arguments required
  - Returns details of the organization
- choose_organization: Choose from available organizations (only available when the API key environment variable is not set)
  - No arguments required
  - Returns a list of available organizations to choose from
- set_organization: Set the active organization (only available when the API key environment variable is not set)
  - Arguments: organization_id (string)
  - Sets the specified organization as active
- workspace_groups_info: Retrieve details about the workspace groups accessible to the user
  - No arguments required
  - Returns details of the workspace groups
- workspaces_info: Retrieve details about the workspaces in a specific workspace group
  - Arguments: workspace_group_id (string)
  - Returns details of the workspaces
- resume_workspace: Resume a suspended workspace
  - Arguments: workspace_id (string)
  - Resumes the specified workspace
- list_starter_workspaces: List all starter workspaces accessible to the user
  - No arguments required
  - Returns details of available starter workspaces
- create_starter_workspace: Create a new starter workspace
  - Arguments: workspace configuration parameters
  - Returns details of the created starter workspace
- terminate_starter_workspace: Terminate an existing starter workspace
  - Arguments: workspace_id (string)
  - Terminates the specified starter workspace
- list_regions: Retrieve a list of all regions that support workspaces
  - No arguments required
  - Returns a list of available regions
- list_sharedtier_regions: Retrieve a list of shared tier regions
  - No arguments required
  - Returns a list of shared tier regions
- run_sql: Execute SQL operations on a connected workspace
  - Arguments: workspace_id, database, sql_query, and connection parameters
  - Returns the results of the SQL query in a structured format
- create_notebook_file: Create a new notebook file in SingleStore Spaces
  - Arguments: notebook_name, content (optional)
  - Returns details of the created notebook
- upload_notebook_file: Upload a notebook file to SingleStore Spaces
  - Arguments: file_path, notebook_name
  - Returns details of the uploaded notebook
- create_job_from_notebook: Create a scheduled job from a notebook
  - Arguments: job configuration, including notebook_path, schedule_mode, etc.
  - Returns details of the created job
- get_job: Retrieve details of an existing job
  - Arguments: job_id (string)
  - Returns details of the specified job
- delete_job: Delete an existing job
  - Arguments: job_id (string)
  - Deletes the specified job
Note: Organization management tools (choose_organization and set_organization) are only available when the API key environment variable is not set, allowing for interactive organization selection during OAuth authentication.
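Under the hood, an MCP client invokes these tools via JSON-RPC `tools/call` requests, as defined by the MCP specification. As an illustration only (the argument names match the list above, but the values here are hypothetical), a run_sql invocation might look like:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "run_sql",
    "arguments": {
      "workspace_id": "ws-example-id",
      "database": "mydb",
      "sql_query": "SELECT 1"
    }
  }
}
```

Your MCP client constructs these requests for you from natural language; you normally never write them by hand.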
Development
Prerequisites
- Python >= 3.11
- uv for dependency management
Setup
- Clone the repository:
git clone https://github.com/singlestore-labs/mcp-server-singlestore.git
cd mcp-server-singlestore
- Install dependencies:
uv sync --dev
- Set up pre-commit hooks (optional but recommended):
uv run pre-commit install
Development Workflow
# Quick quality checks (fast feedback)
./scripts/check.sh
# Run tests independently
./scripts/test.sh
# Comprehensive validation (before PRs)
./scripts/check-all.sh
# Create and publish releases
./scripts/release.sh
Running Tests
# Run test suite with coverage
./scripts/test.sh
# Or use pytest directly
uv run pytest
uv run pytest --cov=src --cov-report=html
Code Quality
We use Ruff for both linting and formatting:
# Format code
uv run ruff format src/ tests/
# Lint code
uv run ruff check src/ tests/
# Lint and fix issues automatically
uv run ruff check --fix src/ tests/
Release Process
Releases are managed through git tags and automated PyPI publication:
- Create a release: ./scripts/release.sh (interactive tool)
- Automatic publication: triggered by pushing version tags
- No manual PyPI uploads: the pipeline is fully automated
See scripts/dev-workflow.md for detailed workflow documentation.