MCP server for SQLite files. Supports Datasette-compatible metadata!
Provide useful data to AI agents without giving them access to external systems. Compatible with Datasette for human users!
mcp-sqlite exposes the following MCP tools:

- `sqlite_get_catalog` — returns the catalog of databases, tables, and columns, combined with the descriptions from your metadata file.
- `sqlite_execute` — runs an arbitrary SQL query against the database and returns the results.
- `sqlite_execute_main_{tool name}` — canned queries defined in your metadata file, each exposed as its own callable tool.

Create a Datasette-compatible metadata file `titanic.yml` for your dataset:
```yaml
databases:
  titanic:
    tables:
      Observation:
        description: Main table connecting passenger attributes to observed outcomes.
        columns:
          survived: "0/1 indicator whether the passenger survived."
          age: The passenger's age at the time of the crash.
          # Other columns are not documented but are still visible to the AI agent
    queries:
      get_survivors_of_age:
        title: Count survivors of a specific age
        description: Returns the total counts of passengers and survivors, both for all ages and for a specific provided age.
        sql: |-
          select
            count(*) as total_passengers,
            sum(survived) as survived_passengers,
            sum(case when age = :age then 1 else 0 end) as total_specific_age,
            sum(case when age = :age and survived = 1 then 1 else 0 end) as survived_specific_age
          from Observation
```
Then add mcp-sqlite to your MCP client configuration:

```json
{
  "mcpServers": {
    "sqlite": {
      "command": "uvx",
      "args": [
        "mcp-sqlite",
        "/absolute/path/to/titanic.db",
        "--metadata",
        "/absolute/path/to/titanic.yml"
      ]
    }
  }
}
```
Your AI agent should now be able to use the mcp-sqlite tools `sqlite_get_catalog`, `sqlite_execute`, and `get_survivors_of_age`!
The same database and metadata files can be used to explore the data interactively with MCP Inspector and Datasette.
*(Screenshots: the same database and metadata explored in MCP Inspector and in Datasette.)*
Use the MCP Inspector dashboard to interact with the SQLite database the same way that an AI agent would:
```
npx @modelcontextprotocol/inspector uvx mcp-sqlite path/to/titanic.db --metadata path/to/titanic.yml
```
Since `mcp-sqlite` metadata is compatible with the Datasette metadata file, you can also explore your data with Datasette:

```
uvx datasette serve path/to/titanic.db --metadata path/to/titanic.yml
```
Compatibility with Datasette allows both AI agents and humans to easily explore the same local data!
In earlier versions of `mcp-sqlite`, the catalog was a resource instead of a tool, but resources are not as widely supported, so it was turned into a tool. If you have a use case for the catalog as a resource, open an issue and we'll bring it back!

```
usage: mcp-sqlite [-h] -m METADATA [-p PREFIX] [-v] sqlite_file

CLI command to start an MCP server for interacting with SQLite data.

positional arguments:
  sqlite_file           Path to SQLite file to serve the MCP server for.

options:
  -h, --help            show this help message and exit
  -m METADATA, --metadata METADATA
                        Path to Datasette-compatible metadata YAML or JSON file.
  -p PREFIX, --prefix PREFIX
                        Prefix for MCP tools. Defaults to no prefix.
  -v, --verbose         Be verbose. Include once for INFO output, twice for DEBUG output.
```
Hiding a table with `hidden: true` will hide it from the catalog returned by the MCP tool `sqlite_get_catalog()`. However, note that the table will still be accessible to the AI agent! Never rely on hiding a table from the catalog as a security feature.
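As a minimal sketch, a hypothetical `Crew` table could be hidden using the table-level `hidden` key from Datasette's metadata format (the table name here is purely illustrative):

```yaml
databases:
  titanic:
    tables:
      Crew:
        hidden: true  # omitted from the sqlite_get_catalog() listing, but still queryable via sqlite_execute
```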
Canned queries are each turned into a separate callable MCP tool by mcp-sqlite. For example, a query named `my_canned_query` will become a tool `my_canned_query`.
The canned queries functionality is still in active development, with more features planned:
| Datasette query feature | Supported in mcp-sqlite? |
|---|---|
| Displayed in catalog | ✅ |
| Executable | ✅ |
| Titles | ✅ |
| Descriptions | ✅ |
| Parameters | ✅ |
| Explicit parameters | ❌ (planned) |
| Hide SQL | ✅ |
| Write restrictions on canned queries | ✅ |
| Pagination | ❌ (planned) |
| Cross-database queries | ❌ (planned) |
| Fragments | ❌ (not planned) |
| Magic parameters | ❌ (not planned) |
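As a rough sketch of how several supported options could combine in the metadata file, here is a hypothetical write query. The `add_observation` query name is invented, and `hide_sql` and `write` are the keys Datasette uses for these features, so verify that mcp-sqlite honors them the same way in your version:

```yaml
databases:
  titanic:
    queries:
      add_observation:  # hypothetical canned query, exposed as its own MCP tool
        title: Add an observation
        description: Inserts a new passenger observation with the given age and survival flag.
        hide_sql: true  # Datasette option to hide the SQL text from human users
        write: true     # Datasette option marking the query as one that modifies data
        sql: |-
          insert into Observation (age, survived)
          values (:age, :survived)
```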