Open source MCP server specializing in easy, fast, and secure tools for Databases.
[!NOTE] MCP Toolbox for Databases is currently in beta, and may see breaking changes until the first stable release (v1.0).
MCP Toolbox for Databases is an open source MCP server for databases. It enables you to develop tools more easily, quickly, and securely by handling complexities such as connection pooling, authentication, and more.
This README provides a brief overview. For comprehensive details, see the full documentation.
[!NOTE] This solution was originally named “Gen AI Toolbox for Databases” as its initial development predated MCP, but was renamed to align with recently added MCP compatibility.
Toolbox helps you build Gen AI tools that let your agents access data in your database.
⚡ Supercharge Your Workflow with an AI Database Assistant ⚡
Stop context-switching and let your AI assistant become a true co-developer. By connecting your IDE to your databases with MCP Toolbox, you can delegate complex and time-consuming database tasks, allowing you to build faster and focus on what matters. This isn't just about code completion; it's about giving your AI the context it needs to handle the entire development lifecycle.
To get started, learn how to connect your AI tools (IDEs) to Toolbox using MCP.
Toolbox sits between your application's orchestration framework and your database, providing a control plane that is used to modify, distribute, or invoke tools. It simplifies the management of your tools by providing you with a centralized location to store and update tools, allowing you to share tools between agents and applications and update those tools without necessarily redeploying your application.
For the latest version, check the releases page and use the following instructions for your OS and CPU architecture.
To install Toolbox as a binary:
# see releases page for other versions
export VERSION=0.8.0
curl -O https://storage.googleapis.com/genai-toolbox/v$VERSION/linux/amd64/toolbox
chmod +x toolbox
# see releases page for other versions
export VERSION=0.8.0
docker pull us-central1-docker.pkg.dev/database-toolbox/toolbox/toolbox:$VERSION
To install from source, ensure you have the latest version of Go installed, and then run the following command:
go install github.com/googleapis/genai-toolbox@v0.8.0
Configure a tools.yaml to define your tools, and then execute toolbox to start the server:
./toolbox --tools-file "tools.yaml"
[!NOTE] Toolbox enables dynamic reloading by default. To disable, use the --disable-reload flag.
You can use toolbox help for a full list of flags! To stop the server, send a terminate signal (ctrl+c on most platforms).
For more detailed documentation on deploying to different environments, check out the resources in the How-to section.
Once your server is up and running, you can load the tools into your application. See the list of client SDKs below for using Toolbox with various frameworks:
Install Toolbox Core SDK:
pip install toolbox-core
Load tools:
from toolbox_core import ToolboxClient
# update the url to point to your server
async with ToolboxClient("http://127.0.0.1:5000") as client:
# these tools can be passed to your application!
tools = await client.load_toolset("toolset_name")
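Loaded tools are async callables that take the tool's parameters as keyword arguments. Here is a minimal sketch of invoking one directly; the tool name and parameter (my-tool, question) are placeholders for whatever you define in your tools.yaml:

```python
import asyncio

from toolbox_core import ToolboxClient


async def run():
    # update the url to point to your server
    async with ToolboxClient("http://127.0.0.1:5000") as client:
        # "my-tool" and "question" are placeholders for a tool and parameter
        # defined in your tools.yaml
        tool = await client.load_tool("my-tool")
        result = await tool(question="What data do you have?")
        print(result)


asyncio.run(run())
```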
For more detailed instructions on using the Toolbox Core SDK, see the project's README.
Install Toolbox LangChain SDK:
pip install toolbox-langchain
Load tools:
from toolbox_langchain import ToolboxClient
# update the url to point to your server
async with ToolboxClient("http://127.0.0.1:5000") as client:
# these tools can be passed to your application!
tools = client.load_toolset()
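The returned tools are standard LangChain tools, so they can be handed straight to an agent. A minimal sketch using a LangGraph ReAct agent; the model choice and prompt are illustrative and assume langgraph and langchain-google-genai are installed:

```python
import asyncio

from langchain_google_genai import ChatGoogleGenerativeAI
from langgraph.prebuilt import create_react_agent
from toolbox_langchain import ToolboxClient


async def run():
    model = ChatGoogleGenerativeAI(model="gemini-2.0-flash")
    # update the url to point to your server
    async with ToolboxClient("http://127.0.0.1:5000") as client:
        tools = client.load_toolset()
        # build a ReAct agent that can call the Toolbox tools
        agent = create_react_agent(model, tools)
        result = agent.invoke({"messages": [("user", "Find hotels called Hilton")]})
        print(result["messages"][-1].content)


asyncio.run(run())
```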
For more detailed instructions on using the Toolbox LangChain SDK, see the project's README.
Install Toolbox LlamaIndex SDK:
pip install toolbox-llamaindex
Load tools:
from toolbox_llamaindex import ToolboxClient
# update the url to point to your server
async with ToolboxClient("http://127.0.0.1:5000") as client:
# these tools can be passed to your application!
tools = client.load_toolset()
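These tools plug into a LlamaIndex agent. A rough sketch, assuming a LlamaIndex version that still ships the classic ReActAgent.from_tools API (newer releases favor workflow-based agents) and using an OpenAI LLM purely for illustration:

```python
import asyncio

from llama_index.core.agent import ReActAgent
from llama_index.llms.openai import OpenAI
from toolbox_llamaindex import ToolboxClient


async def run():
    llm = OpenAI(model="gpt-4o-mini")  # any LlamaIndex-supported LLM works
    # update the url to point to your server
    async with ToolboxClient("http://127.0.0.1:5000") as client:
        tools = client.load_toolset()
        agent = ReActAgent.from_tools(tools, llm=llm, verbose=True)
        response = agent.chat("Find hotels called Hilton")
        print(response)


asyncio.run(run())
```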
For more detailed instructions on using the Toolbox LlamaIndex SDK, see the project's README.
Install Toolbox Core SDK:
npm install @toolbox-sdk/core
Load tools:
import { ToolboxClient } from '@toolbox-sdk/core';
// update the url to point to your server
const URL = 'http://127.0.0.1:5000';
let client = new ToolboxClient(URL);
// these tools can be passed to your application!
const tools = await client.loadToolset('toolsetName');
For more detailed instructions on using the Toolbox Core SDK, see the project's README.
Install Toolbox Core SDK:
npm install @toolbox-sdk/core
Load tools:
import { ToolboxClient } from '@toolbox-sdk/core';
import { tool } from '@langchain/core/tools';
// update the url to point to your server
const URL = 'http://127.0.0.1:5000';
let client = new ToolboxClient(URL);
// these tools can be passed to your application!
const toolboxTools = await client.loadToolset('toolsetName');
// Define the basics of the tool: name, description, schema and core logic
const getTool = (toolboxTool) => tool(toolboxTool, {
name: toolboxTool.getName(),
description: toolboxTool.getDescription(),
schema: toolboxTool.getParamSchema()
});
// Use these tools in your LangChain/LangGraph applications
const tools = toolboxTools.map(getTool);
Install Toolbox Core SDK:
npm install @toolbox-sdk/core
Load tools:
import { ToolboxClient } from '@toolbox-sdk/core';
import { genkit } from 'genkit';
import { googleAI } from '@genkit-ai/googleai';
// Initialise genkit
const ai = genkit({
plugins: [
googleAI({
apiKey: process.env.GEMINI_API_KEY || process.env.GOOGLE_API_KEY
})
],
model: googleAI.model('gemini-2.0-flash'),
});
// update the url to point to your server
const URL = 'http://127.0.0.1:5000';
let client = new ToolboxClient(URL);
// these tools can be passed to your application!
const toolboxTools = await client.loadToolset('toolsetName');
// Define the basics of the tool: name, description, schema and core logic
const getTool = (toolboxTool) => ai.defineTool({
name: toolboxTool.getName(),
description: toolboxTool.getDescription(),
inputSchema: toolboxTool.getParamSchema()
}, toolboxTool)
// Use these tools in your Genkit applications
const tools = toolboxTools.map(getTool);
The primary way to configure Toolbox is through the tools.yaml file. If you have multiple files, you can tell toolbox which to load with the --tools-file tools.yaml flag.
You can find more detailed reference documentation for all resource types in the Resources section.
The sources section of your tools.yaml defines what data sources your Toolbox should have access to. Most tools will have at least one source to execute against.
sources:
my-pg-source:
kind: postgres
host: 127.0.0.1
port: 5432
database: toolbox_db
user: toolbox_user
password: my-password
For more details on configuring different types of sources, see the Sources section.
The tools section of a tools.yaml defines the actions an agent can take: what kind of tool it is, which source(s) it affects, what parameters it uses, etc.
tools:
search-hotels-by-name:
kind: postgres-sql
source: my-pg-source
description: Search for hotels based on name.
parameters:
- name: name
type: string
description: The name of the hotel.
statement: SELECT * FROM hotels WHERE name ILIKE '%' || $1 || '%';
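When this tool is invoked, the name parameter is bound to the $1 placeholder in the statement. For example, calling it through the Python Core SDK shown earlier (a sketch; the hotel name is illustrative):

```python
import asyncio

from toolbox_core import ToolboxClient


async def run():
    # update the url to point to your server
    async with ToolboxClient("http://127.0.0.1:5000") as client:
        search_hotels = await client.load_tool("search-hotels-by-name")
        # "name" maps to the $1 placeholder in the SQL statement above
        results = await search_hotels(name="Hilton")
        print(results)


asyncio.run(run())
```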
For more details on configuring different types of tools, see the Tools section.
The toolsets section of your tools.yaml allows you to define groups of tools that you want to be able to load together. This can be useful for defining different groups based on agent or application.
toolsets:
my_first_toolset:
- my_first_tool
- my_second_tool
my_second_toolset:
- my_second_tool
- my_third_tool
You can load toolsets by name:
# This will load all tools
all_tools = client.load_toolset()
# This will only load the tools listed in 'my_second_toolset'
my_second_toolset = client.load_toolset("my_second_toolset")
This project uses semantic versioning, with a MAJOR.MINOR.PATCH version number that increments with breaking changes (MAJOR), new backwards-compatible functionality (MINOR), and backwards-compatible bug fixes (PATCH).
The public API that this applies to is the CLI associated with Toolbox, the interactions with official SDKs, and the definitions in the tools.yaml file.
Contributions are welcome. Please see the CONTRIBUTING guide to get started.
Please note that this project is released with a Contributor Code of Conduct. By participating in this project you agree to abide by its terms. See Contributor Code of Conduct for more information.