Exasol MCP Server
Provides an LLM with access to the Exasol database via MCP tools. Includes tools for reading database metadata and executing data-reading queries.
🚀 Features
- Collects database metadata.
- Enumerates existing database objects, including schemas, tables, views, functions, and UDF scripts.
- Provides a filtering mechanism to use with object enumeration.
- Describes database objects: for tables, returns the list of columns and constraints; for functions and scripts, the list of input and output parameters.
- Enables keyword search of database objects.
- Executes a provided SQL query.
The complete list of tools is formally described in the Tool List section of the user guide.
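As a sketch of how these tools might be enumerated programmatically, the snippet below connects to the server with the MCP Python SDK (package `mcp`) over stdio and prints each tool's name and description. The `uvx` command and the `EXA_*` environment variables mirror the configuration example further down; the SDK usage is an assumption about your client setup, not part of this server.

```python
# Hedged sketch: listing the server's MCP tools with the MCP Python SDK.
# Assumes `uvx` is on PATH and the EXA_* variables point at a reachable database.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

params = StdioServerParameters(
    command="uvx",
    args=["exasol-mcp-server@latest"],
    env={
        "EXA_DSN": "my-dsn",
        "EXA_USER": "my-user-name",
        "EXA_PASSWORD": "my-password",
    },
)


async def list_tools() -> None:
    # Launch the server as a subprocess and speak MCP over its stdio pipes.
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.list_tools()
            for tool in result.tools:
                print(f"{tool.name}: {tool.description}")


if __name__ == "__main__":
    asyncio.run(list_tools())
```

Any MCP-compliant client would see the same tool list; the authoritative description of each tool remains the Tool List section of the user guide.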
🔌️ Prerequisites
- Python >= 3.10
- MCP Client application, e.g. Claude Desktop
💾 Installation
Ensure the uv package is installed. If uncertain, run:

```shell
uv --version
```
To install uv on macOS, use Homebrew:

```shell
brew install uv
```
For other operating systems, please follow the instructions in the uv official documentation.
🧠 Using the server with Claude Desktop
To let Claude Desktop use the Exasol MCP server, the server must be listed in the configuration file claude_desktop_config.json. Most other MCP client applications have a similar configuration file.
To find the Claude Desktop configuration file, open Settings and navigate to the “Developer” tab. This section contains options for configuring MCP servers and other developer features. Click the “Edit Config” button to open the configuration file in the editor of your choice.
Add the Exasol MCP server to the list of MCP servers as shown in this configuration example.
```json
{
  "mcpServers": {
    "exasol_db": {
      "command": "uvx",
      "args": ["exasol-mcp-server@latest"],
      "env": {
        "EXA_DSN": "my-dsn",
        "EXA_USER": "my-user-name",
        "EXA_PASSWORD": "my-password"
      }
    }
  }
}
```
With these settings, uvx will execute the latest version of exasol-mcp-server in an ephemeral environment, without installing it.
Alternatively, the exasol-mcp-server can be installed using the command:
```shell
uv tool install exasol-mcp-server@latest
```
For further details on installing and upgrading the server using uv, see the uv Tools documentation.
If the server is installed, the Claude configuration file should look like this:
```json
{
  "mcpServers": {
    "exasol_db": {
      "command": "exasol-mcp-server",
      "env": "same as above"
    }
  }
}
```
Please note that any changes to the Claude configuration file will only take effect after restarting Claude Desktop.
🟠 🟢 Running modes
The MCP server can be deployed either locally, as described above, or as a remote HTTP server. To run the server as a direct HTTP server, execute the command:

```shell
exasol-mcp-server-http --host <host> --port <port>
```

The host defaults to 0.0.0.0.
This command provides a simple way to verify the setup for a remote MCP server deployment. For a production environment, one might consider using an ASGI server like Uvicorn. The most flexible approach is implementing a wrapper around the Exasol MCP server that provides the desired control options. For further information and ideas, please check the HTTP Deployment section in the FastMCP documentation.
Here is example code creating the Exasol MCP server in a wrapper:

```python
from exasol.ai.mcp.server import mcp_server

exasol_mcp = mcp_server()
```
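Building on the wrapper above, the sketch below shows one way such a wrapper could serve the Exasol MCP server under Uvicorn. It assumes the object returned by `mcp_server()` is a FastMCP instance exposing `http_app()` (as FastMCP 2.x does) and that `uvicorn` is installed; treat it as a starting point under those assumptions, not a prescribed deployment.

```python
# Deployment sketch (assumptions: mcp_server() returns a FastMCP instance,
# uvicorn is installed). Adjust host/port and add middleware, auth, etc.
# as your environment requires.
import uvicorn

from exasol.ai.mcp.server import mcp_server

exasol_mcp = mcp_server()

# http_app() yields a standard ASGI application, so any ASGI server works.
app = exasol_mcp.http_app()

if __name__ == "__main__":
    # Bind to localhost explicitly instead of the 0.0.0.0 default.
    uvicorn.run(app, host="127.0.0.1", port=8000)
```

Because `app` is a plain ASGI application, the same wrapper also works with other ASGI servers or behind a reverse proxy.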
🔧 Configuration settings
The server is configured using environment variables and, optionally, a JSON file. In the example above, the server is provided with the database connection parameters only; all other settings are left at their defaults. For information on how to customize the server settings, please see the Server Setup section in the User Guide.
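A launcher script can assemble these environment variables itself before starting the server, which keeps credentials out of the MCP client's config file. The sketch below uses the `EXA_DSN`, `EXA_USER`, and `EXA_PASSWORD` variables from the configuration example above; the `exasol_env` helper is hypothetical, introduced here only for illustration.

```python
# Sketch: passing connection settings to the server via environment variables.
# The variable names come from the configuration example above; the helper
# function is hypothetical, not part of exasol-mcp-server.
import os


def exasol_env(dsn: str, user: str, password: str) -> dict[str, str]:
    """Build the environment variables the server reads for its connection."""
    return {"EXA_DSN": dsn, "EXA_USER": user, "EXA_PASSWORD": password}


# Merge onto the current environment, e.g. for subprocess.run(
#     ["exasol-mcp-server"], env=env
# ).
env = {**os.environ, **exasol_env("my-dsn", "my-user-name", "my-password")}
print(env["EXA_DSN"])
```

In practice the values would come from a secrets manager or a local `.env` file rather than being hard-coded.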
📜 License
This project is licensed under the MIT License - see the LICENSE file for details.
Safe Harbor Statement: Exasol MCP Server & AI Solutions
Exasol’s AI solutions (including MCP Server) are designed to enable intelligent, autonomous, and highly performant access to data through AI and LLM-powered agents. While these technologies unlock powerful new capabilities, they also introduce potentially significant risks.
By granting AI agents access to your database, you acknowledge that the behavior of large language models (LLMs) and autonomous agents cannot be fully predicted or controlled. These systems may exhibit unintended or unsafe behavior—including but not limited to hallucinations, susceptibility to adversarial prompts, and the execution of unforeseen actions. Such behavior may result in data leakage, unauthorized data generation, or even data modification or deletion.
Exasol provides the tools to build AI-native workflows; however, you, as the implementer and system owner, assume full responsibility for managing these solutions within your environment. This includes establishing appropriate governance, authorization controls, sandboxing mechanisms, and operational guardrails to mitigate risks to your organization, your customers, and their data.
📚 Documentation
For further details, check out the latest documentation.