MCP servers for Deephaven to orchestrate data workers and power documentation Q&A with LLMs, enabling AI-driven data workflows.
Deephaven MCP, which implements the Model Context Protocol (MCP) standard, provides tools to orchestrate, inspect, and interact with Deephaven Community Core servers, and to access conversational documentation via LLM-powered Docs Servers. It's designed for data scientists, engineers, and anyone looking to leverage Deephaven's capabilities through programmatic interfaces or integrated LLM tools.
Key Capabilities:

- Systems Server: Manages and connects to multiple Deephaven Community Core worker nodes and Deephaven Enterprise systems, allowing unified control of and interaction with your Deephaven instances from various client applications.
- Docs Server: Provides access to an LLM-powered conversational Q&A interface for Deephaven documentation. Get answers to your Deephaven questions in natural language.
```mermaid
graph TD
    A["MCP Clients (Claude Desktop, etc.)"] --"stdio (MCP)"--> B("MCP Systems Server")
    B --"Manages"--> C("Deephaven Community Core Worker 1")
    B --"Manages"--> D("Deephaven Community Core Worker N")
    B --"Manages"--> E("Deephaven Enterprise System 1")
    B --"Manages"--> F("Deephaven Enterprise System N")
    E --"Manages"--> G("Enterprise Worker 1.1")
    E --"Manages"--> H("Enterprise Worker 1.N")
    F --"Manages"--> I("Enterprise Worker N.1")
    F --"Manages"--> J("Enterprise Worker N.N")
```
Clients connect to the MCP Systems Server, which in turn manages and communicates with Deephaven Community Core workers and Deephaven Enterprise systems.
```mermaid
graph TD
    A["MCP Clients with streamable-http support"] --"streamable-http (direct)"--> B("MCP Docs Server")
    C["MCP Clients without streamable-http support"] --"stdio"--> D["mcp-proxy"]
    D --"streamable-http"--> B
    B --"Accesses"--> E["Deephaven Documentation Corpus via Inkeep API"]
```

Modern MCP clients can connect directly via streamable-http for optimal performance. Clients without native streamable-http support can use `mcp-proxy` to bridge stdio to streamable-http.
- `uv` (Recommended): A very fast Python package installer and resolver. If you don't have it, you can install it via `pip install uv` or see the uv installation guide.
- `venv` and `pip`: Uses Python's built-in virtual environment (`venv`) tools and `pip`.

The recommended way to install `deephaven-mcp` is from PyPI. This provides the latest stable release and is suitable for most users.
Choose one of the following Python environment and package management tools:
`uv` (Fast, Recommended)

If you have `uv` installed (or install it via `pip install uv`):
Create and activate a virtual environment with your desired Python version. `uv` works best when operating within a virtual environment. To create one (e.g., named `.venv`) using a specific Python interpreter (e.g., Python 3.9), run:

```shell
uv venv .venv -p 3.9
```
Replace `3.9` with your target Python version (e.g., `3.10`, `3.11`) or the full path to a Python executable.
Then, activate it:

- macOS/Linux: `source .venv/bin/activate`
- Windows (PowerShell): `.venv\Scripts\Activate.ps1`
- Windows (CMD): `.venv\Scripts\activate.bat`
Install `deephaven-mcp`:

```shell
uv pip install deephaven-mcp
```
This command installs `deephaven-mcp` and its dependencies into the active virtual environment. If you skipped the explicit virtual environment creation step above, `uv` might still create or use one automatically (typically `.venv` in your current directory if `UV_AUTO_CREATE_VENV` is not `false`, or a globally managed one). In any case where a virtual environment is used (either explicitly created or automatically by `uv`), ensure it remains active for manual command-line use of `dh-mcp-systems-server` or `dh-mcp-docs-server`, or if your LLM tool requires an active environment.
`pip` and `venv`

Create a virtual environment (e.g., named `.venv`):

```shell
python -m venv .venv
```

Activate it:

- macOS/Linux: `source .venv/bin/activate`
- Windows: `.venv\Scripts\activate`

Install `deephaven-mcp` into the activated virtual environment:

```shell
pip install deephaven-mcp
```

Ensure this virtual environment is active in any terminal session where you intend to run `dh-mcp-systems-server` or `dh-mcp-docs-server` manually, or if your LLM tool requires an active environment when spawning these processes.
This section explains how to configure the Deephaven MCP Systems Server to connect to and manage your Deephaven Community Core instances and Deephaven Enterprise systems. This involves creating a systems session definition file and understanding how the server locates this file.
The `deephaven_mcp.json` File

The Deephaven MCP Systems Server requires a JSON configuration file that describes the Deephaven Community Core worker instances and Deephaven Enterprise systems it can connect to.
The root of the file is a JSON object (use `{}` if no community sessions are to be configured). It may contain a top-level key `"community"` with a nested `"sessions"` key. The value of `"sessions"` is itself a JSON object where each key is a unique session name (e.g., `"local_session"`, `"prod_cluster_1_session"`) and the value is a configuration object for that session. An empty object (`{}`) signifies no sessions are configured under this key.

In addition to `"community"`, the `deephaven_mcp.json` file can optionally include an `"enterprise"` key for configuring connections to Deephaven Enterprise instances. Within the `"enterprise"` object, you can define a `"systems"` key that maps system names to their configurations. The configuration details for both `community.sessions` and `enterprise.systems` are provided below.
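Putting that structure together, a minimal skeleton looks like this (the session name is a placeholder; the fields that can go inside each session or system object are described below):

```json
{
  "community": {
    "sessions": {
      "my_session": {}
    }
  },
  "enterprise": {
    "systems": {}
  }
}
```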
The fields listed below pertain to community sessions. All community session fields are optional. Default values are applied by the server if a field is omitted. Configuration fields for enterprise systems are detailed in a subsequent section.
- `host` (string): Hostname or IP address of the Deephaven Community Core worker (e.g., `"localhost"`).
- `port` (integer): Port number for the worker connection (e.g., `10000`).
- `auth_type` (string): Authentication type. Common values include:
  - `"Anonymous"`: For no authentication (default if omitted).
  - `"Basic"`: For username/password authentication (requires `auth_token` in `"username:password"` format).
  - A custom authenticator class name (e.g., `"io.deephaven.authentication.psk.PskAuthenticationHandler"` for Pre-Shared Key authentication).
- `auth_token` (string): The authentication token. For `"Basic"` auth, this must be in `"username:password"` format. For custom authenticators, this should conform to the specific requirements of that authenticator. Ignored when `auth_type` is `"Anonymous"`. Consult your Deephaven server's authentication documentation for specifics.
- `auth_token_env_var` (string): Alternative to `auth_token`; specifies the name of an environment variable containing the authentication token (e.g., `"MY_AUTH_TOKEN"`). Mutually exclusive with `auth_token`.
- `never_timeout` (boolean): If `true`, the MCP server will attempt to configure the session to this worker to never time out. Server-side configurations may still override this.
- `session_type` (string): Specifies the type of session to create. Common values are `"groovy"` or `"python"`.
- `use_tls` (boolean): Set to `true` if the connection to the worker requires TLS/SSL.
- `tls_root_certs` (string): Absolute path to a PEM file containing trusted root CA certificates for TLS verification. If omitted, system CAs might be used, or verification might be less strict depending on the client library.
- `client_cert_chain` (string): Absolute path to a PEM file containing the client's TLS certificate chain. Used for client-side certificate authentication (mTLS).
- `client_private_key` (string): Absolute path to a PEM file containing the client's private key. Used for client-side certificate authentication (mTLS).

The `"enterprise"` key's nested `"systems"` object in `deephaven_mcp.json` is a dictionary mapping custom system names (e.g., `"prod_cluster"`, `"data_science_env"`) to their specific configuration objects. Each configuration object supports the following fields:
Required Fields:

- `connection_json_url` (string): URL to the Deephaven Enterprise server's `connection.json` file (e.g., `"https://enterprise.example.com/iris/connection.json"`). This file provides the necessary details for the client to connect to the server.
- `auth_type` (string): Specifies the authentication method. Must be one of:
  - `"password"`: For username/password authentication.
  - `"private_key"`: For authentication using a private key (e.g., SAML or other private key-based auth).

Conditional Fields (based on `auth_type`):

- If `auth_type` is `"password"`:
  - `username` (string): The username for authentication (required).
  - `password` (string): The password itself.
  - `password_env_var` (string): The name of an environment variable that holds the password (e.g., `"MY_ENTERPRISE_PASSWORD"`).
- If `auth_type` is `"private_key"`:
  - `private_key_path` (string): The absolute path to the private key file (e.g., `"/path/to/your/private_key.pem"`) (required).

Note: All paths, like `private_key_path`, should be absolute and accessible by the MCP server process.
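As an illustration of how `password` and `password_env_var` differ, a consumer of this config could resolve the effective password as sketched below. This is illustrative only: the function name is made up, and the precedence when both fields are set is an assumption, since the document does not specify one.

```python
import os

def resolve_password(system_config: dict) -> str:
    """Resolve the password for a system with auth_type "password"."""
    # Prefer a literal "password" field if present (assumed precedence).
    if "password" in system_config:
        return system_config["password"]
    # Otherwise read the environment variable named by "password_env_var".
    env_var = system_config.get("password_env_var")
    if env_var is not None:
        value = os.environ.get(env_var)
        if value is None:
            raise RuntimeError(f"environment variable {env_var} is not set")
        return value
    raise RuntimeError("no password or password_env_var configured")
```

Keeping the secret in an environment variable (`password_env_var`) avoids storing it verbatim in `deephaven_mcp.json`.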
Example `deephaven_mcp.json`:
```json
{
  "community": {
    "sessions": {
      "my_local_deephaven": {
        "host": "localhost",
        "port": 10000,
        "session_type": "python"
      },
      "psk_authenticated_session": {
        "host": "localhost",
        "port": 10001,
        "auth_type": "io.deephaven.authentication.psk.PskAuthenticationHandler",
        "auth_token": "your-shared-secret-key",
        "session_type": "python"
      },
      "basic_auth_session": {
        "host": "secure.deephaven.example.com",
        "port": 10002,
        "auth_type": "Basic",
        "auth_token": "username:password",
        "use_tls": true,
        "tls_root_certs": "/path/to/community_root.crt"
      }
    }
  },
  "enterprise": {
    "systems": {
      "prod_cluster": {
        "connection_json_url": "https://prod.enterprise.example.com/iris/connection.json",
        "auth_type": "password",
        "username": "your_username",
        "password_env_var": "ENTERPRISE_PASSWORD"
      },
      "data_science_env": {
        "connection_json_url": "https://data-science.enterprise.example.com/iris/connection.json",
        "auth_type": "private_key",
        "private_key_path": "/path/to/your/private_key.pem"
      }
    }
  }
}
```
Securing `deephaven_mcp.json`

The `deephaven_mcp.json` file can contain sensitive information such as authentication tokens, usernames, and passwords. Protect it with appropriate filesystem permissions to prevent unauthorized access. For example, on Unix-like systems (Linux, macOS), you can restrict permissions to the owner only:
```shell
chmod 600 /path/to/your/deephaven_mcp.json
```
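If you want to verify the permissions programmatically (for example, in a setup script), a short Python check like the following works on Unix-like systems. This is a convenience sketch, not something the MCP server itself does:

```python
import os
import stat

def check_config_permissions(path: str) -> str:
    """Report whether a config file is readable/writable only by its owner."""
    mode = stat.S_IMODE(os.stat(path).st_mode)
    # Any group/other bits set means the file is wider open than chmod 600.
    if mode & (stat.S_IRWXG | stat.S_IRWXO):
        return f"insecure: {path} is mode {oct(mode)}; run: chmod 600 {path}"
    return f"ok: {path} is mode {oct(mode)}"
```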
Setting `DH_MCP_CONFIG_FILE` (Informing the MCP Server)

The `DH_MCP_CONFIG_FILE` environment variable tells the Deephaven MCP Systems Server where to find your `deephaven_mcp.json` file (detailed in The `deephaven_mcp.json` File (Defining Your Community Sessions)). You will set this environment variable as part of the server launch configuration within your LLM tool, as detailed in the Configure Your LLM Tool to Use MCP Servers section.
When launched by an LLM tool, the MCP Systems Server process reads this variable to load your session definitions. For general troubleshooting, or if you need to set other environment variables such as `PYTHONLOGLEVEL` (e.g., to `DEBUG` for verbose logs), these are also typically set within the LLM tool's MCP server configuration (see Defining MCP Servers for Your LLM Tool (The `mcpServers` JSON Object)).
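Conceptually, the lookup the server performs at startup can be sketched as follows (illustrative only; this is not the actual implementation, and the function name is made up):

```python
import json
import os

def load_systems_config() -> dict:
    """Load deephaven_mcp.json from the path named by DH_MCP_CONFIG_FILE."""
    config_path = os.environ.get("DH_MCP_CONFIG_FILE")
    if not config_path:
        raise RuntimeError("DH_MCP_CONFIG_FILE environment variable is not set")
    with open(config_path) as f:
        return json.load(f)
```

This is why the `env` block in the LLM tool's launch configuration must contain `DH_MCP_CONFIG_FILE`: the server process inherits it at spawn time.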
This section details how to configure your LLM tool (e.g., Claude Desktop, GitHub Copilot) to launch and communicate with the Deephaven MCP Systems Server and the Deephaven MCP Docs Server. This involves providing a JSON configuration, known as the `"mcpServers"` object, to your LLM tool.
LLM tools that support the Model Context Protocol (MCP) can be configured to use the Deephaven MCP Community and Docs Servers. The LLM tool's configuration will typically define how to start the necessary MCP server processes.
The MCP Systems Server, launched by your LLM tool, will attempt to connect to the Deephaven Community Core instances defined in your `deephaven_mcp.json` file (pointed to by `DH_MCP_CONFIG_FILE`, as described in Setting `DH_MCP_CONFIG_FILE` (Informing the MCP Server)).
It's important to understand that your `deephaven_mcp.json` file must accurately list the systems session configurations you intend to use; the MCP server uses this configuration to know which sessions to attempt to manage.

Defining MCP Servers for Your LLM Tool (The `mcpServers` JSON Object)

Your LLM tool requires a specific JSON configuration to define how MCP servers are launched. This configuration is structured as a JSON object with a top-level key named `"mcpServers"`. This object tells the tool how to start the Deephaven MCP Systems Server (for interacting with Deephaven Community Core) and the `mcp-proxy` (for interacting with the Docs Server).
Depending on your LLM tool, this `"mcpServers"` object might be the entire content of a dedicated configuration file (e.g., `mcp.json` in VS Code). Consult your LLM tool's documentation for the precise file name and location. Below are two examples of the `"mcpServers"` JSON structure; choose the one that matches your Python environment setup (either `uv` or `pip + venv`).

Important: All paths in the JSON examples (e.g., `/full/path/to/...`) must be replaced with actual, absolute paths on your system.
Example `"mcpServers"` object for `uv` users:

```json
{
  "mcpServers": {
    "deephaven-systems": {
      "command": "uv",
      "args": [
        "--directory",
        "/full/path/to/deephaven-mcp",
        "run",
        "dh-mcp-systems-server"
      ],
      "env": {
        "DH_MCP_CONFIG_FILE": "/full/path/to/your/deephaven_mcp.json",
        "PYTHONLOGLEVEL": "INFO"
      }
    },
    "deephaven-docs": {
      "command": "uv",
      "args": [
        "--directory",
        "/full/path/to/deephaven-mcp",
        "run",
        "mcp-proxy",
        "--transport=streamablehttp",
        "https://deephaven-mcp-docs-prod.dhc-demo.deephaven.io/mcp"
      ]
    }
  }
}
```
Note: You can change `"PYTHONLOGLEVEL": "INFO"` to `"PYTHONLOGLEVEL": "DEBUG"` for more detailed server logs, as described in the Troubleshooting section.
Example `"mcpServers"` object for `pip + venv` users:

```json
{
  "mcpServers": {
    "deephaven-systems": {
      "command": "/full/path/to/your/deephaven-mcp/.venv/bin/dh-mcp-systems-server",
      "args": [],
      "env": {
        "DH_MCP_CONFIG_FILE": "/full/path/to/your/deephaven_mcp.json",
        "PYTHONLOGLEVEL": "INFO"
      }
    },
    "deephaven-docs": {
      "command": "/full/path/to/your/deephaven-mcp/.venv/bin/mcp-proxy",
      "args": [
        "--transport=streamablehttp",
        "https://deephaven-mcp-docs-prod.dhc-demo.deephaven.io/mcp"
      ]
    }
  }
}
```
Note: You can change `"PYTHONLOGLEVEL": "INFO"` to `"PYTHONLOGLEVEL": "DEBUG"` for more detailed server logs, as described in the Troubleshooting section.
Applying the `mcpServers` Configuration

The `"mcpServers"` JSON object, whose structure is detailed in Defining MCP Servers for Your LLM Tool (The `mcpServers` JSON Object), needs to be placed in a specific configuration file or settings area for your LLM tool. Here's how to integrate it with common tools:
- Claude Desktop: The `mcpServers` object should be added to the main JSON object within this file:
  - macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
  - Windows: `%APPDATA%\Claude\claude_desktop_config.json` (e.g., `C:\Users\<YourUsername>\AppData\Roaming\Claude\claude_desktop_config.json`)
  - Linux: `~/.config/Claude/claude_desktop_config.json`
- VS Code: `.vscode/mcp.json`. Add the `"mcpServers"` JSON object, as shown in the examples in Defining MCP Servers for Your LLM Tool (The `mcpServers` JSON Object).

Once you have saved the `"mcpServers"` JSON object in the correct location for your LLM tool, restart the tool (Claude Desktop, VS Code, JetBrains IDEs, etc.). The configured servers (e.g., `deephaven-systems`, `deephaven-docs`) should then be available in its MCP interface.
After restarting your LLM tool, the first step is to verify that the MCP servers are recognized:
- You should see `deephaven-systems` and `deephaven-docs` (or the names you configured in the `mcpServers` object) listed in the tool's MCP interface.
- The tools each server provides should be available (e.g., tools from the `deephaven-systems` server).

If the servers are not listed or you encounter errors at this stage, please proceed to the Troubleshooting section for guidance.
Troubleshooting

- Verify that the `DH_MCP_CONFIG_FILE` environment variable is correctly set in the JSON config and points to a valid `deephaven_mcp.json` file.
- Ensure the Deephaven Community Core instances defined in `deephaven_mcp.json` are running and accessible from the MCP Systems Server's environment.
- Validate the syntax of all JSON files (the `mcpServers` object in the LLM tool, and `deephaven_mcp.json`). A misplaced comma or incorrect quote can prevent the configuration from being parsed correctly. Use a JSON validator tool or your IDE's linting features.
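Python's standard library is enough for a quick syntax check of any of these files; the helper below is a sketch (the function name is made up), and `python -m json.tool <file>` performs the same check from the command line:

```python
import json

def validate_json_file(path: str) -> str:
    """Parse a JSON file and report success or the exact error location."""
    try:
        with open(path) as f:
            json.load(f)
        return f"ok: {path}"
    except json.JSONDecodeError as e:
        # JSONDecodeError carries the line/column of the first syntax error.
        return f"invalid: {path} at line {e.lineno}, column {e.colno}: {e.msg}"
```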
- Logging: Set `PYTHONLOGLEVEL=DEBUG` in the `env` block of your JSON config to get more detailed logs from the MCP servers. For example, Claude Desktop often saves these to files like `~/Library/Logs/Claude/mcp-server-SERVERNAME.log`. Consult your LLM tool's documentation for specific log file locations.
- Network: Check the `mcp-proxy` target URL (`https://deephaven-mcp-docs-prod.dhc-demo.deephaven.io`) if using the Docs Server, and test basic network connectivity (e.g., using `ping` or `curl` from the relevant machine) if connections are failing.
- `command not found` for `uv` (in LLM tool logs): Ensure `uv` is installed and its installation directory is in your system's `PATH` environment variable, accessible by the LLM tool.
- `command not found` for `dh-mcp-systems-server` or `mcp-proxy` (venv option, in LLM tool logs): Ensure the `command` field in your JSON config uses the correct absolute path to the executable within your `.venv/bin/` (or `.venv\Scripts\`) directory.
- Session connection problems: Verify the `deephaven_mcp.json` file (see The `deephaven_mcp.json` File (Defining Your Community Sessions) for details on its structure and content). Ensure the target Deephaven Community Core instances are running and network-accessible, and confirm that the process running the MCP Systems Server has read permissions for the `deephaven_mcp.json` file itself.

We warmly welcome contributions to Deephaven MCP! Whether it's bug reports, feature suggestions, documentation improvements, or code contributions, your help is valued.
For details on the tools exposed by the Systems Server (e.g., `refresh`, `table_schemas`) and the Docs Server (`docs_chat`), refer to the Developer & Contributor Guide. `uv` Workflow: For more details on using `uv` for project management, see docs/UV.md.

This project is licensed under the Apache 2.0 License. See the LICENSE file for details.