Interact with your MLOps and LLMOps pipelines through your ZenML MCP server
This project implements a Model Context Protocol (MCP) server for interacting with the ZenML API.
The Model Context Protocol (MCP) is an open protocol that standardizes how applications provide context to Large Language Models (LLMs). It acts like a "USB-C port for AI applications" - providing a standardized way to connect AI models to different data sources and tools.
MCP follows a client-server architecture in which host applications (like Claude Desktop or Cursor) run MCP clients that maintain connections to lightweight MCP servers, each exposing specific capabilities and data sources.
ZenML is an open-source platform for building and managing ML and AI pipelines. It provides a unified interface for managing data, models, and experiments.
For more information, see the ZenML website and our documentation.
The server provides MCP tools that expose core read functionality from your ZenML server, giving you a way to get live information about your pipelines, pipeline runs, stacks, artifacts, and other ZenML resources.
It also allows you to trigger new pipeline runs (if a run template is present).
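To give a concrete (and deliberately simplified) sense of what such a tool looks like, here is a minimal sketch using the official MCP Python SDK's FastMCP class together with the ZenML client. This is illustrative only; the tool names and return formats in the repository's zenml_server.py may differ.

# Illustrative sketch of a read-only MCP tool backed by the ZenML client.
from mcp.server.fastmcp import FastMCP
from zenml.client import Client

mcp = FastMCP("ZenML MCP Server")

@mcp.tool()
def list_pipelines() -> str:
    """List the names of pipelines registered on the connected ZenML server."""
    client = Client()  # reads ZENML_STORE_URL / ZENML_STORE_API_KEY from the environment
    names = [p.name for p in client.list_pipelines().items]
    return ", ".join(names) or "No pipelines found."

if __name__ == "__main__":
    mcp.run()  # serve over stdio so MCP clients (Claude Desktop, Cursor) can call the tool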
Note: This is a beta/experimental release. We're still exploring how people will use this integration, so we welcome your feedback and suggestions! Please join our Slack community to share your experience and help us improve.
You will need to have access to a ZenML Cloud server. If you don't have one, you can sign up for a free trial at ZenML Cloud.
You will also need to have uv installed locally. For more information, see the uv documentation. We recommend installing it via their installer script or via brew if you are using a Mac.
You will also need to clone this repository somewhere locally:
git clone https://github.com/zenml-io/mcp-zenml.git
The MCP config file is a JSON file that tells the MCP client how to connect to your MCP server. Different MCP clients will use or specify this differently. Two commonly-used MCP clients are Claude Desktop and Cursor, for which we provide installation instructions below.
You will need to specify your ZenML MCP server in the following format:
{
  "mcpServers": {
    "zenml": {
      "command": "/usr/local/bin/uv",
      "args": ["run", "path/to/zenml_server.py"],
      "env": {
        "LOGLEVEL": "INFO",
        "NO_COLOR": "1",
        "PYTHONUNBUFFERED": "1",
        "PYTHONIOENCODING": "UTF-8",
        "ZENML_STORE_URL": "https://your-zenml-server-goes-here.com",
        "ZENML_STORE_API_KEY": "your-api-key-here"
      }
    }
  }
}
There are four dummy values that you will need to replace:
- the path to your uv executable (the path listed above is where it would be on a Mac if you installed it via brew)
- the path to the zenml_server.py file (this is the file that will be run when you connect to the MCP server). This file is located at the root of this repository, and you will need to specify the exact full path to it.
- your ZenML server URL, which will look something like https://d534d987a-zenml.cloudinfra.zenml.io
- your ZenML API key

You are free to change the way you run the MCP server Python file, but using uv will probably be the easiest option since it handles environment and dependency installation for you.
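Before wiring the server into an MCP client, you can optionally sanity-check the two ZenML values with a short Python snippet. This assumes the zenml package is installed in the environment you run it from; the environment variable names are the same ones used in the config above.

# Optional sanity check that your ZenML server URL and API key are valid.
import os

os.environ["ZENML_STORE_URL"] = "https://your-zenml-server-goes-here.com"
os.environ["ZENML_STORE_API_KEY"] = "your-api-key-here"

from zenml.client import Client

client = Client()  # picks up the store URL and API key from the environment
print(client.zen_store.get_store_info())  # prints server details if the connection works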
You will need to have Claude Desktop installed.
Once you have installed and opened Claude Desktop, you need to open the 'Settings' menu and click on the 'Developer' tab. There will be an 'Edit Config' button which will open up a file explorer showing you the location of your config file.
You should paste the contents of the (properly filled in) config file above into the JSON file revealed in the file explorer. Then just restart Claude Desktop and it will use the new config. You should be able to see the ZenML server in the developer settings menu. Chat with Claude and it will use all the new tools you just gave it access to.
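If Claude doesn't seem to pick up the server, one quick check is that your edits left the config file as valid JSON. The snippet below loads it and lists the configured servers; the path shown is the usual macOS location, so adjust it if the 'Edit Config' button revealed a different one.

# Optional: confirm the Claude Desktop config parses and contains the zenml entry.
import json
from pathlib import Path

# Usual macOS location; adjust for your platform if needed.
config_path = Path.home() / "Library" / "Application Support" / "Claude" / "claude_desktop_config.json"
config = json.loads(config_path.read_text())
print(list(config.get("mcpServers", {}).keys()))  # should include "zenml"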
For a better experience with ZenML tool results, you can configure Claude to display the JSON responses in a more readable format. In Claude Desktop, go to Settings → Profile, and in the "What personal preferences should Claude consider in responses?" section, add something like the following (or use these exact words!):
When using zenml tools which return JSON strings and you're asked a question, you might want to consider using markdown tables to summarize the results or make them easier to view!
This will encourage Claude to format ZenML tool outputs as markdown tables, making the information much easier to read and understand.
You will need to have Cursor installed.
Cursor works slightly differently to Claude Desktop in that you specify the config file on a per-repository basis. This means that if you want to use the ZenML MCP server in multiple repos, you will need to specify the config file in each of them.
To set it up for a single repository, you will need to:
- create a .cursor folder in the root of your repository
- inside it, create an mcp.json file with the content above

In our experience, Cursor sometimes shows a red error indicator even though the server is working. You can try it out by chatting in the Cursor chat window; it will let you know whether it is able to access the ZenML tools or not.
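If you prefer to script this step, a small helper like the one below (hypothetical, reusing the values from the config above and omitting the extra env entries for brevity) writes the file into the right place.

# Hypothetical helper: write the MCP config into .cursor/mcp.json for Cursor.
import json
from pathlib import Path

config = {
    "mcpServers": {
        "zenml": {
            "command": "/usr/local/bin/uv",
            "args": ["run", "/full/path/to/zenml_server.py"],
            "env": {
                "ZENML_STORE_URL": "https://your-zenml-server-goes-here.com",
                "ZENML_STORE_API_KEY": "your-api-key-here",
            },
        }
    }
}

cursor_dir = Path(".cursor")  # run this from the root of your repository
cursor_dir.mkdir(exist_ok=True)
(cursor_dir / "mcp.json").write_text(json.dumps(config, indent=2))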