A Model Context Protocol (MCP) server that provides seamless integration with MLflow, enabling AI assistants to interact with MLflow experiments, runs, and registered models.
This MCP server exposes MLflow functionality through a standardized protocol, allowing AI assistants like Claude to query experiments, inspect runs, and browse registered models.
```bash
git clone <repository-url>
cd mlflow-mcp-server
uv sync
```
The server is pre-configured to connect to your internal MLflow instance:

```
YOUR URI
```
To use a different MLflow instance, modify `mlflow_mcp_server/utils/mlflow_client.py`:

```python
import mlflow
from mlflow import MlflowClient

# Point the client at your tracking server
mlflow.set_tracking_uri("your-mlflow-tracking-uri")
client = MlflowClient()
```
Add the following configuration to your MCP client (e.g., `~/.cursor/mcp.json` for Cursor):

```json
{
  "mcpServers": {
    "mlflow": {
      "command": "uvx",
      "args": ["--from", "git+https://github.com/yesid-lopez/mlflow-mcp-server", "mlflow_mcp_server"],
      "env": {
        "MLFLOW_TRACKING_URI": "YOUR_TRACKING_URI"
      }
    }
  }
}
```
If you run the server from a local checkout instead, replace `/path/to/mlflow-mcp-server` with the actual path to your project directory.
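Because the MCP client passes the tracking URI through the `MLFLOW_TRACKING_URI` environment variable, the client module can resolve it at startup instead of hardcoding it. A minimal sketch of that idea; the `resolve_tracking_uri` helper and its local fallback URI are illustrative assumptions, not part of the project:

```python
import os

def resolve_tracking_uri(default: str = "http://localhost:5000") -> str:
    """Return the MLflow tracking URI, preferring the env var set by the MCP client.

    The default fallback is an assumption for local development.
    """
    return os.environ.get("MLFLOW_TRACKING_URI", default)
```

The resolved value would then be handed to `mlflow.set_tracking_uri()` as shown earlier.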
```bash
uv run -m mlflow_mcp_server
```
Once configured, the following tools become available to your AI assistant:

- `get_experiment(experiment_id: str)` - Get experiment details by ID
- `get_experiment_by_name(experiment_name: str)` - Get experiment by name
- `search_experiments(name?: str, token?: str)` - Search experiments with optional filtering
- `get_run(run_id: str)` - Get detailed run information
- `get_experiment_runs(experiment_id: str, token?: str)` - List runs for an experiment
- `get_registered_models(model_name?: str, token?: str)` - Search registered models
- `get_model_versions(model_name?: str, token?: str)` - Browse model versions

You can now ask your AI assistant natural-language questions about your experiments, runs, and registered models.
```
mlflow-mcp-server/
├── mlflow_mcp_server/
│   ├── __main__.py           # Server entry point
│   ├── server.py             # Main MCP server configuration
│   ├── tools/                # MLflow integration tools
│   │   ├── experiment_tools.py
│   │   ├── run_tools.py
│   │   └── registered_models.py
│   └── utils/
│       └── mlflow_client.py  # MLflow client configuration
├── pyproject.toml            # Project dependencies
└── README.md
```
To add new MLflow functionality, define a function in a module under `tools/` and register it in `server.py`:

```python
from mlflow_mcp_server.tools.your_module import your_function

mcp.add_tool(your_function)
```
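As an example of the shape such a tool can take, here is a hypothetical helper that a new run-analysis tool might wrap before registering it with `mcp.add_tool`; the name `summarize_metric` and its return format are invented for illustration:

```python
def summarize_metric(values: list[float]) -> dict:
    """Summarize a metric history into a small dict an MCP tool could return."""
    if not values:
        return {"count": 0}
    return {
        "count": len(values),
        "min": min(values),
        "max": max(values),
        "last": values[-1],  # most recent logged value
    }
```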
[Add your license information here]
For issues and questions: