UML-MCP: Diagram Generation Server with MCP Interface
A diagram generation server supporting multiple UML and other diagram types, with various output formats. It integrates with rendering services like Kroki and PlantUML.
UML-MCP is a diagram generation server that implements the Model Context Protocol (MCP), so you can create diagrams from AI assistants and other MCP clients. It supports PlantUML, Mermaid, and D2, and produces diagrams optimized for various purposes (communication, design, documentation, etc.), covering Class, Sequence, Activity, Use Case, State, Component, Deployment, Object, and more.
Live: MCP endpoint · Add via Smithery
Features
- Multiple diagram types: UML (Class, Sequence, Activity, Use Case, State, Component, Deployment, Object), Mermaid, D2, Graphviz, TikZ, ERD, BlockDiag, BPMN, C4 with PlantUML
- MCP integration: Works with any client that supports MCP (Cursor, Claude Desktop, etc.)
- Output formats: SVG, PNG, PDF, JPEG (where supported by type), plus txt/base64 for some backends; optional scale for SVG
- Configurable backends: Local or remote PlantUML and Kroki
- Automatic fallback: Always tries Kroki first; if unavailable, falls back to alternative rendering services (PlantUML server for UML diagrams, Mermaid.ink for Mermaid diagrams)
Supported diagram types
| Category | Diagram types |
|---|---|
| UML | Class, Sequence, Activity, Use Case, State, Component, Deployment, Object |
| Other | Mermaid, D2, Graphviz, TikZ, ERD, BlockDiag, BPMN, C4 (PlantUML) |
Getting started
Prerequisites
Installation
Installing via Smithery
To install UML Model Context Protocol for Claude Desktop automatically via Smithery:
npx -y @smithery/cli install antoinebou12/uml --client claude
Manual installation
With uv (recommended, modern Python):
git clone https://github.com/antoinebou12/uml-mcp.git
cd uml-mcp
uv sync
With Poetry:
git clone https://github.com/antoinebou12/uml-mcp.git
cd uml-mcp
poetry install
With pip:
git clone https://github.com/antoinebou12/uml-mcp.git
cd uml-mcp
pip install -e .
For development (tests, linting, type checking) you must install dev dependencies:
uv sync --all-groups
# or: poetry install --with dev
# or: pip install -e ".[dev]"
Without --all-groups, tools like black, flake8, and isort are not installed, and uv run python -m black will fail with "No module named black". Dev tools include ruff (lint + format), ty (type check), pytest, pytest-cov, black, flake8, isort, and pre-commit.
Running the server
MCP server (single CLI entrypoint): server.py is the official MCP server. The FastAPI app in app.py provides both the REST API and the MCP-over-HTTP endpoint at /mcp for Vercel and Smithery.
Canonical entry point (MCP server using mcp_core and Kroki):
From the project root:
python server.py
Or with uv / Poetry:
uv run python server.py
# or: poetry run python server.py
# or: poetry run uml-mcp
You can also use the FastMCP CLI when fastmcp.json is present: run fastmcp run or fastmcp run fastmcp.json to start the server using the config file (CLI options override the config). For the full local CLI with options like --list-tools, use python server.py instead.
The server uses stdio by default. For HTTP:
python server.py --transport http --host 127.0.0.1 --port 8000
List available tools and exit:
python server.py --list-tools
Deploy to Vercel and publish on Smithery
To expose the server over HTTP so anyone can connect without installing (e.g. via Smithery):
- Deploy to Vercel (connect this repo; vercel.json is already configured).
- Your MCP URL will be https://<your-project>.vercel.app/mcp. Use the /mcp path, not the root URL (e.g. https://...vercel.app/mcp, not https://...vercel.app).
- Publish on Smithery: go to smithery.ai/new, choose URL (bring your own hosting), enter your MCP Server URL https://<your-project>.vercel.app/mcp, and complete the flow (Namespace: e.g. antoinebou12, Server ID: uml).
See docs/integrations/vercel_smithery.md for step-by-step instructions. If Smithery shows a 401 or "Invalid OAuth" error, your Vercel project likely has Deployment Protection on; see the Troubleshooting section there (disable protection or use a bypass token). On Vercel, if the MCP cannot write to disk, it still returns the Kroki URL; you can also use POST /kroki_encode to get a diagram URL without file write.
Improve your Smithery listing: After publishing, open Settings → General on your server’s Smithery page. Set Display name, Description, Homepage (e.g. this repo or https://umlmcp.vercel.app), and Server icon to improve discoverability and the Server Metadata score. For better Configuration UX, publish with a config schema: [smithery.ai/new](https://smithery.ai/new) (web); add config from smithery-config-schema.json in the server's Settings (see vercel_smithery.md).
Configuration
MCP client setup
Example MCP server configs are in the config/ folder:
- config/cursor_config.json — snippet for Cursor
- config/claude_desktop_config.json — snippet for Claude Desktop
Copy the relevant block into your client’s config and replace /path/to/uml-mcp with the real path to this repo. See config/README.md for where each app stores its config.
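For reference, a Claude Desktop entry typically has the following shape (a minimal sketch — the server name "uml-mcp" and the uv invocation are illustrative; use the actual snippet from config/claude_desktop_config.json):

```json
{
  "mcpServers": {
    "uml-mcp": {
      "command": "uv",
      "args": ["run", "python", "/path/to/uml-mcp/server.py"]
    }
  }
}
```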
Environment variables
| Variable | Description | Default |
|---|---|---|
| MCP_OUTPUT_DIR | Directory for generated diagrams | ./output |
| UML_MCP_OUTPUT_DIR | Same as above (alternative name) | — |
| KROKI_SERVER | Kroki server URL | https://kroki.io |
| PLANTUML_SERVER | PlantUML server URL | http://plantuml-server:8080 |
| USE_LOCAL_KROKI | Use a local Kroki server (true/false) | false |
| USE_LOCAL_PLANTUML | Use a local PlantUML server (true/false) | false |
| LOG_LEVEL | Logging level | — |
| LIST_TOOLS | Set to true to list tools and exit | — |
Full options are documented in docs/configuration.md and docs/installation.md.
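As an illustration of how these variables combine, here is a minimal sketch of the resolution logic (variable names and defaults match the table above; the actual implementation lives in mcp_core/ and may differ):

```python
import os

# Sketch: resolve server configuration from the environment variables above.
# MCP_OUTPUT_DIR takes precedence over its alternative name UML_MCP_OUTPUT_DIR.
output_dir = (
    os.environ.get("MCP_OUTPUT_DIR")
    or os.environ.get("UML_MCP_OUTPUT_DIR")
    or "./output"
)
kroki_server = os.environ.get("KROKI_SERVER", "https://kroki.io")
plantuml_server = os.environ.get("PLANTUML_SERVER", "http://plantuml-server:8080")
use_local_kroki = os.environ.get("USE_LOCAL_KROKI", "false").lower() == "true"

print(output_dir, kroki_server, plantuml_server, use_local_kroki)
```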
Local development
Run PlantUML and/or Kroki locally (e.g. with Docker):
# PlantUML
docker run -d -p 8080:8080 plantuml/plantuml-server
# Kroki
docker run -d -p 8000:8000 yuzutech/kroki
Then:
export USE_LOCAL_PLANTUML=true
export PLANTUML_SERVER=http://localhost:8080
export USE_LOCAL_KROKI=true
export KROKI_SERVER=http://localhost:8000
python server.py
Architecture
- MCP server: server.py (entry point), mcp_core/ (server, config, tools, resources, prompts)
- Diagram backends: plantuml/, kroki/, mermaid/, D2/
- Tools: diagram generation tools are registered in mcp_core/tools/ and exposed via MCP
- Resources: templates and examples under the uml:// URI scheme
Diagram Generation Strategy
The system uses an intelligent fallback mechanism to ensure maximum reliability:
- Primary method (Kroki): all diagram generation requests first attempt Kroki.io, a unified API that supports 30+ diagram types.
- Automatic fallback: if Kroki is unavailable or fails (network issues, service downtime, etc.), the system automatically falls back to alternative rendering services:
  - PlantUML diagrams (Class, Sequence, Activity, etc.) → the configured PlantUML server (default: http://plantuml-server:8080)
  - Mermaid diagrams → Mermaid.ink
  - Other diagram types → an error with details from both attempts
This ensures your diagrams are generated even if the primary service is temporarily unavailable. Configure the fallback server using the PLANTUML_SERVER environment variable.
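The Kroki URLs used by the primary path follow Kroki's standard encoding: the diagram source is deflate-compressed, then URL-safe base64-encoded into the path. A minimal sketch (the helper name kroki_url is illustrative; the server's own encoder may differ in detail):

```python
import base64
import zlib

def kroki_url(diagram_type: str, output_format: str, source: str,
              server: str = "https://kroki.io") -> str:
    """Build a Kroki GET URL: deflate the source, then URL-safe base64-encode it."""
    compressed = zlib.compress(source.encode("utf-8"), 9)
    encoded = base64.urlsafe_b64encode(compressed).decode("ascii")
    return f"{server}/{diagram_type}/{output_format}/{encoded}"

url = kroki_url("plantuml", "svg", "@startuml\nAlice -> Bob: hello\n@enduml")
print(url)
```

Fetching that URL returns the rendered SVG; swapping the format segment (e.g. png) changes the output type where the backend supports it.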
MCP resources and tools
Resources (e.g. uml://types, uml://templates, uml://examples, uml://formats, uml://server-info, uml://workflow)
Provide diagram types, templates, examples, formats, server info, and the recommended workflow for complex diagrams.
Tools:
- generate_uml — Generate a diagram and optionally save it (params: diagram_type, code, output_dir, output_format, theme, scale). Omit output_dir to get URL and base64 only.
- generate_diagram_url — Return the diagram URL and base64 image without writing a file. Same diagram types; no output_dir.
See docs/api/tools.md for full parameters.
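For illustration, the argument payload an MCP client sends when calling generate_uml might look like the following (parameter names match the tool signatures above; the values are examples only):

```python
# Example argument payload for a generate_uml tool call.
# Parameter names match the tool's documented signature; values are samples.
params = {
    "diagram_type": "sequence",
    "code": "@startuml\nAlice -> Bob: hello\nBob --> Alice: hi\n@enduml",
    "output_format": "svg",   # svg, png, pdf, jpeg (where supported by type)
    "theme": "default",
    # "output_dir" omitted: the tool returns the URL and base64 image only
}
print(sorted(params))
```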
Better results for complex diagrams
The default prompts (uml_diagram, uml_diagram_with_thinking) instruct the model to plan first (decide diagram type, purpose, elements, relationships), then output the diagram code and call generate_uml with the chosen diagram_type and final code. The resource uml://workflow describes this flow.
Tests
See Testing for why integration tests are skipped by default and how to run them. Use the project’s venv so pytest-cov and options from pyproject.toml apply:
uv run pytest tests/ -v
# or: poetry run pytest tests/ -v
Do not run bare pytest (system Python); it may not have pytest-cov and will not see the project’s addopts.
Integration tests (MCP Client) use the real FastMCP package and an in-process client to test discovery and tools. They are skipped unless USE_REAL_FASTMCP=1 is set. Run them with:
USE_REAL_FASTMCP=1 uv run pytest tests/integration -v
These tests require a fastmcp version that provides fastmcp.client.Client (e.g. FastMCP 2.x / 3.x). Diagram generation is mocked so no network call to Kroki is made.
Lint and type check
uv run ruff check .
uv run ruff format --check .
# or: make lint
# Optional type checking (gradual adoption):
uv run ty check
# or: make typecheck
- If you see "No module named black" (or flake8/isort), run uv sync --all-groups so dev tools are installed.
- If you get "Permission denied" when running uv run black/flake8/isort (e.g. on WSL with the project on /mnt/c/), use the module form: uv run python -m black ..., uv run python -m flake8 ..., uv run python -m isort ..., or fix the execute bits: chmod +x .venv/bin/black .venv/bin/flake8 .venv/bin/isort.
Testing the CI pipeline locally
Option 1: Same steps as CI, no Docker
make ci
Runs lint (ruff) and tests with coverage, matching the test job in .github/workflows/ci.yml.
Option 2: act (GitHub Actions in Docker)
You can run the full workflow locally with act. Requires Docker. The main pipeline is in a single file (.github/workflows/ci.yml) so act push runs test and build jobs without reusable-workflow issues.
Install act (Windows): Chocolatey choco install act-cli -y, Scoop scoop install act, or download from act releases.
act -l
act push
Documentation
The documentation is built with MkDocs and the Material theme.
- Online: https://antoinebou12.github.io/uml-mcp/ (after enabling GitHub Pages)
- Local: run uv run mkdocs serve or make docs-serve from the project root, then open http://127.0.0.1:8000
Key docs (also in the static site):
- User Manual — Install, configure, and use (quick start, client setup, troubleshooting)
- Installation
- Configuration
- API / tools
- Cursor integration
- Claude Desktop integration
- Deploy to Vercel and Smithery
Contributing and community
- Contributing guide — setup, tests, and how to send pull requests
- Code of conduct
- Security policy — how to report vulnerabilities
License
MIT — see LICENSE.
Acknowledgements
