MCP Mermaid

Generate Mermaid diagrams and charts dynamically with AI via MCP. You can also use mcp-server-chart to generate charts, graphs, and maps.
✨ Features
- Fully support all features and syntax of Mermaid.
- Support configuration of `backgroundColor` and `theme`, enabling large AI models to output rich style configurations.
- Support exporting to `base64`, `svg`, `mermaid`, and `file` formats, with validation for Mermaid to facilitate the model's multi-round output of correct syntax and graphics. Use `outputType: "file"` to automatically save PNG diagrams to disk for AI agents (see the example below).
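For illustration, a request to the server's diagram tool might carry a payload like the one below. The theme, backgroundColor, and outputType parameters correspond to the features listed above; the mermaid field name and the overall shape are assumptions, so check the tool schema the server actually advertises:
{
  "mermaid": "flowchart TD\n  A[AI agent] --> B[mcp-mermaid]\n  B --> C[PNG saved to disk]",
  "theme": "default",
  "backgroundColor": "white",
  "outputType": "file"
}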
🤖 Usage
To use it with a desktop app such as Claude Desktop, VSCode, Cline, or Cherry Studio, add the MCP server config below. On macOS:
{
"mcpServers": {
"mcp-mermaid": {
"command": "npx",
"args": [
"-y",
"mcp-mermaid"
]
}
}
}
On Windows:
{
"mcpServers": {
"mcp-mermaid": {
"command": "cmd",
"args": [
"/c",
"npx",
"-y",
"mcp-mermaid"
]
}
}
}
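If your client fails to launch the server, a quick sanity check is to run the same command from a terminal (assuming npx is on your PATH) and confirm the help text prints:
npx -y mcp-mermaid -h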
You can also use it on Aliyun, ModelScope, glama.ai, smithery.ai, or other platforms via the HTTP or SSE protocol.
🚰 Run with SSE or Streamable transport
Option 1: Global Installation
Install the package globally:
npm install -g mcp-mermaid
Run the server with your preferred transport option:
# For SSE transport (default endpoint: /sse)
mcp-mermaid -t sse
# For Streamable HTTP transport (default endpoint: /mcp)
mcp-mermaid -t streamable
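To confirm the server is up (assuming the default port 3033), you can open the SSE endpoint with curl; the connection stays open and streams events, so stop it with Ctrl+C:
# Quick sanity check of the SSE endpoint
curl -N http://localhost:3033/sse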
Option 2: Local Development
If you're working with the source code locally:
# Clone and setup
git clone https://github.com/hustcc/mcp-mermaid.git
cd mcp-mermaid
npm install
npm run build
# Run with npm scripts
npm run start:sse # SSE transport on port 3033
npm run start:streamable # Streamable transport on port 1122
Access Points
Then you can access the server at:
- SSE transport: http://localhost:3033/sse
- Streamable transport: http://localhost:1122/mcp (local) or http://localhost:3033/mcp (global)
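MCP clients that support remote servers can be pointed at these endpoints directly. The exact configuration schema varies by client; a minimal sketch, assuming a client that accepts a url field for a streamable HTTP server:
{
  "mcpServers": {
    "mcp-mermaid": {
      "url": "http://localhost:3033/mcp"
    }
  }
}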
🎮 CLI Options
You can also use the following CLI options when running the MCP server. Run the CLI with -h to list them.
MCP Mermaid CLI
Options:
--transport, -t Specify the transport protocol: "stdio", "sse", or "streamable" (default: "stdio")
--port, -p Specify the port for SSE or streamable transport (default: 3033)
--endpoint, -e Specify the endpoint for the transport:
- For SSE: default is "/sse"
- For streamable: default is "/mcp"
--help, -h Show this help message
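For example, combining these options (the flags are documented above; the port and endpoint values here are arbitrary):
# SSE transport on a custom port and endpoint
mcp-mermaid -t sse -p 8080 -e /sse
# Streamable HTTP transport on the default endpoint
mcp-mermaid -t streamable -e /mcp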
🔨 Development
Install dependencies:
npm install
Build the server:
npm run build
Start the MCP server:
Using MCP Inspector (for debugging):
npm run start
Using different transport protocols:
# SSE transport (Server-Sent Events)
npm run start:sse
# Streamable HTTP transport
npm run start:streamable
Direct node commands:
# SSE transport on port 3033
node build/index.js --transport sse --port 3033
# Streamable HTTP transport on port 1122
node build/index.js --transport streamable --port 1122
# STDIO transport (for MCP client integration)
node build/index.js --transport stdio
📄 License
MIT © hustcc.