MCP Mermaid

Generate Mermaid diagrams and charts dynamically with AI via MCP. You can also use mcp-server-chart to generate charts, graphs, and maps.
✨ Features
- Fully supports all Mermaid features and syntax.
- Supports configuration of backgroundColor and theme, enabling large AI models to output rich style configurations.
- Supports exporting to base64, svg, mermaid, and file formats, with Mermaid syntax validation to facilitate the model's multi-round output of correct syntax and graphics. Use outputType: "file" to automatically save PNG diagrams to disk for AI agents; see the sketch after this list.
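As an illustration, a single tool invocation might combine these options as follows. This is a minimal sketch: the tool name generate_mermaid_diagram and the exact parameter names are assumptions here, so check the server's tool listing for the authoritative schema.

{
  "name": "generate_mermaid_diagram",
  "arguments": {
    "mermaid": "graph TD; A[Start] --> B[End];",
    "theme": "forest",
    "backgroundColor": "#f5f5f5",
    "outputType": "file"
  }
}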
🤖 Usage
To use it with a desktop app such as Claude Desktop, VSCode, Cline, or Cherry Studio, add the MCP server configuration below. On macOS:
{
"mcpServers": {
"mcp-mermaid": {
"command": "npx",
"args": [
"-y",
"mcp-mermaid"
]
}
}
}
On Windows:
{
"mcpServers": {
"mcp-mermaid": {
"command": "cmd",
"args": [
"/c",
"npx",
"-y",
"mcp-mermaid"
]
}
}
}
You can also run it on aliyun, modelscope, glama.ai, smithery.ai, or other platforms over the HTTP (Streamable) or SSE protocol.
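For a remote or self-hosted instance, many MCP clients accept a URL-based server entry instead of a command. A minimal sketch, assuming your client supports a url field (the exact key name varies by client) and the server is reachable on the default SSE endpoint:

{
  "mcpServers": {
    "mcp-mermaid": {
      "url": "http://localhost:3033/sse"
    }
  }
}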
🚰 Run with SSE or Streamable transport
Option 1: Global Installation
Install the package globally:
npm install -g mcp-mermaid
Run the server with your preferred transport option:
# For SSE transport (default endpoint: /sse)
mcp-mermaid -t sse
# For Streamable transport with custom endpoint
mcp-mermaid -t streamable
Option 2: Local Development
If you're working with the source code locally:
# Clone and setup
git clone https://github.com/hustcc/mcp-mermaid.git
cd mcp-mermaid
npm install
npm run build
# Run with npm scripts
npm run start:sse # SSE transport on port 3033
npm run start:streamable # Streamable transport on port 1122
Access Points
You can then access the server at:
- SSE transport: http://localhost:3033/sse
- Streamable transport: http://localhost:1122/mcp (local npm scripts) or http://localhost:3033/mcp (global install)
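To quickly verify an endpoint is up, you can hold open a streaming connection with curl (shown here against the default SSE endpoint; adjust the host, port, and path to match your setup):

# -N disables buffering so server-sent events print as they arrive
curl -N http://localhost:3033/sse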
🎮 CLI Options
You can also pass the following CLI options when running the MCP server. List them by running the CLI with -h:
MCP Mermaid CLI
Options:
--transport, -t Specify the transport protocol: "stdio", "sse", or "streamable" (default: "stdio")
--port, -p Specify the port for SSE or streamable transport (default: 3033)
--endpoint, -e Specify the endpoint for the transport:
- For SSE: default is "/sse"
- For streamable: default is "/mcp"
--help, -h Show this help message
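For example, these flags can be combined to serve the Streamable transport on a custom port and endpoint (the port and endpoint values below are arbitrary illustrations):

# Streamable transport on port 8080, served at /diagrams instead of the default /mcp
mcp-mermaid --transport streamable --port 8080 --endpoint /diagrams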
🔨 Development
Install dependencies:
npm install
Build the server:
npm run build
Start the MCP server:
Using MCP Inspector (for debugging):
npm run start
Using different transport protocols:
# SSE transport (Server-Sent Events)
npm run start:sse
# Streamable HTTP transport
npm run start:streamable
Direct node commands:
# SSE transport on port 3033
node build/index.js --transport sse --port 3033
# Streamable HTTP transport on port 1122
node build/index.js --transport streamable --port 1122
# STDIO transport (for MCP client integration)
node build/index.js --transport stdio
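For programmatic testing during development, you can drive the server from a small script using the official MCP TypeScript SDK. A minimal sketch, assuming @modelcontextprotocol/sdk is installed and that the diagram tool is named generate_mermaid_diagram (verify the actual name via listTools()):

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn mcp-mermaid over STDIO, the same way a desktop client would.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "mcp-mermaid"],
});

const client = new Client({ name: "mermaid-smoke-test", version: "0.0.1" });
await client.connect(transport);

// Discover the tools the server actually exposes.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

// Assumed tool name and arguments; adjust to the listed schema.
const result = await client.callTool({
  name: "generate_mermaid_diagram",
  arguments: {
    mermaid: "graph TD; A[Write] --> B[Render];",
    outputType: "svg",
  },
});
console.log(result);

await client.close();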
📄 License
MIT © hustcc.