# MCP Mermaid

Generate Mermaid diagrams and charts dynamically with AI via MCP. You can also use:

- mcp-server-chart to generate charts, graphs, and maps.
- Infographic to generate infographics, such as Timeline, Comparison, List, Process, and so on.
## ✨ Features

- Fully supports all features and syntax of Mermaid.
- Supports configuration of `backgroundColor` and `theme`, enabling large AI models to output rich style configurations.
- Supports exporting to `base64`, `svg`, `mermaid`, `file`, and the remote-friendly `svg_url` and `png_url` formats, with Mermaid validation to help the model produce correct syntax and graphics over multiple rounds. Use `outputType: "file"` to automatically save PNG diagrams to disk for AI agents, or the URL modes to share diagrams through public mermaid.ink links.
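As a sketch, a diagram request's arguments might look like the following. Only `backgroundColor`, `theme`, and `outputType` are named in the feature list above; the `mermaid` field and the exact schema are assumptions, so check the server's tool definition for the real shape:

```json
{
  "mermaid": "graph TD;\n  A[Start] --> B[End];",
  "theme": "default",
  "backgroundColor": "#ffffff",
  "outputType": "png_url"
}
```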
## 🤖 Usage

To use with a desktop app such as Claude, VSCode, Cline, or Cherry Studio, add the MCP server config below. On macOS:
```json
{
  "mcpServers": {
    "mcp-mermaid": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-mermaid"
      ]
    }
  }
}
```
On Windows:
```json
{
  "mcpServers": {
    "mcp-mermaid": {
      "command": "cmd",
      "args": [
        "/c",
        "npx",
        "-y",
        "mcp-mermaid"
      ]
    }
  }
}
```
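If you prefer scripting the setup, the JSON above can be merged into an existing client config programmatically. A minimal Python sketch, assuming Claude Desktop's config path on macOS (other clients keep their config elsewhere, so adjust the path accordingly):

```python
import json
import os

# Claude Desktop's config path on macOS (an assumption; adjust per client/OS).
CONFIG_PATH = os.path.expanduser(
    "~/Library/Application Support/Claude/claude_desktop_config.json"
)

def add_mcp_mermaid(config: dict) -> dict:
    """Merge the mcp-mermaid server entry into a client config dict,
    preserving any servers already registered."""
    servers = config.setdefault("mcpServers", {})
    servers["mcp-mermaid"] = {"command": "npx", "args": ["-y", "mcp-mermaid"]}
    return config

# Load the existing config if present, merge, and print the result.
config = {}
if os.path.exists(CONFIG_PATH):
    with open(CONFIG_PATH) as f:
        config = json.load(f)

merged = add_mcp_mermaid(config)
print(json.dumps(merged, indent=2))
```

Writing the merged dict back to `CONFIG_PATH` is left out on purpose; review the output before overwriting a client's config file.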
You can also use it on Aliyun, ModelScope, glama.ai, smithery.ai, or other platforms via the HTTP or SSE protocols.
## 🚰 Run with SSE or Streamable transport

### Option 1: Global installation

Install the package globally:

```sh
npm install -g mcp-mermaid
```

Run the server with your preferred transport option:

```sh
# For SSE transport (default endpoint: /sse)
mcp-mermaid -t sse

# For Streamable transport (default endpoint: /mcp)
mcp-mermaid -t streamable
```
### Option 2: Local development

If you're working with the source code locally:

```sh
# Clone and set up
git clone https://github.com/hustcc/mcp-mermaid.git
cd mcp-mermaid
npm install
npm run build

# Run with npm scripts
npm run start:sse         # SSE transport on port 3033
npm run start:streamable  # Streamable transport on port 1122
```
### Access points

Then you can access the server at:

- SSE transport: `http://localhost:3033/sse`
- Streamable transport: `http://localhost:1122/mcp` (local) or `http://localhost:3033/mcp` (global)
## 🎮 CLI Options

You can also pass the following CLI options when running the MCP server. Show them by running the CLI with `-h`:

```text
MCP Mermaid CLI

Options:
  --transport, -t  Specify the transport protocol: "stdio", "sse", or "streamable" (default: "stdio")
  --port, -p       Specify the port for SSE or streamable transport (default: 3033)
  --endpoint, -e   Specify the endpoint for the transport:
                   - For SSE: default is "/sse"
                   - For streamable: default is "/mcp"
  --help, -h       Show this help message
```
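The per-transport endpoint defaults above can be sketched as a small helper. This is illustrative only: the function name `default_endpoint` is mine, not part of the CLI, and it simply mirrors the documented defaults:

```python
def default_endpoint(transport: str) -> str:
    """Return the default endpoint for a transport, mirroring the CLI
    defaults documented above (illustrative; not the server's code)."""
    defaults = {"sse": "/sse", "streamable": "/mcp"}
    if transport == "stdio":
        # stdio talks over standard input/output, so no HTTP endpoint exists.
        raise ValueError("stdio transport does not use an HTTP endpoint")
    return defaults[transport]

print(default_endpoint("sse"))         # -> /sse
print(default_endpoint("streamable"))  # -> /mcp
```

Passing `--endpoint` overrides whichever default applies to the chosen transport.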
## 🔨 Development

Install dependencies:

```sh
npm install
```

Build the server:

```sh
npm run build
```

Start the MCP server using the MCP Inspector (for debugging):

```sh
npm run start
```

Using different transport protocols:

```sh
# SSE transport (Server-Sent Events)
npm run start:sse

# Streamable HTTP transport
npm run start:streamable
```

Direct node commands:

```sh
# SSE transport on port 3033
node build/index.js --transport sse --port 3033

# Streamable HTTP transport on port 1122
node build/index.js --transport streamable --port 1122

# STDIO transport (for MCP client integration)
node build/index.js --transport stdio
```
## 📄 License

MIT@hustcc.