Generate Mermaid diagrams and charts dynamically with AI via MCP. You can also use mcp-server-chart to generate charts, graphs, and maps.
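For instance, a model might emit Mermaid source like the following for the server to validate and render (a minimal illustrative diagram, not taken from the project docs):

```mermaid
flowchart LR
    A[User prompt] --> B[LLM]
    B --> C[mcp-mermaid]
    C --> D[PNG / SVG output]
```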
Supports configuring Mermaid's backgroundColor and theme, enabling large AI models to output rich style configurations.
Exports to png, svg, and mermaid formats, with validation of the Mermaid source to facilitate the model's multi-round output of correct syntax and graphics.
To use it with a desktop app such as Claude, VSCode, Cline, Cherry Studio, and so on, add the MCP server config below. On macOS:
{
  "mcpServers": {
    "mcp-mermaid": {
      "command": "npx",
      "args": ["-y", "mcp-mermaid"]
    }
  }
}
On Windows:
{
  "mcpServers": {
    "mcp-mermaid": {
      "command": "cmd",
      "args": ["/c", "npx", "-y", "mcp-mermaid"]
    }
  }
}
You can also use it on aliyun, modelscope, glama.ai, smithery.ai, or other platforms over the HTTP or SSE protocol.
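For a remotely hosted or self-hosted instance, the client config points at a URL instead of a command. The sketch below uses the local SSE endpoint from this README; the exact field names (`type`, `url`) vary between MCP clients, so check your client's documentation:

```json
{
  "mcpServers": {
    "mcp-mermaid": {
      "type": "sse",
      "url": "http://localhost:3033/sse"
    }
  }
}
```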
Install the package globally.
npm install -g mcp-mermaid
Run the server with your preferred transport option:
# For SSE transport (default endpoint: /sse)
mcp-mermaid -t sse
# For Streamable transport (default endpoint: /mcp)
mcp-mermaid -t streamable
Then you can access the server at:
http://localhost:3033/sse
http://localhost:3033/mcp
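Once the server is up, an MCP client speaks JSON-RPC 2.0 to these endpoints. As a rough sketch, this is the shape of the `initialize` request a client would POST to the streamable endpoint (`http://localhost:3033/mcp`); the `protocolVersion` and `clientInfo` values here are illustrative assumptions, not taken from the mcp-mermaid docs:

```python
import json

def build_initialize_request(request_id: int = 1) -> str:
    """Build a minimal JSON-RPC 2.0 MCP initialize request body."""
    payload = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            # Illustrative protocol version; use the one your client targets.
            "protocolVersion": "2024-11-05",
            "capabilities": {},
            "clientInfo": {"name": "example-client", "version": "0.1.0"},
        },
    }
    return json.dumps(payload)

body = build_initialize_request()
```

In practice your MCP client library builds and sends this handshake for you; the sketch only shows what travels over the wire.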
You can also pass the following CLI options when running the MCP server. Show them by running the CLI with -h.
MCP Mermaid CLI
Options:
--transport, -t Specify the transport protocol: "stdio", "sse", or "streamable" (default: "stdio")
--port, -p Specify the port for SSE or streamable transport (default: 3033)
--endpoint, -e Specify the endpoint for the transport:
- For SSE: default is "/sse"
- For streamable: default is "/mcp"
--help, -h Show this help message
Install dependencies:
npm install
Build the server:
npm run build
Start the MCP server:
npm run start
MIT © hustcc.