An AI-powered Model Context Protocol (MCP) server that provides intelligent access to design systems knowledge. This server ingests design system documentation (PDFs, web content) and enables AI assistants to provide expert guidance on design systems, components, tokens, and best practices.
Live Demo: https://design-systems-mcp.southleft.com/
Workers Domain: https://design-systems-mcp.southleft-llc.workers.dev
MCP Endpoint: https://design-systems-mcp.southleft-llc.workers.dev/mcp
Health Check: https://design-systems-mcp.southleft-llc.workers.dev/health
Visit the live demo and ask questions about design systems, components, tokens, and best practices.
Clone and Install
git clone https://github.com/southleft/design-systems-mcp.git
cd design-systems-mcp
npm install
Configure Environment
cp env.example .dev.vars
# Edit .dev.vars and add your OpenAI API key
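For reference, .dev.vars is a dotenv-style file of KEY=VALUE lines; a minimal sketch (variable names assumed to match the secrets listed in the deployment section below):
OPENAI_API_KEY=sk-your-key-here
# Optional: override the default model (gpt-4o)
OPENAI_MODEL=gpt-4o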
Start Development Server
npm run dev
Server will be available at: http://localhost:8787
Test the AI Chat Interface
Open http://localhost:8787 in your browser
Ingest Content (if not already done)
# Add PDFs to local-content-library/
npm run ingest:pdf path/to/your-design-guide.pdf
# Or ingest web content
npm run ingest:url https://example.com/design-system
# Or bulk ingest from CSV file
npm run ingest:csv path/to/urls.csv
Update Content Loading in src/index.ts
// Add new content files using dynamic imports
const [handbookModule, buttonModule, newContentModule] = await Promise.all([
  import('../content/entries/8zWJWrDK_bTOv3_KFo30V-pdf-designsystemshandbook-pdf.json'),
  import('../content/entries/sample-button-guidelines.json'),
  import('../content/entries/your-new-content.json')
]);

const actualEntries = [
  handbookModule.default as ContentEntry,
  buttonModule.default as ContentEntry,
  newContentModule.default as ContentEntry
];
Test Locally
npm run dev
# Test your new content in the chat interface
The MCP server provides these tools for AI assistants:
search_design_knowledge - Search design systems content
search_chunks - Find specific information in content chunks
browse_by_category - Browse content by category (components, tokens, etc.)
get_all_tags - Get available content tags

To add new content:
Add ingested JSON files to content/entries/
Update src/index.ts to load the new content
Run npm run dev
Test at http://localhost:8787
Local Testing:
# Test MCP search directly
curl -X POST http://localhost:8787/mcp \
-H "Content-Type: application/json" \
-d '{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"search_chunks","arguments":{"query":"design tokens"}}}'
# Test AI integration
curl -X POST http://localhost:8787/ai-chat \
-H "Content-Type: application/json" \
-d '{"message":"What are design tokens?"}'
Production Testing:
# Test live MCP endpoint
curl -X POST https://design-systems-mcp.southleft-llc.workers.dev/mcp \
-H "Content-Type: application/json" \
-d '{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"search_chunks","arguments":{"query":"design tokens"}}}'
# Test live AI integration
curl -X POST https://design-systems-mcp.southleft-llc.workers.dev/ai-chat \
-H "Content-Type: application/json" \
-d '{"message":"What are design tokens?"}'
Login to Cloudflare
npx wrangler login
Set Environment Variables
npx wrangler secret put OPENAI_API_KEY
# Enter your OpenAI API key when prompted
# Optional: Set custom model (current default: gpt-4o)
npx wrangler secret put OPENAI_MODEL
# Enter model name (default: gpt-4o)
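These secrets are not read from process.env inside a Worker; they arrive on the env bindings object at request time. The snippet below is a generic Cloudflare Workers sketch (not the project's actual src/index.ts) showing how the values surface:
interface Env {
  OPENAI_API_KEY: string;
  OPENAI_MODEL?: string;
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    // Secrets set with `wrangler secret put` (and vars from wrangler.jsonc) land on `env`
    const model = env.OPENAI_MODEL ?? "gpt-4o";
    // ...pass env.OPENAI_API_KEY to the OpenAI client here
    return new Response(`Using model: ${model}`);
  }
};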
Deploy
npm run deploy
Access Your Deployed Server
Base URL: https://design-systems-mcp.<your-account>.workers.dev
MCP Endpoint: https://design-systems-mcp.<your-account>.workers.dev/mcp
To set up a custom domain like design-systems-mcp.southleft.com, add a custom domain to the design-systems-mcp Worker in your Cloudflare dashboard.
Environment variables:
OPENAI_API_KEY - Your OpenAI API key (required)
OPENAI_MODEL - Model to use (default: "gpt-4o")
AI_SYSTEM_PROMPT - Custom system prompt (optional)

Add to your MCP configuration file (~/.cursor/mcp.json for Cursor, or claude_desktop_config.json for Claude Desktop):
Option 1: Use Public Remote Server (Recommended for most users)
{
"mcpServers": {
"design-systems": {
"url": "https://design-systems-mcp.southleft-llc.workers.dev/mcp"
}
}
}
Option 2: Use Local Development Server (For contributors/customization)
{
"mcpServers": {
"design-systems": {
"url": "http://localhost:8787/mcp"
}
}
}
Important Notes:
For the local option, start the server with npm run dev first.
The public server at design-systems-mcp.southleft-llc.workers.dev/mcp requires no local setup.
Any application that supports MCP can connect to the live server:
Endpoint: https://design-systems-mcp.southleft-llc.workers.dev/mcp
Example API Call:
# Initialize connection
curl -X POST https://design-systems-mcp.southleft-llc.workers.dev/mcp \
-H "Content-Type: application/json" \
-d '{
"jsonrpc": "2.0",
"id": 1,
"method": "initialize",
"params": {
"protocolVersion": "2024-11-05",
"capabilities": {"roots": {"listChanged": true}},
"clientInfo": {"name": "test", "version": "1.0.0"}
}
}'
# Search design systems knowledge
curl -X POST https://design-systems-mcp.southleft-llc.workers.dev/mcp \
-H "Content-Type: application/json" \
-d '{
"jsonrpc": "2.0",
"id": 2,
"method": "tools/call",
"params": {
"name": "search_design_knowledge",
"arguments": {"query": "design tokens"}
}
}'
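If you prefer scripting the same checks instead of curl, here is a minimal TypeScript sketch (not part of the repo, assuming Node 18+ with global fetch) that issues the two JSON-RPC calls above:
const endpoint = "https://design-systems-mcp.southleft-llc.workers.dev/mcp";

async function rpc(method: string, params: unknown, id: number) {
  const res = await fetch(endpoint, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ jsonrpc: "2.0", id, method, params }),
  });
  return res.json();
}

async function main() {
  // Initialize the MCP session
  console.log(await rpc("initialize", {
    protocolVersion: "2024-11-05",
    capabilities: { roots: { listChanged: true } },
    clientInfo: { name: "test", version: "1.0.0" },
  }, 1));

  // Search the design systems knowledge base
  console.log(await rpc("tools/call", {
    name: "search_design_knowledge",
    arguments: { query: "design tokens" },
  }, 2));
}

main().catch(console.error);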
design-systems-mcp/
├── src/
│   ├── index.ts               # Main server with AI integration
│   ├── lib/
│   │   └── content-manager.ts # Content management and search
│   └── tools/                 # MCP tool definitions
├── content/
│   ├── entries/               # Ingested content (JSON)
│   └── raw/                   # Raw source files
├── scripts/
│   └── ingestion/             # Content ingestion scripts
├── types/
│   └── content.ts             # TypeScript definitions
├── local-content-library/     # Source PDFs and files
├── wrangler.jsonc             # Cloudflare Workers config
└── .dev.vars                  # Local environment variables
For bulk content ingestion, you can use CSV files containing multiple URLs:
# Generate a sample CSV template
npm run ingest:csv --sample
Your CSV file should include these columns (header row recommended):
| Column | Required | Description |
|---|---|---|
| url | Yes | The URL to fetch content from |
| title | No | Custom title for the content |
| category | No | Content category (general, components, tokens, patterns, guidelines, tools) |
| tags | No | Comma-separated tags |
| description | No | Description of the content |
| confidence | No | Confidence level (low, medium, high) |
| system | No | Design system name |
| author | No | Author or organization |
| version | No | Version information |
url,title,category,tags,description,confidence,system,author,version
https://material.io/components/buttons,Material Design Buttons,components,"button,interaction,material",Material Design button guidelines,high,Material Design,Google,3.0
https://polaris.shopify.com/components/button,Shopify Polaris Button,components,"button,shopify,polaris",Shopify's button component,high,Polaris,Shopify,
https://primer.style/components/button,GitHub Primer Button,components,"button,github,primer",GitHub's button guidelines,high,Primer,GitHub,
# Basic ingestion
npm run ingest:csv my-urls.csv
# With custom options
npm run ingest:csv my-urls.csv --max-concurrent 5 --timeout 60000
# Dry run (validate without fetching)
npm run ingest:csv my-urls.csv --dry-run
# See all options
npm run ingest:csv --help
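As a quick pre-flight before a large ingestion run, a small TypeScript sketch (hypothetical helper, not part of the repo) can confirm the CSV has the required url column; it uses a naive comma split, so rely on --dry-run for real validation:
import { readFileSync } from "node:fs";

const file = process.argv[2] ?? "my-urls.csv";
const [header = "", ...rows] = readFileSync(file, "utf8").trim().split("\n");
const columns = header.split(",").map((c) => c.trim().toLowerCase());

if (!columns.includes("url")) {
  throw new Error(`${file} is missing the required "url" column`);
}
console.log(`${rows.length} row(s) found with columns: ${columns.join(", ")}`);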
CSV ingestion options:
--max-concurrent <n> - Process N URLs simultaneously (default: 3)
--timeout <ms> - Request timeout in milliseconds (default: 30000)
--retry-attempts <n> - Number of retry attempts for failed URLs (default: 2)
--output-dir <dir> - Custom output directory (default: content/entries)
--delimiter <char> - CSV delimiter (default: ',')
--no-header - CSV file doesn't have a header row

Ingested content is automatically processed and saved to the output directory.
Available npm scripts:
npm run dev - Start local development server
npm run deploy - Deploy to Cloudflare Workers
npm run ingest:pdf <file> - Ingest PDF content
npm run ingest:url <url> - Ingest web content
npm run ingest:csv <file> - Bulk ingest from a CSV file containing URLs
npm run check:duplicates - Check for duplicate URLs in content entries

Check for Duplicate URLs:
npm run check:duplicates
This command scans all content entries, identifies any duplicate URLs, and reports them so you can keep the knowledge base clean.
Define tools in src/index.ts:
server.tool("your_tool_name", schema, async (params) => {
// Tool implementation
});
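As a concrete illustration, here is a hedged sketch of a new tool using a Zod parameter schema; the tool name and the searchEntries helper are hypothetical, so adapt them to the project's content manager:
import { z } from "zod";

server.tool(
  "list_components_by_system",
  { system: z.string().describe("Design system name, e.g. 'Material Design'") },
  async ({ system }) => {
    // `searchEntries` is an assumed lookup helper; swap in the real content manager call
    const matches = searchEntries({ category: "components", system });
    return {
      content: [{ type: "text", text: JSON.stringify(matches, null, 2) }]
    };
  }
);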
Add to OpenAI function definitions:
const MCP_TOOLS = [
  // ... existing tools
  {
    type: "function",
    function: {
      name: "your_tool_name",
      description: "Tool description",
      parameters: { /* JSON schema */ }
    }
  }
];
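To show how these definitions connect to the AI chat, here is a hypothetical wiring sketch (not the project's exact code) that passes them to an OpenAI chat completion so the model can request tool invocations:
import OpenAI from "openai";

async function askWithTools(apiKey: string, message: string) {
  const openai = new OpenAI({ apiKey });
  const completion = await openai.chat.completions.create({
    model: "gpt-4o",
    messages: [{ role: "user", content: message }],
    // MCP_TOOLS is the array defined above
    tools: MCP_TOOLS as OpenAI.Chat.Completions.ChatCompletionTool[],
  });
  // Tool calls requested by the model come back on the assistant message
  return completion.choices[0].message.tool_calls ?? [];
}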
Content not loading: confirm the JSON files exist in content/entries/ and are imported in src/index.ts.
Port issues: check that wrangler.jsonc has the correct dev port (8787); stop stray dev servers with pkill -f "wrangler dev".
Environment variables: set them locally in the .dev.vars file; set production secrets with npx wrangler secret put.
# View server logs
npx wrangler tail
# Local development logs
npm run dev
# Check console output for content loading status
This project is free and open source under the MIT License. You are welcome to use, modify, and distribute it.
This project compiles design system knowledge from many brilliant creators. All original content remains the intellectual property of their respective authors.
See SECURITY.md for detailed security information and best practices.
We welcome contributions! Whether you want to report a bug, improve the code, or expand the knowledge base, please open an issue or pull request. To contribute new design system content, follow the content ingestion workflow described above.
This project exists thanks to the generous sharing of knowledge from the design systems community.
See CREDITS.md for the complete list of contributors and sources.
This project was built from the Cloudflare remote MCP server template; see the Cloudflare Workers documentation for additional information. To scaffold the same template:
npm create cloudflare@latest -- my-mcp-server --template=cloudflare/ai/demos/remote-mcp-authless