Design Systems MCP Server
An AI-powered Model Context Protocol (MCP) server that provides intelligent access to design systems knowledge. This server ingests design system documentation (PDFs, web content) and enables AI assistants to provide expert guidance on design systems, components, tokens, and best practices.
Live Demo: https://design-systems-mcp.southleft.com/
Features
- AI-Powered Chat Interface - Natural language queries with OpenAI integration
- Content Ingestion - Supports PDF parsing and web content extraction
- Vector Search - Semantic understanding using Supabase + OpenAI embeddings
- Hybrid Search - Combines semantic vectors with keyword matching
- Rich Formatting - Markdown rendering with syntax highlighting
- Cloudflare Workers - Scalable serverless deployment
- Local Testing - Full local development environment
- Public Access - Live MCP server available for external integrations
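The hybrid search feature combines embedding similarity with keyword matching. As a rough illustration of that idea (the function names and the 0.7/0.3 weighting are hypothetical, not the server's actual implementation), a combined score might look like:

```typescript
// Hypothetical sketch of hybrid scoring: blends embedding cosine
// similarity with a simple keyword-overlap score. Names and the
// 0.7/0.3 weighting are illustrative, not this project's real code.

function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB) || 1);
}

function keywordScore(query: string, text: string): number {
  const terms = query.toLowerCase().split(/\s+/).filter(Boolean);
  if (terms.length === 0) return 0;
  const haystack = text.toLowerCase();
  const hits = terms.filter((t) => haystack.includes(t)).length;
  return hits / terms.length; // fraction of query terms present in the chunk
}

function hybridScore(
  queryEmbedding: number[],
  chunkEmbedding: number[],
  query: string,
  chunkText: string,
  semanticWeight = 0.7
): number {
  const semantic = cosineSimilarity(queryEmbedding, chunkEmbedding);
  const keyword = keywordScore(query, chunkText);
  return semanticWeight * semantic + (1 - semanticWeight) * keyword;
}
```

The blend lets exact-term matches ("BackstopJS") surface even when an embedding model ranks them weakly, and vice versa.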
Live MCP Server
Public Endpoints
Workers Domain: https://design-systems-mcp.southleft-llc.workers.dev
- AI Chat Interface: https://design-systems-mcp.southleft.com/
- MCP Endpoint: https://design-systems-mcp.southleft-llc.workers.dev/mcp
- Health Check: https://design-systems-mcp.southleft-llc.workers.dev/health
Try It Now
Visit the live demo and ask questions like:
- "What are design tokens and how should I use them?"
- "How do I create accessible button components?"
- "What are the best practices for organizing a design system?"
- "How do components work in design systems?"
Quick Start
Prerequisites
- Node.js (v20.17.0+ or v22.9.0+)
- OpenAI API key (for embeddings and chat)
- Supabase account (for vector search)
- PostgreSQL with pgvector extension
Local Development Setup
1. Clone and Install
   git clone https://github.com/southleft/design-systems-mcp.git
   cd design-systems-mcp
   npm install
2. Configure Environment
   cp .env.example .env
   # Edit .env and add your credentials:
   # - Supabase URL and keys
   # - OpenAI API key
   # - Enable vector search
3. Start Development Server
   npm run dev
   Server will be available at: http://localhost:8787
4. Test the AI Chat Interface
   - Open http://localhost:8787 in your browser
   - Try example queries like:
     - "What are design tokens?"
     - "Where in the design systems handbook is Alicia Sedlock mentioned?"
     - "What does the Design Systems Handbook say about Wraith, Gemini, and BackstopJS?"
     - "How do I implement a design system?"
Adding Content with Vector Search
1. Setup Database (first time only)
   # Create Supabase tables with pgvector
   npm run setup:database
2. Ingest Content with Embeddings
   # Add PDFs to local-content-library/
   npm run ingest:pdf path/to/your-design-guide.pdf
   # Or ingest web content
   npm run ingest:url https://example.com/design-system
   # Or crawl entire website
   npm run crawl:website https://example.com --depth 2
   # Or bulk ingest from CSV file
   npm run ingest:csv path/to/urls.csv
   # Generate embeddings for all content
   npm run ingest:vectors
3. Update Content Loading in src/index.ts
   // Add new content files using dynamic imports
   const [handbookModule, buttonModule, newContentModule] = await Promise.all([
     import('../content/entries/8zWJWrDK_bTOv3_KFo30V-pdf-designsystemshandbook-pdf.json'),
     import('../content/entries/sample-button-guidelines.json'),
     import('../content/entries/your-new-content.json')
   ]);
   const actualEntries = [
     handbookModule.default as ContentEntry,
     buttonModule.default as ContentEntry,
     newContentModule.default as ContentEntry
   ];
4. Test Locally
   npm run dev
   # Test your new content in the chat interface
Available Tools
The MCP server provides these tools for AI assistants:
- search_design_knowledge - Search design systems content
- search_chunks - Find specific information in content chunks
- browse_by_category - Browse content by category (components, tokens, etc.)
- get_all_tags - Get available content tags
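Clients invoke these tools with a JSON-RPC 2.0 `tools/call` request, as the curl examples later in this README show. A small helper for building such a request body might look like this (the helper itself is illustrative; the endpoint and tool names come from this README):

```typescript
// Illustrative helper that builds a JSON-RPC 2.0 body for an MCP
// tools/call request, matching the curl examples in this README.

interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
}

function buildToolCall(
  id: number,
  name: string,
  args: Record<string, unknown>
): ToolCallRequest {
  return {
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name, arguments: args },
  };
}

// Usage against the live endpoint (not executed here):
// await fetch("https://design-systems-mcp.southleft-llc.workers.dev/mcp", {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(buildToolCall(1, "search_chunks", { query: "design tokens" })),
// });
```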
Local Testing Workflow
Testing New Content
1. Add content files to content/entries/
2. Update src/index.ts to load new content
3. Restart dev server: npm run dev
4. Test queries in chat interface at http://localhost:8787
5. Verify AI responses are accurate and complete
Testing MCP Tools Directly
Local Testing:
# Test MCP search directly
curl -X POST http://localhost:8787/mcp \
-H "Content-Type: application/json" \
-d '{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"search_chunks","arguments":{"query":"design tokens"}}}'
# Test AI integration
curl -X POST http://localhost:8787/ai-chat \
-H "Content-Type: application/json" \
-d '{"message":"What are design tokens?"}'
Production Testing:
# Test live MCP endpoint
curl -X POST https://design-systems-mcp.southleft-llc.workers.dev/mcp \
-H "Content-Type: application/json" \
-d '{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"search_chunks","arguments":{"query":"design tokens"}}}'
# Test live AI integration
curl -X POST https://design-systems-mcp.southleft-llc.workers.dev/ai-chat \
-H "Content-Type: application/json" \
-d '{"message":"What are design tokens?"}'
Deployment
Deploy to Cloudflare Workers
1. Login to Cloudflare
   npx wrangler login
2. Set Environment Variables
   npx wrangler secret put OPENAI_API_KEY
   # Enter your OpenAI API key when prompted
   # Optional: Set custom model (current default: gpt-4o)
   npx wrangler secret put OPENAI_MODEL
   # Enter model name (default: gpt-4o)
3. Deploy
   npm run deploy
4. Access Your Deployed Server
   Your server will be available at: design-systems-mcp.<your-account>.workers.dev
   - Chat interface: https://design-systems-mcp.<your-account>.workers.dev
   - MCP endpoint: https://design-systems-mcp.<your-account>.workers.dev/mcp
Custom Domain Setup
To set up a custom domain like design-systems-mcp.southleft.com:
- Deploy your worker (see steps above)
- In Cloudflare Dashboard:
  - Go to Workers & Pages → Custom Domains
  - Add your custom domain
  - Point it to your deployed worker: design-systems-mcp
- Configure DNS in your domain settings
- Test the endpoints once propagated
Environment Variables
Environment variables:
- OPENAI_API_KEY - Your OpenAI API key (required)
- OPENAI_MODEL - Model to use (default: "gpt-4o")
- AI_SYSTEM_PROMPT - Custom system prompt (optional)
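In a Cloudflare Worker these variables arrive on the `env` binding of the fetch handler. A typed sketch based on the list above (the `Env` interface name and `resolveModel` helper are illustrative, not necessarily the project's actual code):

```typescript
// Sketch of typed environment bindings for the Worker, based on the
// variables listed above. Interface and helper names are illustrative.

interface Env {
  OPENAI_API_KEY: string;
  OPENAI_MODEL?: string;     // optional; falls back to gpt-4o
  AI_SYSTEM_PROMPT?: string; // optional custom system prompt
}

// Resolve the chat model, mirroring the documented default.
function resolveModel(env: Pick<Env, "OPENAI_MODEL">): string {
  return env.OPENAI_MODEL && env.OPENAI_MODEL.trim() !== ""
    ? env.OPENAI_MODEL
    : "gpt-4o";
}
```

Locally the values come from `.dev.vars`; in production they are set with `npx wrangler secret put`, as described in the Deployment and Troubleshooting sections.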
Connect to MCP Clients
Claude Desktop / Cursor
Add to your MCP configuration file (~/.cursor/mcp.json for Cursor, or claude_desktop_config.json for Claude):
Option 1: Use Public Remote Server (Recommended for most users)
{
"mcpServers": {
"design-systems": {
"url": "https://design-systems-mcp.southleft-llc.workers.dev/mcp"
}
}
}
Option 2: Use Local Development Server (For contributors/customization)
{
"mcpServers": {
"design-systems": {
"url": "http://localhost:8787/mcp"
}
}
}
Important Notes:
- Both local and remote servers are fully functional
- Remote server: Always available, no setup required
- Local server: Requires running npm run dev first
- After updating configuration, restart your MCP client (Cursor/Claude)
Cloudflare AI Playground
- Go to https://playground.ai.cloudflare.com/
- Enter your MCP server URL: design-systems-mcp.southleft-llc.workers.dev/mcp
- Start using design systems tools in the playground!
External Applications
Any application that supports MCP can connect to the live server:
Endpoint: https://design-systems-mcp.southleft-llc.workers.dev/mcp
Example API Call:
# Initialize connection
curl -X POST https://design-systems-mcp.southleft-llc.workers.dev/mcp \
-H "Content-Type: application/json" \
-d '{
"jsonrpc": "2.0",
"id": 1,
"method": "initialize",
"params": {
"protocolVersion": "2024-11-05",
"capabilities": {"roots": {"listChanged": true}},
"clientInfo": {"name": "test", "version": "1.0.0"}
}
}'
# Search design systems knowledge
curl -X POST https://design-systems-mcp.southleft-llc.workers.dev/mcp \
-H "Content-Type: application/json" \
-d '{
"jsonrpc": "2.0",
"id": 2,
"method": "tools/call",
"params": {
"name": "search_design_knowledge",
"arguments": {"query": "design tokens"}
}
}'
Project Structure
design-systems-mcp/
├── src/
│   ├── index.ts               # Main server with AI integration
│   ├── lib/
│   │   └── content-manager.ts # Content management and search
│   └── tools/                 # MCP tool definitions
├── content/
│   ├── entries/               # Ingested content (JSON)
│   └── raw/                   # Raw source files
├── scripts/
│   └── ingestion/             # Content ingestion scripts
├── types/
│   └── content.ts             # TypeScript definitions
├── local-content-library/     # Source PDFs and files
├── wrangler.jsonc             # Cloudflare Workers config
└── .dev.vars                  # Local environment variables
Content Management
Supported Content Types
- PDFs - Design system handbooks, guidelines
- Web Content - Design system documentation sites
- CSV URLs - Bulk ingestion from CSV files containing multiple URLs
- Website Crawling - Recursive crawling of entire websites
- JSON - Pre-processed design system data
CSV Bulk Ingestion
For bulk content ingestion, you can use CSV files containing multiple URLs:
1. Create a CSV File
# Generate a sample CSV template
npm run ingest:csv --sample
2. CSV Format
Your CSV file should include these columns (header row recommended):
| Column | Required | Description |
|---|---|---|
| url | Yes | The URL to fetch content from |
| title | No | Custom title for the content |
| category | No | Content category (general, components, tokens, patterns, guidelines, tools) |
| tags | No | Comma-separated tags |
| description | No | Description of the content |
| confidence | No | Confidence level (low, medium, high) |
| system | No | Design system name |
| author | No | Author or organization |
| version | No | Version information |
3. Example CSV
url,title,category,tags,description,confidence,system,author,version
https://material.io/components/buttons,Material Design Buttons,components,"button,interaction,material",Material Design button guidelines,high,Material Design,Google,3.0
https://polaris.shopify.com/components/button,Shopify Polaris Button,components,"button,shopify,polaris",Shopify's button component,high,Polaris,Shopify,
https://primer.style/components/button,GitHub Primer Button,components,"button,github,primer",GitHub's button guidelines,high,Primer,GitHub,
4. Ingest Content
# Basic ingestion
npm run ingest:csv my-urls.csv
# With custom options
npm run ingest:csv my-urls.csv --max-concurrent 5 --timeout 60000
# Dry run (validate without fetching)
npm run ingest:csv my-urls.csv --dry-run
# See all options
npm run ingest:csv --help
5. Advanced Options
- --max-concurrent <n> - Process N URLs simultaneously (default: 3)
- --timeout <ms> - Request timeout in milliseconds (default: 30000)
- --retry-attempts <n> - Number of retry attempts for failed URLs (default: 2)
- --output-dir <dir> - Custom output directory (default: content/entries)
- --delimiter <char> - CSV delimiter (default: ',')
- --no-header - CSV file doesn't have a header row
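Note that quoted fields like the tags column ("button,interaction,material") need quote-aware splitting; a naive split on commas would break them apart. A minimal quote-aware line parser (illustrative only; the project's actual ingestion script may differ) looks like:

```typescript
// Minimal quote-aware CSV line splitter, illustrating why the tags
// column can safely contain commas. Handles doubled quotes ("") as
// an escaped quote; not the project's actual parser.

function parseCsvLine(line: string, delimiter = ","): string[] {
  const fields: string[] = [];
  let current = "";
  let inQuotes = false;
  for (let i = 0; i < line.length; i++) {
    const ch = line[i];
    if (inQuotes) {
      if (ch === '"') {
        if (line[i + 1] === '"') { current += '"'; i++; } // escaped quote
        else inQuotes = false;                            // closing quote
      } else {
        current += ch;
      }
    } else if (ch === '"') {
      inQuotes = true;
    } else if (ch === delimiter) {
      fields.push(current);
      current = "";
    } else {
      current += ch;
    }
  }
  fields.push(current);
  return fields;
}
```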
Website Crawling
For comprehensive ingestion of entire websites, use the website crawler:
Basic Usage
# Crawl a website starting from a URL
npm run crawl:website -- https://material.io/design
# With custom depth and page limit
npm run crawl:website -- https://polaris.shopify.com --max-depth 5 --max-pages 200
# Crawl only specific sections
npm run crawl:website -- https://primer.style --include "/components/" --include "/foundations/"
# Resume a previous crawl
npm run crawl:website -- https://material.io/design --resume
Crawler Options
- --max-depth <n> - Maximum crawl depth (default: 3)
- --max-pages <n> - Maximum number of pages to crawl (default: 100)
- --delay <ms> - Delay between requests in milliseconds (default: 1000)
- --follow-external - Follow links to external domains
- --include <pattern> - Include URLs matching regex pattern (can be used multiple times)
- --exclude <pattern> - Exclude URLs matching regex pattern (can be used multiple times)
- --no-robots - Ignore robots.txt
- --resume - Resume a previous crawl from the same URL
- --clear - Clear previous crawl progress before starting
- --report - Generate a crawl report after completion
Advanced Examples
# Exclude certain paths
npm run crawl:website -- https://ant.design --exclude "/changelog" --exclude "/blog"
# Fast crawl with no delay (be careful!)
npm run crawl:website -- https://chakra-ui.com --delay 0 --max-pages 50
# Deep crawl with high limits
npm run crawl:website -- https://design-system.com --max-depth 10 --max-pages 1000 --delay 500
# Generate a report
npm run crawl:website -- https://carbon.ibm.com --report
Key Features
- Automatic Progress Saving: The crawler saves progress every 10 pages and can resume if interrupted
- Respects robots.txt: By default, the crawler respects robots.txt directives
- Smart Link Extraction: Extracts links from HTML content and follows them recursively
- Duplicate Detection: Automatically skips already-visited pages
- Error Handling: Continues crawling even if some pages fail
- Crawl Reports: Generate detailed reports of crawled content
Content Processing
Content is automatically:
- Chunked for optimal search performance
- Tagged and categorized
- Indexed for semantic search
- Made available to AI for intelligent responses
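Chunking for search typically splits long text into overlapping windows so that a match is not lost at a chunk boundary. A minimal fixed-size version of the idea (the sizes and overlap here are illustrative, not the project's actual parameters):

```typescript
// Illustrative fixed-size chunker with overlap. Real pipelines often
// split on sentence or heading boundaries instead; sizes are made up.

function chunkText(text: string, size = 100, overlap = 20): string[] {
  if (size <= overlap) throw new Error("size must exceed overlap");
  const chunks: string[] = [];
  let start = 0;
  while (start < text.length) {
    chunks.push(text.slice(start, start + size));
    if (start + size >= text.length) break; // last window reached the end
    start += size - overlap;                // step forward, keeping overlap
  }
  return chunks;
}
```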
Development
Available Scripts
- npm run dev - Start local development server
- npm run deploy - Deploy to Cloudflare Workers
- npm run ingest:pdf <file> - Ingest PDF content
- npm run ingest:url <url> - Ingest web content
- npm run ingest:csv <file> - Bulk ingest from CSV file containing URLs
- npm run crawl:website <url> - Crawl and ingest entire websites
- npm run check:duplicates - Check for duplicate URLs in content entries
Content Quality Assurance
Check for Duplicate URLs:
npm run check:duplicates
This command scans all content entries and identifies any duplicate URLs to maintain content quality. Run this:
- Before deploying new content
- After ingesting new articles
- Periodically to ensure data integrity
The checker will show:
- Total entries scanned
- Number of unique URLs found
- Any duplicates with filenames and titles
- Suggested cleanup commands
Adding New MCP Tools
1. Define tools in src/index.ts:
   server.tool("your_tool_name", schema, async (params) => {
     // Tool implementation
   });
2. Add to OpenAI function definitions:
   const MCP_TOOLS = [
     // ... existing tools
     {
       type: "function",
       function: {
         name: "your_tool_name",
         description: "Tool description",
         parameters: { /* JSON schema */ }
       }
     }
   ];
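Per the MCP specification, a tool handler returns its result as a content array of typed parts. A hypothetical result formatter for a search-style tool might therefore look like this (the function and its empty-result message are illustrative, not this project's actual code):

```typescript
// Hypothetical formatter producing an MCP-style tool result: a
// content array of text parts. The empty-result message is made up.

interface ToolResult {
  content: Array<{ type: "text"; text: string }>;
}

function formatToolResult(matches: string[]): ToolResult {
  return {
    content: [
      {
        type: "text",
        text: matches.length > 0
          ? matches.join("\n\n")
          : "No matching content found.",
      },
    ],
  };
}
```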
Troubleshooting
Common Issues
Content not loading:
- Check that JSON files exist in content/entries/
- Verify dynamic import paths in src/index.ts
- Check server logs for loading errors
- Ensure content files are valid JSON format

Port issues:
- Ensure wrangler.jsonc has correct dev port (8787)
- Kill existing processes: pkill -f "wrangler dev"

Environment variables:
- Local: Use .dev.vars file
- Production: Set via npx wrangler secret put
Logs and Debugging
# View server logs
npx wrangler tail
# Local development logs
npm run dev
# Check console output for content loading status
License & Usage
This project is free and open source under the MIT License. You are welcome to:
- Use it for personal and commercial projects
- Modify and distribute it
- Build upon it for your own projects
- Share it with your team and community
Content Attribution
This project compiles design system knowledge from many brilliant creators. All original content remains the intellectual property of their respective authors.
- See CREDITS.md for complete attribution
- Always link back to original sources when sharing insights
- Support the original creators by visiting their websites and platforms
Security & Privacy
- No sensitive data is stored - Only public design system knowledge
- Environment variables are secure - API keys use Cloudflare secrets
- Open source and auditable - All code is publicly available
- Privacy-focused - No user data collection beyond basic usage analytics
See SECURITY.md for detailed security information and best practices.
Contributing
We welcome contributions! Whether you want to:
- Report bugs or issues
- Suggest new features or improvements
- Add more design system content
- Improve the codebase
- Enhance documentation
Please:
- Check existing issues first
- Open a new issue to discuss your idea
- Submit a pull request with your changes
- Follow our security guidelines
Adding Content
To contribute new design system content:
- Ensure you have permission to share the content
- Follow the ingestion process documented above
- Add proper attribution in CREDITS.md
- Submit a pull request with the new content
Acknowledgments
This project exists thanks to the generous sharing of knowledge from the design systems community. Special thanks to:
- Brad Frost for the foundational Atomic Design methodology
- The Design System Guide team for comprehensive practical resources
- Figma for excellent official documentation
- All the design teams who openly share their experiences and methodologies
- The entire design systems community for fostering knowledge sharing
See CREDITS.md for the complete list of contributors and sources.
Support & Community
- Issues: GitHub Issues
- Security: Report security issues privately to the maintainers
- Website: Live Demo
Legacy Cloudflare Template Information
This project was built from the Cloudflare remote MCP server template. For additional Cloudflare Workers information:
Command Line Template
npm create cloudflare@latest -- my-mcp-server --template=cloudflare/ai/demos/remote-mcp-authless