An authentication-free, remote MCP server deployable on Cloudflare Workers. Customize tools directly in the source code and deploy via Cloudflare or locally.
This example allows you to deploy a remote MCP server that doesn't require authentication on Cloudflare Workers.
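If the project follows the template's standard Wrangler setup (an assumption; check the deploy script in package.json), you can deploy from the project directory with:
npx wrangler deploy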
This will deploy your MCP server to a URL like: remote-mcp-server-authless.<your-account>.workers.dev/sse
Alternatively, you can use the command line below to create the remote MCP server on your local machine:
npm create cloudflare@latest -- my-mcp-server --template=cloudflare/ai/demos/remote-mcp-authless
To add your own tools to the MCP server, define each tool inside the init() method of src/index.ts using this.server.tool(...).
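As a rough sketch (the class name, imports, and example tool here are illustrative assumptions, not the template's exact code), a tool registration in src/index.ts might look like this:

import { McpAgent } from "agents/mcp";
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod";

export class MyMCP extends McpAgent {
  server = new McpServer({ name: "Calculator", version: "1.0.0" });

  async init() {
    // Each this.server.tool(...) call registers one tool:
    // a name, a Zod input schema, and an async handler that returns MCP content.
    this.server.tool(
      "add",
      { a: z.number(), b: z.number() },
      async ({ a, b }) => ({
        content: [{ type: "text", text: String(a + b) }],
      })
    );
  }
}

The tool's name and schema are what MCP clients (such as Claude Desktop or the AI Playground) see when they list available tools.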
You can connect to your MCP server from the Cloudflare AI Playground, which is a remote MCP client. In the playground, enter your deployed MCP server URL:
remote-mcp-server-authless.<your-account>.workers.dev/sse
You can also connect to your remote MCP server from local MCP clients by using the mcp-remote proxy.
To connect to your MCP server from Claude Desktop, follow Anthropic's Quickstart and within Claude Desktop go to Settings > Developer > Edit Config.
Update with this configuration:
{
  "mcpServers": {
    "calculator": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "http://localhost:8787/sse" // or remote-mcp-server-authless.your-account.workers.dev/sse
      ]
    }
  }
}
Restart Claude and you should see the tools become available.
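The http://localhost:8787/sse URL in the configuration above assumes the server is running locally. With a standard Wrangler setup (an assumption; the template may wrap this in an npm script), you can start a local development server, which listens on port 8787 by default, with:
npx wrangler dev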
An MCP server for Reptor/SysReptor that exposes the reptor CLI tool as a programmable service, configured via environment variables.
A Docker Compose-based collection of MCP servers for LLM workflows, featuring centralized configuration and management scripts.
Provides always up-to-date Terragrunt documentation.
Converts LaTeX mathematical expressions to MathML format using MathJax-node.
An MCP server for the transformer.bee service, configurable via environment variables.
A code observability MCP server enabling dynamic code analysis based on OTEL/APM data to assist with code reviews, identify and fix issues, and highlight risky code.
An open-source library to connect any LLM to any MCP server, enabling the creation of custom agents with tool access.
An AI-driven platform for frontend semantic cognition and automation.
Works with dataset metadata, supporting validation and creation of MLCommons Croissant files.
Ingests and serves Swagger/OpenAPI specifications and Postman collections as MCP tools. Requires a config.json for API and authentication setup.