This example allows you to deploy a remote MCP server on Cloudflare Workers that doesn't require authentication.
This will deploy your MCP server to a URL like: remote-mcp-server-authless.<your-account>.workers.dev/sse
Alternatively, you can use the command line below to create the remote MCP server on your local machine:
npm create cloudflare@latest -- my-mcp-server --template=cloudflare/ai/demos/remote-mcp-authless
To add your own tools to the MCP server, define each tool inside the init() method of src/index.ts using this.server.tool(...).
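As a minimal sketch of what this looks like, assuming the template's default McpAgent/McpServer setup and zod for input validation (the "add" tool name and its parameters are illustrative, not part of the template contract):

```typescript
import { McpAgent } from "agents/mcp";
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod";

export class MyMCP extends McpAgent {
  server = new McpServer({ name: "Calculator", version: "1.0.0" });

  async init() {
    // Register a simple "add" tool: zod schemas describe the inputs,
    // and the handler returns MCP content blocks.
    this.server.tool(
      "add",
      { a: z.number(), b: z.number() },
      async ({ a, b }) => ({
        content: [{ type: "text", text: String(a + b) }],
      })
    );
  }
}
```

Each call to this.server.tool(...) registers one tool; connected MCP clients discover the tool names and schemas automatically.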
You can connect to your MCP server from the Cloudflare AI Playground, which is a remote MCP client, by entering your deployed server URL:
remote-mcp-server-authless.<your-account>.workers.dev/sse
You can also connect to your remote MCP server from local MCP clients by using the mcp-remote proxy.
To connect to your MCP server from Claude Desktop, follow Anthropic's Quickstart and within Claude Desktop go to Settings > Developer > Edit Config.
Update with this configuration:
{
  "mcpServers": {
    "calculator": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "http://localhost:8787/sse" // or remote-mcp-server-authless.your-account.workers.dev/sse
      ]
    }
  }
}
Restart Claude Desktop and you should see the tools become available.
Interact with Alpaca's Trading API for stocks, options, portfolios, and real-time market data using LLMs.
A natural language interface for cell-cell communication analysis using the Liana framework.
Execute MATLAB scripts and functions via MCP clients. Requires a local MATLAB installation.
A Retrieval-Augmented Generation (RAG) server for document processing, vector storage, and intelligent Q&A, powered by the Model Context Protocol.
Execute terminal commands for malware analysis. Requires Node.js 18 or higher.
A server for securely executing commands on the host system, requiring Java 21 or higher.
A Model Context Protocol server that provides access to the connpass users API v2, utilizing Gemini for grounding.
A server for integrating with the Google Gemini CLI to perform AI-powered tasks.
An SSE-based MCP server that allows LLM-powered applications to interact with OCI registries. It provides tools for retrieving information about container images, listing tags, and more.
An AI-driven platform for frontend semantic cognition and automation.