OpenRouter MCP Client for Cursor
A Model Context Protocol (MCP) client for Cursor that uses OpenRouter.ai to access multiple AI models. Requires an OpenRouter API key.
Requirements
- Node.js v18.0.0 or later (important!)
- OpenRouter API key (get one at openrouter.ai/keys)
Features
- Connect to OpenRouter.ai via MCP
- Access multiple AI models from various providers (Google, DeepSeek, Meta, etc.)
- Use the MCP transport mechanism to communicate with Cursor
- Cache model information to reduce API calls
- Support for both free and paid models
- Multi-model completion utility to combine results from multiple models
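The multi-model completion utility can be pictured as a fan-out over OpenRouter's chat completions endpoint followed by a merge step. The sketch below is illustrative, not the client's actual code: `buildRequest`, `fanOut`, and `combineResults` are hypothetical names; only the `https://openrouter.ai/api/v1/chat/completions` endpoint and payload shape come from OpenRouter's public API.

```typescript
// Illustrative sketch only; function names here are hypothetical, not the
// client's real API. The endpoint and payload follow OpenRouter's public
// chat completions API.

interface ModelResult {
  model: string;
  text: string;
}

// Build the JSON payload OpenRouter expects for a chat completion.
function buildRequest(model: string, prompt: string) {
  return {
    model,
    messages: [{ role: "user", content: prompt }],
  };
}

// Query several models concurrently; keep whichever calls succeed.
async function fanOut(
  models: string[],
  prompt: string,
  apiKey: string,
): Promise<ModelResult[]> {
  const settled = await Promise.allSettled(
    models.map(async (model) => {
      const res = await fetch("https://openrouter.ai/api/v1/chat/completions", {
        method: "POST",
        headers: {
          Authorization: `Bearer ${apiKey}`,
          "Content-Type": "application/json",
        },
        body: JSON.stringify(buildRequest(model, prompt)),
      });
      const data = await res.json();
      return { model, text: data.choices[0].message.content as string };
    }),
  );
  return settled
    .filter((s): s is PromiseFulfilledResult<ModelResult> => s.status === "fulfilled")
    .map((s) => s.value);
}

// Combine per-model answers into one labeled block of text.
function combineResults(results: ModelResult[]): string {
  return results.map((r) => `[${r.model}]\n${r.text}`).join("\n\n");
}
```

Using `Promise.allSettled` (rather than `Promise.all`) means one slow or failing model does not discard the answers from the others.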
Available Models
This client provides access to all models available on OpenRouter, including:
- Google Gemini 2.5 Pro
- DeepSeek Chat v3
- Meta Llama 3.1
- DeepSeek R1
- Qwen Coder
- Mistral Small 3.1
- And many more!
Quick Installation
The easiest way to install is to use the setup script:
# Clone the repository
git clone https://your-repo-url/openrouter-mcp-client.git
cd openrouter-mcp-client
# Run the installation script
node install.cjs
The script will:
- Help you create a .env file with your OpenRouter API key
- Install all dependencies
- Build the project
- Provide next steps
Manual Installation
If you prefer to install manually:
# Clone the repository
git clone https://your-repo-url/openrouter-mcp-client.git
cd openrouter-mcp-client
# Install dependencies
npm install
# Copy environment file and edit it with your API key
cp .env.example .env
# Edit .env file with your OpenRouter API key
# Build the project
npm run build
Configuration
Edit the .env file with your OpenRouter API key and default model:
OPENROUTER_API_KEY=your_api_key_here
OPENROUTER_DEFAULT_MODEL=google/gemini-2.5-pro-exp-03-25:free
Get your API key from openrouter.ai/keys.
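At startup, the client reads these two variables from the environment. The helper below is a hedged sketch of that step, not the client's actual code: `loadConfig` and its fallback behavior are assumptions for illustration.

```typescript
// Illustrative: loadConfig is a hypothetical helper showing how the two
// environment variables might be read and validated at startup.

interface Config {
  apiKey: string;
  defaultModel: string;
}

function loadConfig(env: Record<string, string | undefined> = process.env): Config {
  const apiKey = env.OPENROUTER_API_KEY;
  if (!apiKey) {
    throw new Error("OPENROUTER_API_KEY is not set; add it to your .env file");
  }
  return {
    apiKey,
    // Fall back to a free model when no default is configured.
    defaultModel: env.OPENROUTER_DEFAULT_MODEL ?? "google/gemini-2.5-pro-exp-03-25:free",
  };
}
```

Failing fast with a clear message when the key is missing is friendlier than letting the first API call return a 401.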
Cursor Integration
To use this client with Cursor, you need to update Cursor's MCP configuration file:
- Find Cursor's configuration directory:
  - Windows: %USERPROFILE%\.cursor\
  - macOS: ~/.cursor/
  - Linux: ~/.cursor/
- Edit or create the mcp.json file in that directory. Add a configuration like this:
{
"mcpServers": {
"custom-openrouter-client": {
"command": "node",
"args": [
"FULL_PATH_TO/openrouter-mcp-client/dist/index.js"
],
"env": {
"OPENROUTER_API_KEY": "your_api_key_here",
"OPENROUTER_DEFAULT_MODEL": "google/gemini-2.5-pro-exp-03-25:free"
}
}
}
}
Replace FULL_PATH_TO with the actual path to your client installation.
- Restart Cursor
- Select the client:
  - Open Cursor
  - Press Ctrl+Shift+L (Windows/Linux) or Cmd+Shift+L (macOS) to open the model selector
  - Choose "custom-openrouter-client" from the list
Direct Testing (without MCP)
Uncomment the test functions in src/index.ts to test direct API interaction:
// Uncomment to test the direct API
testDirectApi().catch(console.error);
testMultiModelCompletion().catch(console.error);
Then run:
npm start
Development
# Watch mode for development
npm run dev
Troubleshooting
Node.js Version Requirements
Important: This project requires Node.js v18.0.0 or later. If you're using an older version, you will see EBADENGINE warnings and may encounter errors. To check your Node.js version:
node --version
If you have an older version, download and install the latest LTS version from nodejs.org.
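The version requirement can also be enforced programmatically so the client fails with a clear message instead of cryptic EBADENGINE errors. The helper below is a generic sketch (the function names are illustrative; wiring it into the client's startup is up to you).

```typescript
// Small helper to enforce the Node.js >= 18 requirement at startup.
// Function names are illustrative, not part of the client's API.

function majorVersion(version: string): number {
  // process.version looks like "v18.19.0"; strip the leading "v".
  return parseInt(version.replace(/^v/, "").split(".")[0], 10);
}

function assertNodeVersion(minMajor = 18, version: string = process.version): void {
  if (majorVersion(version) < minMajor) {
    throw new Error(
      `Node.js v${minMajor}+ is required, but ${version} is running. ` +
        "Install the latest LTS from nodejs.org.",
    );
  }
}
```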
Module System Errors
If you encounter errors related to ES modules vs CommonJS:
- The main codebase uses ES modules (indicated by "type": "module" in package.json)
- The installation script uses CommonJS (it has a .cjs extension)
- Make sure to run the installation script with node install.cjs
Cursor Not Connecting
If Cursor doesn't seem to connect to your client:
- Make sure the path in mcp.json is correct and uses forward slashes
- Check that you've built the client with npm run build
- Verify that your OpenRouter API key is correct in the env settings
- Check Cursor logs for any errors
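Backslashes in the mcp.json path are a common pitfall on Windows. A one-line helper (purely illustrative; the function name is hypothetical) shows the conversion:

```typescript
// Turn a Windows-style path into the forward-slash form that works
// reliably in mcp.json. Purely illustrative.

function toForwardSlashes(p: string): string {
  return p.replace(/\\/g, "/");
}

// Example:
// toForwardSlashes("C:\\Users\\me\\openrouter-mcp-client\\dist\\index.js")
//   → "C:/Users/me/openrouter-mcp-client/dist/index.js"
```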
Smithery Deployment
You can deploy this MCP client to Smithery to make it available to various AI agents and applications.
Prerequisites
- GitHub account
- OpenRouter API key
Steps to Deploy
- Fork this repository to your GitHub account
- Sign in to Smithery at smithery.ai using your GitHub account
- Add a new server:
  - Click "Add Server" in the Smithery dashboard
  - Select your forked repository
  - Configure the build settings:
    - Set the base directory to the repository root
    - Ensure the Dockerfile and smithery.yaml are detected
- Deploy your server:
  - Smithery will automatically build and deploy your MCP server
  - Once deployed, users can configure it with their own OpenRouter API key
Using Your Deployed Server
After deployment, users can access your server through the Smithery registry:
- In their MCP client (like Claude, or any other MCP-compatible client), add the server using Smithery's registry
- Configure their OpenRouter API key and preferred default model
- Start using the server to access multiple AI models through OpenRouter
Smithery vs. Local Installation
- Smithery: Easier for distribution to others; no need for users to clone and build the repository
- Local Installation: Better for personal use and development; more control over the environment
Choose the approach that best fits your needs.