OpenRouter MCP Client for Cursor
A Model Context Protocol (MCP) client for Cursor that uses OpenRouter.ai to access multiple AI models. Requires an OpenRouter API key.
Requirements
- Node.js v18.0.0 or later (important!)
- OpenRouter API key (get one at openrouter.ai/keys)
Features
- Connect to OpenRouter.ai via MCP
- Access multiple AI models from various providers (Google, DeepSeek, Meta, etc.)
- Use MCP transport mechanism to communicate with Cursor
- Cache model information to reduce API calls
- Support for both free and paid models
- Multi-model completion utility to combine results from multiple models
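The multi-model completion utility can be sketched as a fan-out over several models whose results are then combined. The function names, the combine strategy (longest non-empty answer), and the injected `complete` callback below are illustrative assumptions, not the client's actual API:

```typescript
// Hypothetical sketch of a multi-model completion fan-out.

interface ModelResult {
  model: string;
  text: string;
}

// Pure helper: pick the longest non-empty completion from the collected results.
// (Choosing "longest" is just one possible combine strategy.)
export function pickBest(results: ModelResult[]): ModelResult | undefined {
  return results
    .filter((r) => r.text.trim().length > 0)
    .sort((a, b) => b.text.length - a.text.length)[0];
}

// Fan a prompt out to several models in parallel; failed requests are
// dropped rather than failing the whole call.
export async function multiModelComplete(
  prompt: string,
  models: string[],
  complete: (model: string, prompt: string) => Promise<string>,
): Promise<ModelResult | undefined> {
  const settled = await Promise.allSettled(
    models.map(async (model) => ({ model, text: await complete(model, prompt) })),
  );
  const ok = settled
    .filter((s): s is PromiseFulfilledResult<ModelResult> => s.status === "fulfilled")
    .map((s) => s.value);
  return pickBest(ok);
}
```

Injecting `complete` keeps the combining logic testable without any network access.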
Available Models
This client provides access to all models available on OpenRouter, including:
- Google Gemini 2.5 Pro
- DeepSeek Chat v3
- Meta Llama 3.1
- DeepSeek R1
- Qwen Coder
- Mistral Small 3.1
- And many more!
Quick Installation
The easiest way to install is to use the setup script:
# Clone the repository
git clone https://your-repo-url/openrouter-mcp-client.git
cd openrouter-mcp-client
# Run the installation script
node install.cjs
The script will:
- Help you create a .env file with your OpenRouter API key
- Install all dependencies
- Build the project
- Provide next steps
Manual Installation
If you prefer to install manually:
# Clone the repository
git clone https://your-repo-url/openrouter-mcp-client.git
cd openrouter-mcp-client
# Install dependencies
npm install
# Copy environment file and edit it with your API key
cp .env.example .env
# Edit .env file with your OpenRouter API key
# Build the project
npm run build
Configuration
Edit the .env file with your OpenRouter API key and default model:
OPENROUTER_API_KEY=your_api_key_here
OPENROUTER_DEFAULT_MODEL=google/gemini-2.5-pro-exp-03-25:free
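The client reads these values at startup. A minimal sketch of how the configuration might be resolved, assuming the fallback model matches the `.env` example above (the function and type names are illustrative, not the client's real internals):

```typescript
// Sketch: resolve configuration from environment variables with a fallback default.

interface Config {
  apiKey: string;
  defaultModel: string;
}

export function loadConfig(env: Record<string, string | undefined>): Config {
  const apiKey = env.OPENROUTER_API_KEY;
  if (!apiKey) {
    // Fail fast with a pointer to the fix rather than a confusing 401 later.
    throw new Error("OPENROUTER_API_KEY is not set; add it to your .env file");
  }
  return {
    apiKey,
    defaultModel:
      env.OPENROUTER_DEFAULT_MODEL ?? "google/gemini-2.5-pro-exp-03-25:free",
  };
}

// Usage: const config = loadConfig(process.env);
```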
Get your API key from openrouter.ai/keys.
Cursor Integration
To use this client with Cursor, you need to update Cursor's MCP configuration file:
- Find Cursor's configuration directory:
  - Windows: %USERPROFILE%\.cursor\
  - macOS: ~/.cursor/
  - Linux: ~/.cursor/
- Edit or create the mcp.json file in that directory. Add a configuration like this:
{
"mcpServers": {
"custom-openrouter-client": {
"command": "node",
"args": [
"FULL_PATH_TO/openrouter-mcp-client/dist/index.js"
],
"env": {
"OPENROUTER_API_KEY": "your_api_key_here",
"OPENROUTER_DEFAULT_MODEL": "google/gemini-2.5-pro-exp-03-25:free"
}
}
}
}
Replace FULL_PATH_TO with the actual path to your client installation.
- Restart Cursor
- Select the client:
  - Open Cursor
  - Press Ctrl+Shift+L (Windows/Linux) or Cmd+Shift+L (macOS) to open the model selector
  - Choose "custom-openrouter-client" from the list
Direct Testing (without MCP)
Uncomment the test functions in src/index.ts to test direct API interaction:
// Uncomment to test the direct API
testDirectApi().catch(console.error);
testMultiModelCompletion().catch(console.error);
Then run:
npm start
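A direct call goes to OpenRouter's OpenAI-compatible chat completions endpoint. The sketch below follows OpenRouter's documented request shape, but the function names are illustrative and error handling is simplified; it uses the built-in fetch from Node.js 18+:

```typescript
// Sketch of a direct OpenRouter chat completion request (Node 18+, built-in fetch).

// Pure helper: build the JSON body for the chat completions endpoint.
export function buildChatBody(model: string, prompt: string) {
  return {
    model,
    messages: [{ role: "user", content: prompt }],
  };
}

export async function directComplete(
  apiKey: string,
  model: string,
  prompt: string,
): Promise<string> {
  const res = await fetch("https://openrouter.ai/api/v1/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(buildChatBody(model, prompt)),
  });
  if (!res.ok) {
    throw new Error(`OpenRouter request failed: ${res.status}`);
  }
  const data = await res.json();
  // The response follows the OpenAI chat completion shape.
  return data.choices[0].message.content;
}
```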
Development
# Watch mode for development
npm run dev
Troubleshooting
Node.js Version Requirements
Important: This project requires Node.js v18.0.0 or later. If you're using an older version, you will see EBADENGINE warnings and may encounter errors. To check your Node.js version:
node --version
If you have an older version, download and install the latest LTS version from nodejs.org.
Module System Errors
If you encounter errors related to ES modules vs CommonJS:
- The main codebase uses ES modules (indicated by "type": "module" in package.json)
- The installation script uses CommonJS (hence its .cjs extension)
- Make sure to run the installation script with node install.cjs
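For reference, the field that switches the main codebase to ES modules is a single line in package.json (the surrounding fields shown here are illustrative):

```json
{
  "name": "openrouter-mcp-client",
  "type": "module",
  "main": "dist/index.js"
}
```

Because of `"type": "module"`, any file that must stay CommonJS (like the installer) needs the explicit .cjs extension.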
Cursor Not Connecting
If Cursor doesn't seem to connect to your client:
- Make sure the path in mcp.json is correct and uses forward slashes
- Check that you've built the client with npm run build
- Verify that your OpenRouter API key is correct in the env settings
- Check Cursor logs for any errors
Smithery Deployment
You can deploy this MCP client to Smithery to make it available to various AI agents and applications.
Prerequisites
- GitHub account
- OpenRouter API key
Steps to Deploy
1. Fork this repository to your GitHub account
2. Sign in to Smithery at smithery.ai using your GitHub account
3. Add a new server:
   - Click "Add Server" in the Smithery dashboard
   - Select your forked repository
   - Configure the build settings:
     - Set the base directory to the repository root
     - Ensure the Dockerfile and smithery.yaml are detected
4. Deploy your server:
   - Smithery will automatically build and deploy your MCP server
   - Once deployed, users can configure it with their own OpenRouter API key
Using Your Deployed Server
After deployment, users can access your server through the Smithery registry:
- In their MCP client (like Claude, or any other MCP-compatible client), add the server using Smithery's registry
- Configure their OpenRouter API key and preferred default model
- Start using the server to access multiple AI models through OpenRouter
Smithery vs. Local Installation
- Smithery: Easier for distribution to others; no need for users to clone and build the repository
- Local Installation: Better for personal use and development; more control over the environment
Choose the approach that best fits your needs.