Render MCP Server
Deploy to Render.com directly through AI assistants.
This MCP (Model Context Protocol) server allows AI assistants like Claude to interact with the Render API, enabling deployment and management of services on Render.com.
Features
- List all services in your Render account
- Get details of a specific service
- Deploy services
- Create new services
- Delete services
- Get deployment history
- Manage environment variables
- Manage custom domains
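Each of these features corresponds to a call against Render's public REST API (`https://api.render.com/v1`, authenticated with a Bearer token). As a rough sketch of the mapping — `buildRenderRequest` is an illustrative helper, not part of this package:

```javascript
// Sketch: how an MCP tool call might translate into a Render API request.
// buildRenderRequest is a hypothetical helper for illustration only.
function buildRenderRequest(path, apiKey) {
  return {
    url: `https://api.render.com/v1${path}`,
    options: {
      headers: {
        Authorization: `Bearer ${apiKey}`,
        Accept: "application/json",
      },
    },
  };
}

// Example: the "list all services" feature would issue GET /v1/services.
const req = buildRenderRequest("/services", "rnd_example_key");
console.log(req.url); // https://api.render.com/v1/services
```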
Installation
npm install -g @niyogi/render-mcp
Configuration
1. Get your Render API key from the Render Dashboard.
2. Configure the MCP server with your key:
node bin/render-mcp.js configure --api-key=YOUR_API_KEY
Alternatively, you can run node bin/render-mcp.js configure without the --api-key flag to be prompted for your API key.
Usage
Starting the Server
node bin/render-mcp.js start
Checking Configuration
node bin/render-mcp.js config
Running Diagnostics
node bin/render-mcp.js doctor
Note: If you've installed the package globally, you can also use the shorter commands:
render-mcp start
render-mcp config
render-mcp doctor
Using with Different AI Assistants
Using with Cline
1. Add the following to your Cline MCP settings file:

   {
     "mcpServers": {
       "render": {
         "command": "node",
         "args": ["/path/to/render-mcp/bin/render-mcp.js", "start"],
         "env": {
           "RENDER_API_KEY": "your-render-api-key"
         },
         "disabled": false,
         "autoApprove": []
       }
     }
   }

2. Restart Cline for the changes to take effect.
3. You can now interact with Render through Claude:

   Claude, please deploy my web service to Render
Using with Windsurf/Cursor
1. Install the render-mcp package:

   npm install -g @niyogi/render-mcp

2. Configure your API key:

   node bin/render-mcp.js configure --api-key=YOUR_API_KEY

3. Start the MCP server in a separate terminal:

   node bin/render-mcp.js start

4. In Windsurf/Cursor settings, add the Render MCP server:
   - Server Name: render
   - Server Type: stdio
   - Command: node
   - Arguments: ["/path/to/render-mcp/bin/render-mcp.js", "start"]
5. You can now use the Render commands in your AI assistant.
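For Cursor specifically, the same stdio settings can also be expressed as JSON (a sketch assuming Cursor's `mcpServers` config file format; substitute your actual install path):

```json
{
  "mcpServers": {
    "render": {
      "command": "node",
      "args": ["/path/to/render-mcp/bin/render-mcp.js", "start"]
    }
  }
}
```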
Using with Claude API Integrations
For custom applications using Claude's API directly:
1. Ensure the render-mcp server is running:

   node bin/render-mcp.js start

2. In your application, when sending messages to Claude via the API, include the MCP server connections in your request:

   {
     "mcpConnections": [
       {
         "name": "render",
         "transport": {
           "type": "stdio",
           "command": "node",
           "args": ["/path/to/render-mcp/bin/render-mcp.js", "start"]
         }
       }
     ]
   }

3. Claude will now be able to interact with your Render MCP server.
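The connection descriptor shown above can be built programmatically before attaching it to an API request. A minimal sketch — the object shape mirrors the JSON fragment in this section, and the function name is illustrative:

```javascript
// Build the MCP connection descriptor for the Render server.
// renderMcpConnection is a hypothetical helper; the shape mirrors
// the mcpConnections JSON shown above.
function renderMcpConnection(serverPath) {
  return {
    name: "render",
    transport: {
      type: "stdio",
      command: "node",
      args: [serverPath, "start"],
    },
  };
}

const payloadExtras = {
  mcpConnections: [renderMcpConnection("/path/to/render-mcp/bin/render-mcp.js")],
};
console.log(JSON.stringify(payloadExtras, null, 2));
```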
Example Prompts
Here are some example prompts you can use with Claude once the MCP server is connected:
- "List all my services on Render"
- "Deploy my web service with ID srv-123456"
- "Create a new static site on Render from my GitHub repo"
- "Show me the deployment history for my service"
- "Add an environment variable to my service"
- "Add a custom domain to my service"
Development
Building from Source
git clone https://github.com/niyogi/render-mcp.git
cd render-mcp
npm install
npm run build
Running Tests
npm test
License
MIT