# Custom MCP Server

A Model Context Protocol (MCP) server built with Next.js, providing useful tools and utilities through both HTTP and Server-Sent Events (SSE) transports, with Redis-backed state management.
## Features

### Available Tools
- `echo` - Echo any message back (useful for testing)
- `get-current-time` - Get the current timestamp and ISO date
- `calculate` - Safely evaluate basic mathematical expressions
### Transport Methods

- HTTP Transport (`/mcp`) - stateless HTTP requests (works without Redis)
- SSE Transport (`/sse`) - Server-Sent Events with Redis for state management
### Security Features
- Rate limiting (100 requests per minute)
- Safe mathematical expression evaluation
- Input sanitization and validation
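To illustrate how safe expression evaluation can work, here is a minimal whitelist-based sketch. This is an assumption for illustration only: the function name `safeCalculate` and the exact checks are hypothetical, and the real implementation in `app/[transport]/route.ts` may differ.

```typescript
// Hypothetical sketch of whitelist-based arithmetic evaluation.
// Not the server's actual code; illustrative only.
function safeCalculate(expression: string): number {
  // Allow only digits, whitespace, and basic arithmetic characters.
  // Any identifier (e.g. "require", "process") fails this check.
  if (!/^[\d\s+\-*/().%]+$/.test(expression)) {
    throw new Error("Expression contains disallowed characters");
  }
  // With the character set restricted, the expression cannot reference
  // any names, so evaluating it cannot reach the surrounding scope.
  const result = Function(`"use strict"; return (${expression});`)();
  if (typeof result !== "number" || !Number.isFinite(result)) {
    throw new Error("Expression did not evaluate to a finite number");
  }
  return result;
}

console.log(safeCalculate("15 * 4 + 10")); // 70
```

The key design point is that validation happens on the raw input before anything is evaluated, so rejection is the default and only a narrow arithmetic subset gets through.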
## Quick Start

### Prerequisites
- Node.js 18+
- npm or yarn
- Docker (optional, for local Redis)
### Setup

1. Clone the repository and install dependencies:

   ```bash
   npm install
   ```

2. Run the automated setup:

   ```bash
   npm run setup
   ```

   This will:
   - Create environment configuration
   - Set up Redis (Docker) if available
   - Start the development server automatically

3. Manual start (alternative):

   ```bash
   npm run dev
   ```
The server will be available at http://localhost:3000
## Testing

### Quick Tests

```bash
# Test HTTP transport
npm run test:http

# Test SSE transport (requires Redis)
npm run test:sse

# Test with the Claude Desktop (stdio) protocol
npm run test:stdio

# Comprehensive tool testing
npm run test:tools
```
### Manual Testing

You can test the MCP server manually using `curl`:
```bash
# List available tools
curl -X POST http://localhost:3000/mcp \
  -H "Content-Type: application/json" \
  -d '{
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list"
  }'

# Call the echo tool
curl -X POST http://localhost:3000/mcp \
  -H "Content-Type: application/json" \
  -d '{
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
      "name": "echo",
      "arguments": {
        "message": "Hello World!"
      }
    }
  }'

# Calculate an expression
curl -X POST http://localhost:3000/mcp \
  -H "Content-Type: application/json" \
  -d '{
    "jsonrpc": "2.0",
    "id": 3,
    "method": "tools/call",
    "params": {
      "name": "calculate",
      "arguments": {
        "expression": "15 * 4 + 10"
      }
    }
  }'
```
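The same JSON-RPC payloads can also be built programmatically. The helper below is hypothetical (`buildToolCall` is not part of the server code); it simply constructs a `tools/call` request body matching the `curl` examples above.

```typescript
// Hypothetical helper that builds a JSON-RPC 2.0 tools/call request body,
// mirroring the curl examples. Not part of the server code.
function buildToolCall(
  id: number,
  name: string,
  args: Record<string, unknown>
) {
  return {
    jsonrpc: "2.0" as const,
    id,
    method: "tools/call" as const,
    params: { name, arguments: args },
  };
}

// The resulting object can be POSTed to http://localhost:3000/mcp
// with fetch() or any HTTP client.
const body = buildToolCall(2, "echo", { message: "Hello World!" });
console.log(JSON.stringify(body, null, 2));
```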
## Configuration

### Environment Variables

Create a `.env.local` file:

```bash
# Local Redis (Docker)
REDIS_URL=redis://localhost:6379

# Upstash Redis (Production)
UPSTASH_REDIS_REST_URL=your-upstash-url
UPSTASH_REDIS_REST_TOKEN=your-upstash-token
```
### Redis Setup

The server automatically detects and uses Redis in this priority order:

1. Upstash Redis (if `UPSTASH_REDIS_REST_URL` and `UPSTASH_REDIS_REST_TOKEN` are set)
2. Local Redis (if `REDIS_URL` is set)
3. No Redis (HTTP transport only)
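Conceptually, that priority order reduces to a small decision function. The sketch below is an assumption for illustration; the actual detection logic lives in `lib/redis.ts` and may differ.

```typescript
// Sketch of the Redis detection priority described above.
// Illustrative only; see lib/redis.ts for the real implementation.
type RedisMode = "upstash" | "local" | "none";

function detectRedisMode(env: Record<string, string | undefined>): RedisMode {
  // 1. Upstash takes precedence when both REST credentials are present.
  if (env.UPSTASH_REDIS_REST_URL && env.UPSTASH_REDIS_REST_TOKEN) {
    return "upstash";
  }
  // 2. Otherwise fall back to a direct Redis connection string.
  if (env.REDIS_URL) {
    return "local";
  }
  // 3. No Redis: only the stateless HTTP transport is available.
  return "none";
}

console.log(detectRedisMode(process.env));
```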
#### Local Redis with Docker

```bash
# The setup script handles this automatically, but you can also run it manually:
docker run -d --name redis-mcp -p 6379:6379 redis:alpine
```
#### Upstash Redis (Recommended for Production)

1. Create an Upstash Redis database at upstash.com
2. Add the connection details to your `.env.local`
3. The server will automatically detect and use it
## Integration with AI Tools

### Claude Desktop

Add the following to your Claude Desktop configuration (`claude_desktop_config.json`):
```json
{
  "mcpServers": {
    "custom-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-remote",
        "http://localhost:3000/mcp"
      ]
    }
  }
}
```
Configuration file locations:

- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`
### Cursor IDE

For Cursor 0.48.0 or later (direct SSE support):

```json
{
  "mcpServers": {
    "custom-mcp": {
      "url": "http://localhost:3000/sse"
    }
  }
}
```
For older Cursor versions:
```json
{
  "mcpServers": {
    "custom-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-remote",
        "http://localhost:3000/mcp"
      ]
    }
  }
}
```
## Development

### Project Structure

```
custom-mcp-server/
├── app/
│   ├── [transport]/
│   │   └── route.ts            # Main MCP server logic
│   ├── layout.tsx              # Root layout
│   └── page.tsx                # Home page
├── lib/
│   └── redis.ts                # Redis utilities
├── scripts/
│   ├── setup.mjs               # Automated setup
│   ├── test-http-client.mjs    # HTTP transport tests
│   ├── test-sse-client.mjs     # SSE transport tests
│   └── test-tools.mjs          # Comprehensive tool tests
├── package.json
├── next.config.ts
└── README.md
```
### Adding New Tools

1. Define the tool in `app/[transport]/route.ts`:
```typescript
const tools = {
  // ... existing tools
  myNewTool: {
    name: "my-new-tool",
    description: "Description of what your tool does",
    inputSchema: {
      type: "object",
      properties: {
        param1: {
          type: "string",
          description: "Description of parameter"
        }
      },
      required: ["param1"]
    }
  }
};
```
2. Add the handler:
```typescript
const toolHandlers = {
  // ... existing handlers
  "my-new-tool": async ({ param1 }: { param1: string }) => {
    // Your tool logic here
    return {
      content: [
        {
          type: "text",
          text: `Result: ${param1}`
        }
      ]
    };
  }
};
```
### Testing Your Changes

```bash
# Run all tests
npm run test:tools

# Test specific functionality
npm run test:http
npm run test:sse
```
## API Reference

### `tools/list`

Get all available tools:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/list"
}
```
### `tools/call`

Call a specific tool:

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "tool-name",
    "arguments": {
      "param": "value"
    }
  }
}
```
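A successful `tools/call` returns a result whose `content` array holds typed parts. The response below is illustrative: the exact `text` value depends on the tool's handler.

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "result": {
    "content": [
      {
        "type": "text",
        "text": "Hello World!"
      }
    ]
  }
}
```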
## Deployment

### Vercel (Recommended)

1. Deploy to Vercel:

   ```bash
   vercel
   ```

2. Add the environment variables in the Vercel dashboard:
   - `UPSTASH_REDIS_REST_URL`
   - `UPSTASH_REDIS_REST_TOKEN`

3. Update your AI tool configurations to use the deployed URLs:
   - `https://your-app.vercel.app/mcp`
   - `https://your-app.vercel.app/sse`
### Other Platforms
The server is a standard Next.js application and can be deployed to any platform that supports Node.js:
- Netlify
- Railway
- Render
- DigitalOcean App Platform
## Contributing

1. Fork the repository
2. Create a feature branch: `git checkout -b feature/my-new-feature`
3. Make your changes and add tests
4. Run the test suite: `npm run test:tools`
5. Commit your changes: `git commit -am 'Add some feature'`
6. Push to the branch: `git push origin feature/my-new-feature`
7. Submit a pull request
## License

MIT License - see the `LICENSE` file for details.
## Troubleshooting

### Common Issues

**Server not starting:**

- Check that port 3000 is available
- Ensure all dependencies are installed: `npm install`

**Redis connection issues:**

- Verify Docker is running: `docker ps`
- Check the Redis container status: `docker ps -a | grep redis-mcp`
- Restart Redis: `docker restart redis-mcp`

**AI tool not detecting the server:**

- Ensure the server is running and accessible
- Check the configuration file syntax (valid JSON)
- Restart your AI tool after configuration changes
- Verify the server URL is correct

**Tool calls failing:**

- Check the server logs for error messages
- Test tools manually with `npm run test:tools`
- Verify the tool parameters match the expected schema
### Debug Mode

Enable debug logging by setting the environment variable:

```bash
DEBUG=1 npm run dev
```
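A `DEBUG`-gated logger can be as simple as the sketch below. Both helper names are hypothetical; the server's actual logging may differ.

```typescript
// Hypothetical DEBUG-gated logging helpers; illustrative only.
function isDebugEnabled(env: Record<string, string | undefined>): boolean {
  return env.DEBUG === "1";
}

function debugLog(message: string): void {
  // Messages are emitted only when the process was started with DEBUG=1.
  if (isDebugEnabled(process.env)) {
    console.log(`[mcp:debug] ${message}`);
  }
}

debugLog("server starting"); // printed only when DEBUG=1
```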
## Support
- Create an issue on GitHub for bug reports
- Check existing issues for common problems
- Review the test scripts for usage examples