A production-grade MCP server for the DeepSeek API, featuring advanced code review capabilities, efficient file management, and API account management.
```sh
# Clone and build
git clone https://github.com/your-username/DeepseekMCP
cd DeepseekMCP
go build -o bin/deepseek-mcp
```
```sh
# Stdio mode (default - for Claude Desktop)
export DEEPSEEK_API_KEY=your_api_key
export DEEPSEEK_MODEL=deepseek-chat
./bin/deepseek-mcp

# HTTP mode (for REST API integration)
export DEEPSEEK_API_KEY=your_api_key
export DEEPSEEK_HTTP_ENABLED=true
export DEEPSEEK_HTTP_PORT=8080
./bin/deepseek-mcp

# Or use command-line flags
./bin/deepseek-mcp --http --http-port 8080 --http-host localhost
```
```json
{
  "mcpServers": {
    "deepseek": {
      "command": "/Your/project/path/bin/deepseek-mcp",
      "env": {
        "DEEPSEEK_API_KEY": "YOUR_API_KEY",
        "DEEPSEEK_MODEL": "deepseek-chat"
      }
    }
  }
}
```
For web applications, microservices, or API gateways:
```sh
# Start HTTP server
./bin/deepseek-mcp --http --http-port 8080

# Test with curl
curl -X POST http://localhost:8080/mcp/tools/call \
  -H "Content-Type: application/json" \
  -d '{
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
      "name": "deepseek_ask",
      "arguments": {"query": "Explain Go channels"}
    }
  }'
```
| Variable | Description | Default |
|---|---|---|
| **API Configuration** | | |
| `DEEPSEEK_API_KEY` | DeepSeek API key | Required |
| `DEEPSEEK_MODEL` | Model ID from available models | `deepseek-chat` |
| `DEEPSEEK_SYSTEM_PROMPT` | System prompt for code review | Default code review prompt |
| `DEEPSEEK_SYSTEM_PROMPT_FILE` | Path to file containing system prompt | Empty |
| `DEEPSEEK_MAX_FILE_SIZE` | Max upload size (bytes) | `10485760` (10MB) |
| `DEEPSEEK_ALLOWED_FILE_TYPES` | Comma-separated MIME types | Common text/code types |
| `DEEPSEEK_TIMEOUT` | API timeout (seconds) | `90` |
| `DEEPSEEK_MAX_RETRIES` | Max API retries | `2` |
| `DEEPSEEK_INITIAL_BACKOFF` | Initial backoff time (seconds) | `1` |
| `DEEPSEEK_MAX_BACKOFF` | Maximum backoff time (seconds) | `10` |
| `DEEPSEEK_TEMPERATURE` | Model temperature (0.0-1.0) | `0.4` |
| **HTTP Transport Configuration** | | |
| `DEEPSEEK_HTTP_ENABLED` | Enable HTTP transport mode | `false` |
| `DEEPSEEK_HTTP_PORT` | HTTP server port | `8080` |
| `DEEPSEEK_HTTP_HOST` | HTTP server host | `localhost` |
| `DEEPSEEK_HTTP_ENDPOINT_PATH` | MCP endpoint path | `/mcp` |
| `DEEPSEEK_HTTP_STATELESS` | Enable stateless mode | `true` |
Example `.env`:
```env
# API Configuration
DEEPSEEK_API_KEY=your_api_key
DEEPSEEK_MODEL=deepseek-chat
DEEPSEEK_SYSTEM_PROMPT="Your custom code review prompt here"
# Alternative: load system prompt from file
# DEEPSEEK_SYSTEM_PROMPT_FILE=/path/to/prompt.txt
DEEPSEEK_MAX_FILE_SIZE=5242880  # 5MB
DEEPSEEK_ALLOWED_FILE_TYPES=text/x-go,text/markdown
DEEPSEEK_TEMPERATURE=0.7

# HTTP Transport (optional)
DEEPSEEK_HTTP_ENABLED=true
DEEPSEEK_HTTP_PORT=8080
DEEPSEEK_HTTP_HOST=localhost
DEEPSEEK_HTTP_ENDPOINT_PATH=/mcp
DEEPSEEK_HTTP_STATELESS=true
```
Currently, the server provides the following tools:
**`deepseek_ask`** — used for code analysis, review, and general queries, with optional file path inclusion.
```json
{
  "name": "deepseek_ask",
  "arguments": {
    "query": "Review this Go code for concurrency issues...",
    "model": "deepseek-chat",
    "systemPrompt": "Optional custom review instructions",
    "file_paths": ["main.go", "config.go"],
    "json_mode": false
  }
}
```
**`deepseek_models`** — lists all available DeepSeek models with their capabilities.
```json
{
  "name": "deepseek_models",
  "arguments": {}
}
```
**`deepseek_balance`** — checks your DeepSeek API account balance and availability status.
```json
{
  "name": "deepseek_balance",
  "arguments": {}
}
```
**`deepseek_token_estimate`** — estimates the token count for text or a file to help with quota management.
```json
{
  "name": "deepseek_token_estimate",
  "arguments": {
    "text": "Your text to estimate...",
    "file_path": "path/to/your/file.go"
  }
}
```
The following DeepSeek models are supported by default:
| Model ID | Description |
|---|---|
| `deepseek-chat` | General-purpose chat model balancing performance and efficiency |
| `deepseek-coder` | Specialized model for coding and technical tasks |
| `deepseek-reasoner` | Model optimized for reasoning and problem-solving tasks |
Note: The actual available models may vary based on your API access level. The server will automatically discover and make available all models you have access to through the DeepSeek API.
| Extension | MIME Type |
|---|---|
| `.go` | `text/x-go` |
| `.py` | `text/x-python` |
| `.js` | `text/javascript` |
| `.md` | `text/markdown` |
| `.java` | `text/x-java` |
| `.c`/`.h` | `text/x-c` |
| `.cpp`/`.hpp` | `text/x-c++` |
| 25+ more | See `getMimeTypeFromPath` in `deepseek.go` |
The server handles files directly through the `deepseek_ask` tool: files are passed via the `file_paths` array parameter. This direct file-handling approach eliminates the need for separate file upload/management endpoints.
For integrations that require structured data output, the server supports JSON mode: set `json_mode: true` in your request. Example with JSON mode:
```json
{
  "name": "deepseek_ask",
  "arguments": {
    "query": "Analyze this code and return a JSON object with: issues_found (array of strings), complexity_score (number 1-10), and recommendations (array of strings)",
    "model": "deepseek-chat",
    "json_mode": true,
    "file_paths": ["main.go", "config.go"]
  }
}
```
This returns a well-formed JSON response that can be parsed directly by your application.
The server supports these command-line options to override environment variables:
```sh
# API Configuration overrides
./bin/deepseek-mcp -deepseek-model=deepseek-coder
./bin/deepseek-mcp -deepseek-system-prompt="Your custom prompt here"
./bin/deepseek-mcp -deepseek-temperature=0.8

# HTTP Transport flags
./bin/deepseek-mcp --http --http-port 8080 --http-host localhost
./bin/deepseek-mcp --http --http-port 9000 --http-host 0.0.0.0
```
When running in HTTP mode, the server provides these MCP endpoints:
| Endpoint | Method | Description | Status |
|---|---|---|---|
| `/mcp/initialize` | POST | Initialize MCP session | ✅ Tested |
| `/mcp/tools/list` | POST | List available tools | ✅ Tested |
| `/mcp/tools/call` | POST | Execute tool calls | ✅ Tested |
| `/mcp/resources/list` | GET | List resources | Available |
All endpoints use JSON-RPC 2.0 format and have been thoroughly tested with real DeepSeek API integration.
To run tests, linting, and formatting:

```sh
# Run the test suite
go test -v ./...

# Lint
golangci-lint run

# Format
gofmt -w .
```
Contributions are welcome! Please feel free to submit a Pull Request.

1. Create your feature branch (`git checkout -b feature/amazing-feature`)
2. Commit your changes (`git commit -m 'Add some amazing feature'`)
3. Push to the branch (`git push origin feature/amazing-feature`)
4. Open a Pull Request