AI Development Assistant MCP Server
An AI assistant for development tasks, including taking screenshots, architecting solutions, and performing code reviews.
Welcome to your AI-powered development toolkit, designed as a Model Context Protocol (MCP) server for Cursor! This project provides intelligent coding assistance through custom AI tools. Note that this is mostly a tutorial demo, and not a production-ready tool.
✨ Features
🎨 Code Architect
Call advanced reasoning LLMs to generate plans and instructions for coding agents.
📸 Screenshot Buddy
Take UI design screenshots and use them with the composer agent.
🔍 Code Review
Use git diffs to trigger code reviews.
📄 Read file & Read multiple files
Read a single file for quick inspection, or read multiple files at once for bulk analysis.
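Each of the features above is exposed to the editor as an MCP tool with a name, a description, and a JSON-schema input. As a rough sketch of that shape (the field layout follows the MCP tool-listing convention; the tool name and schema here are illustrative, not the project's actual code):

```typescript
// Illustrative shape of an MCP tool definition; this is a sketch,
// not this project's actual source.
interface ToolDefinition {
  name: string;
  description: string;
  inputSchema: {
    type: "object";
    properties: Record<string, unknown>;
    required?: string[];
  };
}

// Hypothetical definition for the screenshot tool
const screenshotTool: ToolDefinition = {
  name: "take_screenshot",
  description: "Capture a screenshot of a running UI for design analysis",
  inputSchema: {
    type: "object",
    properties: {
      url: { type: "string", description: "Page to capture" },
    },
    required: ["url"],
  },
};
```

The agent discovers tools by listing these definitions, which is why clear names and descriptions matter: they are what the model matches your request against.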
🚀 Getting Started
1. Environment Setup
First, you'll need to set up your environment variables. Create a file at src/env/keys.ts:
export const OPENAI_API_KEY = "your_key_here";
// Add any other keys you need
⚠️ Security Note: Storing API keys directly in source code is not recommended for production environments. This is only for local development and learning purposes. You can set the env var inline in the Cursor MCP interface as well.
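If you would rather not hardcode the key at all, `keys.ts` can read it from the environment instead. A minimal sketch, assuming Node.js (the `getOpenAiKey` helper is illustrative, not part of the project):

```typescript
// Hypothetical alternative to hardcoding the key in src/env/keys.ts:
// read it from the environment and fail fast if it is missing.
function getOpenAiKey(): string {
  const key = process.env.OPENAI_API_KEY;
  if (!key) {
    // Failing at startup surfaces a missing key immediately,
    // rather than on the first API call.
    throw new Error(
      "OPENAI_API_KEY is not set; export it or add it to your MCP config"
    );
  }
  return key;
}
```

Failing fast at startup makes a missing key obvious the moment the server launches, instead of surfacing as a confusing error mid-conversation.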
2. Installation
npm install
# or
yarn install
3. Build the Server
npm run build
4. Configure the MCP Server in Cursor
This project is designed to be used as an MCP server in Cursor. Here's how to set it up:
- Open Cursor on your system.
- Navigate to the Chat section.
- Click + Configure MCP to add a new MCP server.
- Add the following JSON configuration:
{
"mcpServers": {
"mcp-server": {
"command": "node",
"args": [
"D:\\mpc-server\\build\\index.js"
]
}
}
}
📘 Pro Tip: You might need to use the full path to your project's built index.js file.
After adding the server, you should see your tools listed under "Available Tools". If not, try clicking the refresh button in the top right corner of the MCP server section.
For more details about MCP setup, check out the Cursor MCP documentation.
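If you want to avoid keeping keys in `keys.ts`, MCP server entries also accept an `env` block, so the key can be passed through the environment instead. A sketch of the same configuration with that block added (the key value is a placeholder):

```json
{
  "mcpServers": {
    "mcp-server": {
      "command": "node",
      "args": [
        "D:\\mpc-server\\build\\index.js"
      ],
      "env": {
        "OPENAI_API_KEY": "your_key_here"
      }
    }
  }
}
```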
🛠️ Using the Tools
Once configured, you can use these tools directly in Cursor's Composer. The AI will automatically suggest using relevant tools, or you can explicitly request them by name or description.
For example, try typing in Composer:
- "Review this code for best practices"
- "Help me architect a new feature"
- "Analyze this UI screenshot"
- "Read the contents of these files"
The agent will ask for your approval before making any tool calls.
📘 Pro Tip: You can update your .cursorrules file with instructions on how to use the tools for certain scenarios, and the agent will use the tools automatically.
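For example, a `.cursorrules` entry along these lines can steer the agent toward the tools automatically (the wording is illustrative):

```
When the user shares a git diff, run the code review tool before suggesting changes.
When the user asks to plan a new feature, call the architect tool first and base
your implementation steps on its output.
```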
📁 Project Structure
src/
├── tools/
│ ├── architect.ts # Code structure generator
│ ├── screenshot.ts # Screenshot analysis tool
│ ├── fileReader.ts # read file & read multiple files tool
│ └── codeReview.ts # Code review tool
├── env/
│ └── keys.ts # Environment configuration (add your API keys here!)
└── index.ts # Main entry point