A structured development workflow for LLM-based coding, including feature clarification, planning, phased development, and progress tracking.
A Model Context Protocol server that implements a structured development workflow for LLM-based coding.
This MCP server helps LLMs build features in an organized, clean, and safe manner by providing:
- `start_feature_clarification` - Begin the feature clarification process
- `provide_clarification` - Answer clarification questions about a feature
- `generate_prd` - Generate a Product Requirements Document and implementation plan
- `create_phase` - Create a development phase for a feature
- `add_task` - Add tasks to a development phase
- `update_phase_status` - Update the status of a phase
- `update_task_status` - Update the completion status of a task
- `get_next_phase_action` - Get guidance on what to do next
- `get_document_path` - Get the path of a generated document
- `save_document` - Save a document to a specific location
- `feature-planning` - A prompt template for planning feature development
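For example, an MCP client built with the Model Context Protocol TypeScript SDK can launch the server over stdio and enumerate these tools. This is a minimal sketch, assuming the server is started with `node` on the built entry point; the client name and version are arbitrary.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the built server as a child process and talk to it over stdio
const transport = new StdioClientTransport({
  command: "node",
  args: ["/path/to/vibe-coder-mcp/build/mcp-server.js"]
});

const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);

// Enumerate the tools listed above
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));
```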
The server includes a hybrid document storage system. Documents are stored in the `documents/{featureId}/` directory by default, with filenames based on document type:

- `documents/{featureId}/prd.md` - Product Requirements Document
- `documents/{featureId}/implementation-plan.md` - Implementation Plan

You can use the `save_document` tool to save documents to custom locations:
```json
{
  "featureId": "feature-123",
  "documentType": "prd",
  "filePath": "/custom/path/feature-123-prd.md"
}
```
To get the path of a document, use the `get_document_path` tool:
```json
{
  "featureId": "feature-123",
  "documentType": "prd"
}
```
This returns both the path and whether the document has been saved to disk.
Install dependencies:

```bash
npm install
```

Build the server:

```bash
npm run build
```

For development with auto-rebuild:

```bash
npm run watch
```
To use with compatible MCP clients such as Claude Desktop, add the server configuration to the client's config file:

- On MacOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- On Windows: `%APPDATA%/Claude/claude_desktop_config.json`
```json
{
  "mcpServers": {
    "vibe-coder-mcp": {
      "command": "/path/to/vibe-coder-mcp/build/mcp-server.js"
    }
  }
}
```
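If your client cannot execute the built script directly, a common alternative (assumed here, not a requirement stated by the server) is to launch it through `node` explicitly:

```json
{
  "mcpServers": {
    "vibe-coder-mcp": {
      "command": "node",
      "args": ["/path/to/vibe-coder-mcp/build/mcp-server.js"]
    }
  }
}
```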
Since MCP servers communicate over stdio, debugging can be challenging. We recommend using the MCP Inspector, which is available as a package script:
```bash
npm run inspector
```
The Inspector will provide a URL to access debugging tools in your browser.
This server is implemented using the high-level `McpServer` class from the Model Context Protocol TypeScript SDK, which simplifies the process of creating MCP servers by providing a clean API for defining resources, tools, and prompts.
```typescript
import { McpServer, ResourceTemplate } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";

// Create an MCP server
const server = new McpServer({
  name: "Vibe-Coder",
  version: "0.3.0"
});

// Add a resource
server.resource(
  "features-list",
  "features://list",
  async (uri) => ({ /* ... */ })
);

// Add a tool
server.tool(
  "start_feature_clarification",
  { /* parameters schema */ },
  async (params) => ({ /* ... */ })
);

// Add a prompt
server.prompt(
  "feature-planning",
  { /* parameters schema */ },
  (params) => ({ /* ... */ })
);

// Start the server
const transport = new StdioServerTransport();
await server.connect(transport);
```
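To make the schema placeholders above concrete, here is a hedged sketch of registering a tool with a Zod parameter schema, following the SDK's `server.tool(name, shape, handler)` pattern; the parameter names and response text are illustrative, not the server's actual schema.

```typescript
import { z } from "zod";

// Illustrative registration; the real tool's parameters and reply differ
server.tool(
  "start_feature_clarification",
  {
    featureName: z.string().describe("Short name for the feature"),
    description: z.string().optional().describe("Initial description of the feature")
  },
  async ({ featureName }) => ({
    content: [
      {
        type: "text",
        text: `Started clarification for "${featureName}". First question: who is the primary user?`
      }
    ]
  })
);
```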
The Vibe-Coder MCP server is designed to guide the development process through the following steps:

1. Feature clarification - gather requirements by asking targeted questions
2. Planning - generate a Product Requirements Document and implementation plan
3. Phased development - break the work into phases and tasks
4. Progress tracking - update phase and task status and get guidance on the next action
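As a rough sketch of how those steps map onto the tools (reusing the `client` from the earlier example; every `featureId`, `phaseId`, `taskId`, and argument name below is illustrative, not the server's actual schema):

```typescript
// 1. Feature clarification: open the dialogue and answer the server's questions
await client.callTool({ name: "start_feature_clarification", arguments: { featureName: "user-authentication" } });
await client.callTool({ name: "provide_clarification", arguments: { featureId: "feature-123", answer: "Email/password login only" } });

// 2. Planning: produce the PRD and implementation plan
await client.callTool({ name: "generate_prd", arguments: { featureId: "feature-123" } });

// 3. Phased development: define phases and tasks
await client.callTool({ name: "create_phase", arguments: { featureId: "feature-123", name: "Backend API" } });
await client.callTool({ name: "add_task", arguments: { featureId: "feature-123", phaseId: "phase-1", description: "Add login endpoint" } });

// 4. Progress tracking: mark work done and ask what to do next
await client.callTool({ name: "update_task_status", arguments: { featureId: "feature-123", taskId: "task-1", completed: true } });
await client.callTool({ name: "get_next_phase_action", arguments: { featureId: "feature-123" } });
```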