Codebase MCP Server

Model Context Protocol server for secure and efficient codebase analysis.
Key Features • Supported Languages • Tools • QuickStart • Build • Contributing
🌟 Key Features
- Secure Access: Restricts file operations to a predefined root directory.
- Efficient File Management: Provides tools for reading and searching files.
- Detailed Metadata: Retrieves comprehensive file metadata including size, creation time, last modified time, permissions, and type.
- Dependency Analysis: Traverses and analyzes dependency trees within projects.
📋 Supported Languages
- JavaScript/TypeScript
- CSS / CSS Preprocessors
🛠️ Tools
- get-project-basics: Retrieves essential project information, including package.json details and directory structure.
- search-config-files: Searches for configuration files within the root directory and returns their paths.
- get-dependency-tree: Traverses the dependency tree from a given file path and root directory, and returns the traversal results.
- list-directory: Lists the contents of a specified directory, distinguishing between files and directories.
- read-file-with-metadata: Reads the content of a specified file and retrieves its metadata.
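As an MCP server, these tools are invoked by the client over JSON-RPC 2.0 using the standard `tools/call` method. A sketch of a request for `read-file-with-metadata` might look like the following (the `path` argument name is an assumption, not confirmed by this server's schema):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "read-file-with-metadata",
    "arguments": {
      "path": "src/index.ts"
    }
  }
}
```

The server responds with the file content plus its metadata (size, timestamps, permissions, and type), provided the path resolves inside the configured root directory.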
🚀 QuickStart
Prerequisites
- Node.js v18 or later
- A codebase for the LLM to analyze
Installation
First, install the Codebase MCP server with your client. A typical configuration looks like this:
```json
{
  "mcpServers": {
    "Codebase": {
      "command": "npx",
      "args": ["codebase-mcp-server@latest", "/path/to/your/codebase"]
    }
  }
}
```
Or, you can install the Codebase MCP server with Docker.
After cloning this repository, build the Docker image:
```shell
docker build -t mcp/codebase .
```
And then add the following to your MCP servers file:
```json
{
  "mcpServers": {
    "Codebase": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "--mount", "type=bind,src=/path/to/your/codebase/dir,dst=/projects/path/to/your/codebase/dir,ro",
        "--mount", "type=bind,src=/path/to/some/file.txt,dst=/projects/path/to/some/file.txt",
        "mcp/codebase",
        "/projects"
      ]
    }
  }
}
```
Install in VS Code

You can install the Codebase MCP server using the VS Code CLI:

```shell
code --add-mcp '{"name":"Codebase","command":"npx","args":["codebase-mcp-server@latest"]}'
```

After installation, the Codebase MCP server will be available for use with your GitHub Copilot agent in VS Code.
Install in Cursor

Go to Cursor Settings -> MCP -> Add new MCP Server. Use the following configuration:

```json
{
  "mcpServers": {
    "Codebase": {
      "command": "npx",
      "args": ["codebase-mcp-server@latest"]
    }
  }
}
```
Install in Windsurf
Follow the Windsurf MCP documentation. Use the following configuration:
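The same npx-based entry shown for Cursor should apply here as well:

```json
{
  "mcpServers": {
    "Codebase": {
      "command": "npx",
      "args": ["codebase-mcp-server@latest"]
    }
  }
}
```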
Install in Claude Desktop
Follow the MCP installation guide and use the following configuration:
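In Claude Desktop this goes in the `claude_desktop_config.json` file; the entry itself is the same npx-based configuration used above:

```json
{
  "mcpServers": {
    "Codebase": {
      "command": "npx",
      "args": ["codebase-mcp-server@latest"]
    }
  }
}
```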
🔧 Build
Local Development Build
```shell
# Install dependencies
npm ci

# Build TypeScript
npm run build
```
Docker Build
```shell
# Build the Docker image
docker build -t mcp/codebase .

# Or with a specific tag
docker build -t mcp/codebase:latest .
```
🤝 Contributing
Contributions are welcome! Please read our Contributing Guide for details on our Code of conduct and the process for submitting pull requests.