CodeAlive MCP
Provides semantic code search and codebase interaction features via the CodeAlive API.
CodeAlive MCP: Deepest Context Engine for your projects (especially for large codebases)
Connect your AI assistant to CodeAlive's powerful code understanding platform in seconds!
This MCP (Model Context Protocol) server enables AI clients like Claude Code, Cursor, Claude Desktop, Continue, VS Code (GitHub Copilot), Cline, Codex, OpenCode, Qwen Code, Gemini CLI, Roo Code, Goose, Kilo Code, Windsurf, Kiro, Qoder, and Amazon Q Developer to access CodeAlive's advanced semantic code search and codebase interaction features.
What is CodeAlive?
The most accurate and comprehensive Context Engine as a service, optimized for large codebases, powered by advanced GraphRAG and accessible via MCP. It enriches the context for AI agents like Cursor, Claude Code, Codex, etc., making them 35% more efficient and up to 84% faster.
It's like Context7, but for your (large) codebases.
It allows AI-Coding Agents to:
- Find relevant code faster with semantic search
- Understand the bigger picture beyond isolated files
- Provide better answers with full project context
- Reduce costs and time by removing guesswork
🛠 Available Tools
Once connected, you'll have access to these powerful tools:
- get_data_sources - List your indexed repositories and workspaces
- codebase_search - Semantic code search across your indexed codebase (main/master branch)
- codebase_consultant - AI consultant with full project expertise
🎯 Usage Examples
After setup, try these commands with your AI assistant:
- "Show me all available repositories" → Uses
get_data_sources - "Find authentication code in the user service" → Uses
codebase_search - "Explain how the payment flow works in this codebase" → Uses
codebase_consultant
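These tools can also be exercised programmatically from any MCP client, which is handy for scripting or debugging. Below is a minimal sketch, assuming the official MCP Python SDK (pip install mcp) and the hosted endpoint https://mcp.codealive.ai/api from the Quick Start; the codebase_search argument name is illustrative, so check the real schema via list_tools:

# Minimal sketch: call CodeAlive MCP tools over Streamable HTTP.
# Assumes the official MCP Python SDK ("pip install mcp"); argument names below are illustrative.
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

API_KEY = "YOUR_API_KEY_HERE"
SERVER_URL = "https://mcp.codealive.ai/api"

async def main() -> None:
    headers = {"Authorization": f"Bearer {API_KEY}"}
    async with streamablehttp_client(SERVER_URL, headers=headers) as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # List your indexed repositories and workspaces.
            sources = await session.call_tool("get_data_sources", {})
            print(sources.content)

            # Semantic search across the indexed codebase.
            results = await session.call_tool("codebase_search", {"query": "authentication flow"})
            print(results.content)

asyncio.run(main())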
Table of Contents
- Quick Start (Remote)
- AI Client Integrations
- Advanced: Local Development
- Community Plugins
- HTTP Deployment (Self-Hosted & Cloud)
- Available Tools
- Usage Examples
- Troubleshooting
- License
🚀 Quick Start (Remote)
The fastest way to get started - no installation required! Our remote MCP server at https://mcp.codealive.ai/api provides instant access to CodeAlive's capabilities.
Step 1: Get Your API Key
- Sign up at https://app.codealive.ai/
- Navigate to MCP & API
- Click "+ Create API Key"
- Copy your API key immediately - you won't see it again!
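Optional: you can sanity-check the key right away against the same REST endpoint used in the Troubleshooting section below. A minimal sketch, assuming the third-party requests package (any HTTP client works):

# Quick key check: a 200 response lists your data sources, a 401 means the key is invalid.
# Assumes the "requests" package (pip install requests).
import requests

API_KEY = "YOUR_API_KEY_HERE"

resp = requests.get(
    "https://app.codealive.ai/api/v1/data_sources",
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=30,
)
print(resp.status_code)
print(resp.text)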
Step 2: Choose Your AI Client
Select your preferred AI client below for instant setup:
🚀 Quick Start (Agentic Installation)
You may ask your AI agent to install the CodeAlive MCP server for you.
- Copy-Paste the following prompt into your AI agent (remember to insert your API key):
Here is CodeAlive API key: PASTE_YOUR_API_KEY_HERE
Add the CodeAlive MCP server by following the installation guide from the README at https://raw.githubusercontent.com/CodeAlive-AI/codealive-mcp/main/README.md
Find the section "AI Client Integrations" and locate your client (Claude Code, Cursor, Gemini CLI, etc.). Each client has specific setup instructions:
- For Gemini CLI: Use the one-command setup with `gemini mcp add`
- For Claude Code: Use `claude mcp add` with the --transport http flag
- For other clients: Follow the configuration snippets provided
Prefer the Remote HTTP option when available. If no API key is provided above, help me create a CodeAlive API key first.
Then allow execution.
- Restart your AI agent.
🤖 AI Client Integrations
Claude Code
Option 1: Remote HTTP (Recommended)
claude mcp add --transport http codealive https://mcp.codealive.ai/api --header "Authorization: Bearer YOUR_API_KEY_HERE"
Option 2: Docker (STDIO)
claude mcp add codealive-docker -- /usr/bin/docker run --rm -i -e CODEALIVE_API_KEY=YOUR_API_KEY_HERE ghcr.io/codealive-ai/codealive-mcp:v0.3.0
Replace YOUR_API_KEY_HERE with your actual API key.
Cursor
Option 1: Remote HTTP (Recommended)
- Open Cursor → Settings (Cmd+, or Ctrl+,)
- Navigate to "MCP" in the left panel
- Click "Add new MCP server"
- Paste this configuration:
{
"mcpServers": {
"codealive": {
"url": "https://mcp.codealive.ai/api",
"headers": {
"Authorization": "Bearer YOUR_API_KEY_HERE"
}
}
}
}
- Save and restart Cursor
Option 2: Docker (STDIO)
{
"mcpServers": {
"codealive": {
"command": "docker",
"args": [
"run", "--rm", "-i",
"-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
"ghcr.io/codealive-ai/codealive-mcp:v0.3.0"
]
}
}
}
Codex
OpenAI Codex CLI supports MCP via ~/.codex/config.toml.
~/.codex/config.toml (Docker stdio – recommended)
[mcp_servers.codealive]
command = "docker"
args = ["run", "--rm", "-i",
"-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
"ghcr.io/codealive-ai/codealive-mcp:v0.3.0"]
Experimental: Streamable HTTP (requires experimental_use_rmcp_client)
Note: Streamable HTTP support requires enabling the experimental Rust MCP client in your Codex configuration.
[mcp_servers.codealive]
url = "https://mcp.codealive.ai/api"
headers = { Authorization = "Bearer YOUR_API_KEY_HERE" }
Gemini CLI
One-command setup (complete):
gemini mcp add --transport http codealive https://mcp.codealive.ai/api --header "Authorization: Bearer YOUR_API_KEY_HERE"
Replace YOUR_API_KEY_HERE with your actual API key. That's it - no config files needed! 🎉
Continue
Option 1: Remote HTTP (Recommended)
- Create or edit .continue/config.yaml in your project (or ~/.continue/config.yaml)
- Add this configuration:
mcpServers:
  - name: CodeAlive
    type: streamable-http
    url: https://mcp.codealive.ai/api
    requestOptions:
      headers:
        Authorization: "Bearer YOUR_API_KEY_HERE"
- Restart VS Code
Option 2: Docker (STDIO)
mcpServers:
  - name: CodeAlive
    type: stdio
    command: docker
    args:
      - run
      - --rm
      - -i
      - -e
      - CODEALIVE_API_KEY=YOUR_API_KEY_HERE
      - ghcr.io/codealive-ai/codealive-mcp:v0.3.0
VS Code (GitHub Copilot)
Option 1: Remote HTTP (Recommended)
Note: VS Code supports both Streamable HTTP and SSE transports, with automatic fallback to SSE if Streamable HTTP fails.
- Open the Command Palette (Ctrl+Shift+P or Cmd+Shift+P)
- Run "MCP: Add Server"
- Choose "HTTP" server type
- Enter this configuration:
{
"servers": {
"codealive": {
"type": "http",
"url": "https://mcp.codealive.ai/api",
"headers": {
"Authorization": "Bearer YOUR_API_KEY_HERE"
}
}
}
}
- Restart VS Code
Option 2: Docker (STDIO)
Create .vscode/mcp.json in your workspace:
{
"servers": {
"codealive": {
"command": "docker",
"args": [
"run", "--rm", "-i",
"-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
"ghcr.io/codealive-ai/codealive-mcp:v0.3.0"
]
}
}
}
Claude Desktop
Note: Claude Desktop's remote MCP support requires OAuth authentication. Use the Docker option for Bearer token support.
Docker (STDIO)
- Edit your config file:
  - macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  - Windows: %APPDATA%\Claude\claude_desktop_config.json
- Add this configuration:
{
"mcpServers": {
"codealive": {
"command": "docker",
"args": [
"run", "--rm", "-i",
"-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
"ghcr.io/codealive-ai/codealive-mcp:v0.3.0"
]
}
}
}
- Restart Claude Desktop
Cline
Option 1: Remote HTTP (Recommended)
- Open Cline extension in VS Code
- Click the MCP Servers icon to configure
- Add this configuration to your MCP settings:
{
"mcpServers": {
"codealive": {
"url": "https://mcp.codealive.ai/api",
"headers": {
"Authorization": "Bearer YOUR_API_KEY_HERE"
}
}
}
}
- Save and restart VS Code
Option 2: Docker (STDIO)
{
"mcpServers": {
"codealive": {
"command": "docker",
"args": [
"run", "--rm", "-i",
"-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
"ghcr.io/codealive-ai/codealive-mcp:v0.3.0"
]
}
}
}
OpenCode
Add CodeAlive as a remote MCP server in your opencode.json.
{
"$schema": "https://opencode.ai/config.json",
"mcp": {
"codealive": {
"type": "remote",
"url": "https://mcp.codealive.ai/api",
"enabled": true,
"headers": {
"Authorization": "Bearer YOUR_API_KEY_HERE"
}
}
}
}
Qwen Code
Qwen Code supports MCP via mcpServers in its settings.json and multiple transports (stdio/SSE/streamable-http). Use streamable-http when available; otherwise use Docker (stdio).
~/.qwen/settings.json (Streamable HTTP)
{
"mcpServers": {
"codealive": {
"type": "streamable-http",
"url": "https://mcp.codealive.ai/api",
"requestOptions": {
"headers": {
"Authorization": "Bearer YOUR_API_KEY_HERE"
}
}
}
}
}
Fallback: Docker (stdio)
{
"mcpServers": {
"codealive": {
"type": "stdio",
"command": "docker",
"args": ["run", "--rm", "-i",
"-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
"ghcr.io/codealive-ai/codealive-mcp:v0.3.0"]
}
}
}
Roo Code
Roo Code reads a JSON settings file similar to Cline's.
Global config: mcp_settings.json (Roo) or cline_mcp_settings.json (Cline-style)
Option A — Remote HTTP
{
"mcpServers": {
"codealive": {
"type": "streamable-http",
"url": "https://mcp.codealive.ai/api",
"headers": {
"Authorization": "Bearer YOUR_API_KEY_HERE"
}
}
}
}
Option B — Docker (STDIO)
{
"mcpServers": {
"codealive": {
"type": "stdio",
"command": "docker",
"args": [
"run", "--rm", "-i",
"-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
"ghcr.io/codealive-ai/codealive-mcp:v0.3.0"
]
}
}
}
Tip: If your Roo build doesn't honor HTTP headers, use the Docker/STDIO option.
Goose
UI path: Settings → MCP Servers → Add → choose Streamable HTTP
Streamable HTTP configuration:
- Name: codealive
- Endpoint URL: https://mcp.codealive.ai/api
- Headers: Authorization: Bearer YOUR_API_KEY_HERE
Docker (STDIO) alternative:
Add a STDIO extension with:
- Command: docker
- Args: run --rm -i -e CODEALIVE_API_KEY=YOUR_API_KEY_HERE ghcr.io/codealive-ai/codealive-mcp:v0.3.0
Kilo Code
UI path: Manage → Integrations → Model Context Protocol (MCP) → Add Server
HTTP
{
"mcpServers": {
"codealive": {
"type": "streamable-http",
"url": "https://mcp.codealive.ai/api",
"headers": {
"Authorization": "Bearer YOUR_API_KEY_HERE"
}
}
}
}
STDIO (Docker)
{
"mcpServers": {
"codealive": {
"type": "stdio",
"command": "docker",
"args": [
"run", "--rm", "-i",
"-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
"ghcr.io/codealive-ai/codealive-mcp:v0.3.0"
]
}
}
}
Windsurf
File: ~/.codeium/windsurf/mcp_config.json
{
"mcpServers": {
"codealive": {
"type": "streamable-http",
"serverUrl": "https://mcp.codealive.ai/api",
"headers": {
"Authorization": "Bearer YOUR_API_KEY_HERE"
}
}
}
}
Kiro
Note: Kiro does not yet support remote MCP servers natively. Use the mcp-remote workaround to connect to remote HTTP servers.
Prerequisites:
npm install -g mcp-remote
UI path: Settings → MCP → Add Server
Global file: ~/.kiro/settings/mcp.json
Workspace file: .kiro/settings/mcp.json
Remote HTTP (via mcp-remote workaround)
{
"mcpServers": {
"codealive": {
"type": "stdio",
"command": "npx",
"args": [
"mcp-remote",
"https://mcp.codealive.ai/api",
"--header",
"Authorization: Bearer ${CODEALIVE_API_KEY}"
],
"env": {
"CODEALIVE_API_KEY": "YOUR_API_KEY_HERE"
}
}
}
}
Docker (STDIO)
{
"mcpServers": {
"codealive": {
"type": "stdio",
"command": "docker",
"args": [
"run", "--rm", "-i",
"-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
"ghcr.io/codealive-ai/codealive-mcp:v0.3.0"
]
}
}
}
Qoder
UI path: User icon → Qoder Settings → MCP → My Servers → + Add (Agent mode)
SSE (remote HTTP)
{
"mcpServers": {
"codealive": {
"type": "sse",
"url": "https://mcp.codealive.ai/api",
"headers": {
"Authorization": "Bearer YOUR_API_KEY_HERE"
}
}
}
}
STDIO (Docker)
{
"mcpServers": {
"codealive": {
"type": "stdio",
"command": "docker",
"args": [
"run", "--rm", "-i",
"-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
"ghcr.io/codealive-ai/codealive-mcp:v0.3.0"
]
}
}
}
Amazon Q Developer
Q Developer CLI
Config file: ~/.aws/amazonq/mcp.json or workspace .amazonq/mcp.json
HTTP server
{
"mcpServers": {
"codealive": {
"type": "http",
"url": "https://mcp.codealive.ai/api",
"headers": {
"Authorization": "Bearer YOUR_API_KEY_HERE"
}
}
}
}
STDIO (Docker)
{
"mcpServers": {
"codealive": {
"type": "stdio",
"command": "docker",
"args": [
"run", "--rm", "-i",
"-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
"ghcr.io/codealive-ai/codealive-mcp:v0.3.0"
]
}
}
}
Q Developer IDE (VS Code / JetBrains)
Global: ~/.aws/amazonq/agents/default.json
Local (workspace): .aws/amazonq/agents/default.json
Minimal entry (HTTP):
{
"mcpServers": {
"codealive": {
"type": "http",
"url": "https://mcp.codealive.ai/api",
"headers": {
"Authorization": "Bearer YOUR_API_KEY_HERE"
},
"timeout": 60000
}
}
}
Use the IDE UI: Q panel → Chat → tools icon → Add MCP Server → choose http or stdio.
JetBrains AI Assistant
Note: JetBrains AI Assistant requires the mcp-remote workaround for connecting to remote HTTP MCP servers.
Prerequisites:
npm install -g mcp-remote
Configuration: Settings/Preferences → AI Assistant → Model Context Protocol → Configure
Add this configuration:
{
"mcpServers": {
"codealive": {
"command": "npx",
"args": [
"mcp-remote",
"https://mcp.codealive.ai/api",
"--header",
"Authorization: Bearer ${CODEALIVE_API_KEY}"
],
"env": {
"CODEALIVE_API_KEY": "YOUR_API_KEY_HERE"
}
}
}
}
For self-hosted deployments, replace the URL:
{
"mcpServers": {
"codealive": {
"command": "npx",
"args": [
"mcp-remote",
"http://your-server:8000/api",
"--header",
"Authorization: Bearer ${CODEALIVE_API_KEY}"
],
"env": {
"CODEALIVE_API_KEY": "YOUR_API_KEY_HERE"
}
}
}
}
See JetBrains MCP Documentation for more details.
🔧 Advanced: Local Development
For developers who want to customize or contribute to the MCP server.
Prerequisites
- Python 3.11+
- uv (recommended) or pip
Installation
# Clone the repository
git clone https://github.com/CodeAlive-AI/codealive-mcp.git
cd codealive-mcp
# Setup with uv (recommended)
uv venv
source .venv/bin/activate # Windows: .venv\Scripts\activate
uv pip install -e .
# Or setup with pip
python -m venv .venv
source .venv/bin/activate # Windows: .venv\Scripts\activate
pip install -e .
Local Server Configuration
Once installed locally, configure your AI client to use the local server:
Claude Code (Local)
claude mcp add codealive-local /path/to/codealive-mcp/.venv/bin/python /path/to/codealive-mcp/src/codealive_mcp_server.py --env CODEALIVE_API_KEY=YOUR_API_KEY_HERE
Other Clients (Local)
Replace the Docker command and args with:
{
"command": "/path/to/codealive-mcp/.venv/bin/python",
"args": ["/path/to/codealive-mcp/src/codealive_mcp_server.py"],
"env": {
"CODEALIVE_API_KEY": "YOUR_API_KEY_HERE"
}
}
Running HTTP Server Locally
# Start local HTTP server
export CODEALIVE_API_KEY="your_api_key_here"
python src/codealive_mcp_server.py --transport http --host localhost --port 8000
# Test health endpoint
curl http://localhost:8000/health
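Beyond the health check, you can confirm the MCP endpoint itself responds by listing its tools. A minimal sketch, assuming the official MCP Python SDK (pip install mcp) and the /api path used elsewhere in this README:

# Minimal sketch: list tools exposed by the locally running CodeAlive MCP HTTP server.
# Assumes the official MCP Python SDK ("pip install mcp") and that the server listens on /api.
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

async def main() -> None:
    headers = {"Authorization": "Bearer YOUR_API_KEY_HERE"}
    async with streamablehttp_client("http://localhost:8000/api", headers=headers) as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            # Expect: get_data_sources, codebase_search, codebase_consultant
            print([tool.name for tool in tools.tools])

asyncio.run(main())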
Smithery Installation
Auto-install for Claude Desktop via Smithery:
npx -y @smithery/cli install @CodeAlive-AI/codealive-mcp --client claude
🌐 Community Plugins
Gemini CLI — CodeAlive Extension
Repo: https://github.com/akolotov/gemini-cli-codealive-extension
Gemini CLI extension that wires CodeAlive into your terminal with prebuilt slash commands and MCP config. It includes:
- GEMINI.md guidance so Gemini knows how to use CodeAlive tools effectively
- Slash commands: /codealive:chat, /codealive:find, /codealive:search
- Easy setup via Gemini CLI's extension system
Install
gemini extensions install https://github.com/akolotov/gemini-cli-codealive-extension
Configure
# Option 1: .env next to where you run `gemini`
CODEALIVE_API_KEY="your_codealive_api_key_here"
# Option 2: environment variable
export CODEALIVE_API_KEY="your_codealive_api_key_here"
gemini
🚢 HTTP Deployment (Self-Hosted & Cloud)
Deploy the MCP server as an HTTP service for team-wide access or integration with self-hosted CodeAlive instances.
Deployment Options
The CodeAlive MCP server can be deployed as an HTTP service using Docker. This allows multiple AI clients to connect to a single shared instance, and enables integration with self-hosted CodeAlive deployments.
Docker Compose (Recommended)
Create a docker-compose.yml file based on our example:
# Download the example
curl -O https://raw.githubusercontent.com/CodeAlive-AI/codealive-mcp/main/docker-compose.example.yml
mv docker-compose.example.yml docker-compose.yml
# Edit configuration (see below)
nano docker-compose.yml
# Start the service
docker compose up -d
# Check health
curl http://localhost:8000/health
Configuration Options:
- For CodeAlive Cloud (default):
  - Remove the CODEALIVE_BASE_URL environment variable (the default https://app.codealive.ai is used)
  - Clients must provide their API key via the Authorization: Bearer YOUR_KEY header
- For Self-Hosted CodeAlive:
  - Set CODEALIVE_BASE_URL to your CodeAlive instance URL (e.g., https://codealive.yourcompany.com)
  - Clients must provide their API key via the Authorization: Bearer YOUR_KEY header
See docker-compose.example.yml for the complete configuration template.
Connecting AI Clients to Your Deployed Instance
Once deployed, configure your AI clients to use your HTTP endpoint:
Claude Code:
claude mcp add --transport http codealive http://your-server:8000/api --header "Authorization: Bearer YOUR_API_KEY_HERE"
VS Code:
code --add-mcp "{\"name\":\"codealive\",\"type\":\"http\",\"url\":\"http://your-server:8000/api\",\"headers\":{\"Authorization\":\"Bearer YOUR_API_KEY_HERE\"}}"
Cursor / Other Clients:
{
"mcpServers": {
"codealive": {
"url": "http://your-server:8000/api",
"headers": {
"Authorization": "Bearer YOUR_API_KEY_HERE"
}
}
}
}
Replace your-server:8000 with your actual deployment URL and port.
🐞 Troubleshooting
Quick Diagnostics
- Test the hosted service: curl https://mcp.codealive.ai/health
- Check your API key: curl -H "Authorization: Bearer YOUR_API_KEY" https://app.codealive.ai/api/v1/data_sources
- Enable debug logging: add --debug to the local server args
Common Issues
- "Connection refused" → Check internet connection
- "401 Unauthorized" → Verify your API key
- "No repositories found" → Check API key permissions in CodeAlive dashboard
- Client-specific logs → See your AI client's documentation for MCP logs
Getting Help
- 📧 Email: support@codealive.ai
- 🐛 Issues: GitHub Issues
📄 License
MIT License - see LICENSE file for details.
Ready to supercharge your AI assistant with deep code understanding?
Get started now →