# MCP Hangar

Production-grade infrastructure for the Model Context Protocol: a Kubernetes-native registry for managing multiple MCP servers with lazy loading, health monitoring, and RBAC.
MCP Hangar is a control plane for MCP servers. It manages MCP server lifecycle, parallel tool execution, security governance, and observability -- so you don't have to.
## Quick Start

30 seconds to working MCP servers:

```bash
curl -sSL https://mcp-hangar.io/install.sh | bash && mcp-hangar init -y && mcp-hangar serve
```

That's it. Filesystem, fetch, and memory MCP servers are now available to Claude.
### What just happened?

- Install - downloaded and installed `mcp-hangar` via pip/uv
- Init - created `~/.config/mcp-hangar/config.yaml` with starter MCP servers
- Serve - started the MCP server (stdio mode for Claude Desktop)
The `init -y` flag uses sensible defaults:
- Detects available runtimes (uvx preferred, npx fallback)
- Configures starter bundle: filesystem, fetch, memory
- Runs a smoke test to verify MCP servers start correctly
- Updates Claude Desktop config automatically
## Manual Setup

```bash
# 1. Install
pip install mcp-hangar
# or: uv pip install mcp-hangar

# 2. Initialize with wizard
mcp-hangar init

# 3. Start server
mcp-hangar serve
```
## HTTP Mode

```bash
# Start with HTTP transport and REST API
mcp-hangar serve --http --port 8000

# REST API: http://localhost:8000/api/
```
## What It Does

Parallel execution. Your AI agent calls 5 tools sequentially -- each takes 200ms, so that's 1 second of waiting. `hangar_call` runs them in parallel: 200ms total.
```python
hangar_call(calls=[
    {"mcp_server": "github", "tool": "search_repos", "arguments": {"query": "mcp"}},
    {"mcp_server": "slack", "tool": "post_message", "arguments": {"channel": "#dev"}},
    {"mcp_server": "internal-api", "tool": "get_status", "arguments": {}}
])
```
Single MCP tool call. Parallel execution. All results returned together.
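The latency arithmetic above can be demonstrated with plain `asyncio`. This is an illustration of why overlapping I/O wins, not Hangar's implementation; `fake_tool_call` is a hypothetical stand-in for a real MCP tool invocation:

```python
import asyncio
import time

CALLS = [("github", "search_repos"), ("slack", "post_message"),
         ("internal-api", "get_status")]

async def fake_tool_call(server: str, tool: str) -> str:
    """Stand-in for one MCP tool call with ~200ms of I/O latency."""
    await asyncio.sleep(0.2)
    return f"{server}.{tool}: ok"

async def sequential() -> float:
    """Await each call in turn: latencies add up."""
    start = time.perf_counter()
    for server, tool in CALLS:
        await fake_tool_call(server, tool)
    return time.perf_counter() - start

async def parallel() -> float:
    """Await all calls together: latencies overlap."""
    start = time.perf_counter()
    await asyncio.gather(*(fake_tool_call(s, t) for s, t in CALLS))
    return time.perf_counter() - start

seq_time = asyncio.run(sequential())  # ~0.6s for three 200ms calls
par_time = asyncio.run(parallel())    # ~0.2s total
print(f"sequential={seq_time:.2f}s parallel={par_time:.2f}s")
```

Three 200ms calls take roughly 600ms sequentially but about 200ms when gathered, which is the speedup `hangar_call` batches rely on.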
Lifecycle management. Lazy loading, health checks, automatic restart, graceful shutdown. MCP servers start on first use, stay warm while active, shut down after idle TTL.
Single-flight cold starts. When 10 parallel calls hit a cold MCP server, it initializes once -- not 10 times.
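The single-flight behavior can be sketched with an asyncio lock guarding one shared initialization task. This is a minimal illustration of the semantics, not Hangar's internal code:

```python
import asyncio

class SingleFlight:
    """Coalesce concurrent cold starts: the first caller kicks off
    initialization, every other caller awaits the same task."""

    def __init__(self):
        self._task = None
        self._lock = asyncio.Lock()
        self.init_count = 0  # how many times we actually initialized

    async def _initialize(self) -> str:
        self.init_count += 1
        await asyncio.sleep(0.05)  # simulated server startup cost
        return "ready"

    async def get(self) -> str:
        async with self._lock:
            if self._task is None:
                self._task = asyncio.create_task(self._initialize())
        return await self._task

async def demo() -> int:
    flight = SingleFlight()
    # Ten parallel callers hit the cold server at once...
    results = await asyncio.gather(*(flight.get() for _ in range(10)))
    assert all(r == "ready" for r in results)
    return flight.init_count  # ...but it initialized exactly once

init_count = asyncio.run(demo())
print(init_count)  # → 1
```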
Circuit breaker. One failing MCP server doesn't kill your batch. Automatic isolation and recovery.
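The isolation behavior looks roughly like this sketch, where the threshold mirrors the `max_consecutive_failures` config option (an illustration only, not Hangar's actual breaker, which also handles recovery):

```python
class CircuitBreaker:
    """Open after N consecutive failures so a dead server is skipped
    instead of failing every call in the batch."""

    def __init__(self, max_consecutive_failures: int = 3):
        self.threshold = max_consecutive_failures
        self.failures = 0

    @property
    def is_open(self) -> bool:
        return self.failures >= self.threshold

    def call(self, fn):
        if self.is_open:
            return {"error": "circuit open, call skipped"}
        try:
            result = fn()
            self.failures = 0  # a success closes the breaker again
            return {"ok": result}
        except Exception as exc:
            self.failures += 1
            return {"error": str(exc)}

def always_fails():
    raise RuntimeError("server crashed")

breaker = CircuitBreaker(max_consecutive_failures=3)
results = [breaker.call(always_fails) for _ in range(5)]
# Calls 1-3 fail and trip the breaker; calls 4-5 are skipped instantly.
print(results[-1])  # → {'error': 'circuit open, call skipped'}
```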
## Configuration

```yaml
mcp_servers:
  github:
    mode: subprocess
    command: [uvx, mcp-server-github]
    env:
      GITHUB_TOKEN: ${GITHUB_TOKEN}

  slack:
    mode: subprocess
    command: [uvx, mcp-server-slack]

  internal-api:
    mode: remote
    endpoint: "http://localhost:8080"

  custom-server:
    mode: docker
    image: my-registry/mcp-server:latest
    container:
      command: ["python", "-m", "custom_entrypoint"]
```
## Claude Desktop Integration

`mcp-hangar init` auto-configures Claude Desktop. For manual setup, add to your Claude Desktop config:

- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Linux: `~/.config/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`
```json
{
  "mcpServers": {
    "hangar": {
      "command": "mcp-hangar",
      "args": ["serve", "--config", "~/.config/mcp-hangar/config.yaml"]
    }
  }
}
```
Restart Claude Desktop. Done.
## Python API
For programmatic use (scripts, pipelines, custom integrations):
```python
from mcp_hangar import Hangar, HangarConfig

# Async
async with Hangar.from_config("config.yaml") as hangar:
    result = await hangar.invoke("math", "add", {"a": 1, "b": 2})

# Sync wrapper
from mcp_hangar import SyncHangar

with SyncHangar.from_config("config.yaml") as hangar:
    result = hangar.invoke("math", "add", {"a": 1, "b": 2})

# Programmatic config
config = (
    HangarConfig()
    .add_mcp_server("math", command=["python", "-m", "math_server"])
    .add_mcp_server("fetch", mode="docker", image="mcp/fetch:latest")
    .build()
)
hangar = Hangar(config)
```
## Security & Governance (1.0)
- Capability declaration. Declare what each MCP server can access (network, filesystem, environment). Violations are detected and reported.
- Behavioral profiling. Baseline MCP server behavior, detect deviations (new destinations, protocol drift, frequency anomalies). Learning and enforcing modes.
- Tool schema drift detection. Track tool schema changes across MCP server updates.
- Network connection monitoring. `/proc/net/tcp` parsing, plus Docker and Kubernetes monitors with audit events.
- RBAC. Role-based access control with tool-level policies. API key and JWT/OIDC authentication.
- Approval gate. Human-in-the-loop approval for sensitive tool calls.
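The schema-drift idea above can be sketched by diffing two versions of a tool's input schema. This is a simplified illustration of the concept, not Hangar's detector, and the sample schemas are hypothetical:

```python
def diff_tool_schema(old: dict, new: dict) -> dict:
    """Report added, removed, and changed input parameters between
    two versions of a tool's JSON-schema `properties` block."""
    old_props = old.get("properties", {})
    new_props = new.get("properties", {})
    return {
        "added": sorted(set(new_props) - set(old_props)),
        "removed": sorted(set(old_props) - set(new_props)),
        "changed": sorted(
            k for k in set(old_props) & set(new_props)
            if old_props[k] != new_props[k]
        ),
    }

v1 = {"properties": {"query": {"type": "string"},
                     "limit": {"type": "integer"}}}
v2 = {"properties": {"query": {"type": "string"},
                     "limit": {"type": "string"},   # type changed
                     "sort": {"type": "string"}}}   # new parameter
drift = diff_tool_schema(v1, v2)
print(drift)  # → {'added': ['sort'], 'removed': [], 'changed': ['limit']}
```

Any non-empty bucket after an MCP server update is a drift event worth surfacing in the audit trail.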
## Observability
- OpenTelemetry. Distributed tracing with W3C trace context propagation across MCP servers.
- Prometheus metrics. MCP server state, tool calls, health checks, circuit breaker, concurrency, batch execution.
- Grafana dashboards. Pre-built overview and per-MCP server deep dive dashboards.
- Structured logging. Correlation IDs across parallel calls. JSON log format for production.
- Audit trail. Event-sourced audit log with OTLP export for security-relevant events.
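Correlation across parallel calls can be illustrated with `contextvars`, which asyncio tasks inherit at creation time; this is a sketch of the logging pattern, not Hangar's logger:

```python
import asyncio
import contextvars
import json
import uuid

# One correlation ID shared by every parallel call in a batch.
correlation_id = contextvars.ContextVar("correlation_id", default="-")

log_lines: list[str] = []

def log(event: str, **fields) -> None:
    """Emit a JSON log line tagged with the current correlation ID."""
    log_lines.append(json.dumps(
        {"event": event, "correlation_id": correlation_id.get(), **fields}))

async def tool_call(server: str) -> None:
    log("tool_call", server=server)
    await asyncio.sleep(0)

async def batch() -> None:
    correlation_id.set(uuid.uuid4().hex)  # set once per batch
    # Each gathered task inherits the context, so both log lines
    # carry the same ID without passing it through every signature.
    await asyncio.gather(*(tool_call(s) for s in ("github", "slack")))

asyncio.run(batch())
ids = {json.loads(line)["correlation_id"] for line in log_lines}
print(len(ids))  # → 1: both parallel calls share one ID
```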
## Advanced Configuration

```yaml
mcp_servers:
  fast-mcp-server:
    mode: subprocess
    command: ["python", "fast.py"]
    idle_ttl_s: 300                 # Shut down after 5 min idle
    health_check_interval_s: 60     # Check health every minute
    max_consecutive_failures: 3     # Circuit breaker threshold
    max_concurrency: 5              # Per-MCP-server concurrency limit
    tools:
      deny_list: [delete_*]         # Tool access filtering

execution:
  max_concurrency: 50               # Global concurrency limit
  default_mcp_server_concurrency: 10
  truncation:
    enabled: true
    max_batch_size_bytes: 950000    # Under Claude's 1MB limit

config_reload:
  enabled: true                     # Live config reload via file watch
```
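The `deny_list` patterns above read naturally as shell-style globs; a minimal sketch of that matching semantics (an assumption about the intent, not Hangar's filter code) fits in a few lines:

```python
from fnmatch import fnmatch

def is_tool_allowed(tool: str, deny_list: list[str]) -> bool:
    """Glob-style deny filtering: a tool is allowed unless it
    matches any pattern in the deny list."""
    return not any(fnmatch(tool, pattern) for pattern in deny_list)

deny = ["delete_*", "drop_table"]
print(is_tool_allowed("delete_repo", deny))   # → False
print(is_tool_allowed("search_repos", deny))  # → True
```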
## Scales With You
- Home lab: 2 MCP servers, zero config complexity
- Team setup: Shared MCP servers, Docker containers, hot-reload
- Enterprise: 50+ MCP servers, behavioral profiling, RBAC, approval gates, Kubernetes operator
Same API. Same reliability. Different scale.
## Documentation
- Getting Started
- Configuration Reference
- REST API Guide
- Observability Setup
- Authentication & RBAC
- Cookbook
## License
Core (`src/`) is MIT licensed. Enterprise features (`enterprise/`) are BSL 1.1 licensed.
See `LICENSE` for MIT terms and `enterprise/LICENSE.BSL` for BSL terms.