MCP Hangar
Production-grade infrastructure for Model Context Protocol.
MCP Hangar is a control plane for MCP servers. It manages MCP server lifecycle, parallel tool execution, security governance, and observability -- so you don't have to.
Quick Start
30 seconds to working MCP servers:
curl -sSL https://mcp-hangar.io/install.sh | bash && mcp-hangar init -y && mcp-hangar serve
That's it. Filesystem, fetch, and memory MCP servers are now available to Claude.
What just happened?
- Install - Downloaded and installed mcp-hangar via pip/uv
- Init - Created ~/.config/mcp-hangar/config.yaml with starter MCP servers
- Serve - Started the MCP server (stdio mode for Claude Desktop)
The init -y flag uses sensible defaults:
- Detects available runtimes (uvx preferred, npx fallback)
- Configures starter bundle: filesystem, fetch, memory
- Runs a smoke test to verify MCP servers start correctly
- Updates Claude Desktop config automatically
Manual Setup
# 1. Install
pip install mcp-hangar
# or: uv pip install mcp-hangar
# 2. Initialize with wizard
mcp-hangar init
# 3. Start server
mcp-hangar serve
HTTP Mode
# Start with HTTP transport and REST API
mcp-hangar serve --http --port 8000
# REST API: http://localhost:8000/api/
What It Does
Parallel execution. Your AI agent calls 5 tools sequentially -- each takes 200ms, that's 1 second of waiting. hangar_call runs them in parallel. 200ms total.
hangar_call(calls=[
{"mcp_server": "github", "tool": "search_repos", "arguments": {"query": "mcp"}},
{"mcp_server": "slack", "tool": "post_message", "arguments": {"channel": "#dev"}},
{"mcp_server": "internal-api", "tool": "get_status", "arguments": {}}
])
Single MCP tool call. Parallel execution. All results returned together.
Lifecycle management. Lazy loading, health checks, automatic restart, graceful shutdown. MCP servers start on first use, stay warm while active, shut down after idle TTL.
Single-flight cold starts. When 10 parallel calls hit a cold MCP server, it initializes once -- not 10 times.
Circuit breaker. One failing MCP server doesn't kill your batch. Automatic isolation and recovery.
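The single-flight behavior above can be sketched with a per-server async lock: concurrent first calls wait on one initializer instead of each spawning the server. This is a minimal illustration of the pattern, not Hangar's actual implementation; all names here are hypothetical.

```python
import asyncio
from collections import defaultdict

class SingleFlightStarter:
    """Ensure each cold MCP server initializes exactly once,
    even when many parallel calls arrive before it is warm."""

    def __init__(self):
        self._locks = defaultdict(asyncio.Lock)
        self._started = {}      # server name -> ready handle
        self.start_count = 0    # how many real cold starts happened

    async def _cold_start(self, name):
        self.start_count += 1
        await asyncio.sleep(0.01)   # stand-in for a slow process spawn
        return f"{name}-ready"

    async def get(self, name):
        if name in self._started:            # warm path: no lock needed
            return self._started[name]
        async with self._locks[name]:        # cold path: serialize starters
            if name not in self._started:    # re-check after acquiring
                self._started[name] = await self._cold_start(name)
        return self._started[name]

async def main():
    starter = SingleFlightStarter()
    # 10 parallel calls hit the same cold server...
    results = await asyncio.gather(*(starter.get("github") for _ in range(10)))
    # ...but it initialized exactly once.
    print(starter.start_count, results[0])

asyncio.run(main())
```

The double check inside the lock matters: callers that queued behind the first starter find the server already warm and return immediately.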
Configuration
mcp_servers:
github:
mode: subprocess
command: [uvx, mcp-server-github]
env:
GITHUB_TOKEN: ${GITHUB_TOKEN}
slack:
mode: subprocess
command: [uvx, mcp-server-slack]
internal-api:
mode: remote
endpoint: "http://localhost:8080"
custom-server:
mode: docker
image: my-registry/mcp-server:latest
container:
command: ["python", "-m", "custom_entrypoint"]
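The ${GITHUB_TOKEN} form above is standard environment-variable interpolation: the placeholder is resolved from the host environment at load time. A minimal sketch of how such placeholders can be expanded (illustrative only, not Hangar's config loader):

```python
import os
import re

_PLACEHOLDER = re.compile(r"\$\{([A-Za-z_][A-Za-z0-9_]*)\}")

def expand_env(value: str) -> str:
    """Replace ${NAME} placeholders with values from the environment.
    Raises KeyError for unset variables rather than passing them through."""
    return _PLACEHOLDER.sub(lambda m: os.environ[m.group(1)], value)

os.environ["GITHUB_TOKEN"] = "ghp_example"
print(expand_env("token=${GITHUB_TOKEN}"))   # token=ghp_example
```

Failing loudly on an unset variable is usually preferable to silently sending an empty token to an upstream API.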
Claude Desktop Integration
mcp-hangar init auto-configures Claude Desktop. For manual setup, add to your Claude Desktop config:
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Linux: ~/.config/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json
{
"mcpServers": {
"hangar": {
"command": "mcp-hangar",
"args": ["serve", "--config", "~/.config/mcp-hangar/config.yaml"]
}
}
}
Restart Claude Desktop. Done.
Python API
For programmatic use (scripts, pipelines, custom integrations):
from mcp_hangar import Hangar, HangarConfig
# Async
async with Hangar.from_config("config.yaml") as hangar:
result = await hangar.invoke("math", "add", {"a": 1, "b": 2})
# Sync wrapper
from mcp_hangar import SyncHangar
with SyncHangar.from_config("config.yaml") as hangar:
result = hangar.invoke("math", "add", {"a": 1, "b": 2})
# Programmatic config
config = (
HangarConfig()
.add_mcp_server("math", command=["python", "-m", "math_server"])
.add_mcp_server("fetch", mode="docker", image="mcp/fetch:latest")
.build()
)
hangar = Hangar(config)
Security & Governance (1.0)
- Capability declaration. Declare what each MCP server can access (network, filesystem, environment). Violations are detected and reported.
- Behavioral profiling. Baseline MCP server behavior, detect deviations (new destinations, protocol drift, frequency anomalies). Learning and enforcing modes.
- Tool schema drift detection. Track tool schema changes across MCP server updates.
- Network connection monitoring. /proc/net/tcp parsing, Docker and Kubernetes monitors with audit events.
- RBAC. Role-based access control with tool-level policies. API key and JWT/OIDC authentication.
- Approval gate. Human-in-the-loop approval for sensitive tool calls.
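At its core, capability checking compares what a server actually accessed against what it declared. A toy sketch of that comparison (the capability strings and function shape are assumptions for illustration, not Hangar's data model):

```python
def capability_violations(declared: set[str], observed: set[str]) -> set[str]:
    """Return the capabilities a server used but never declared."""
    return observed - declared

declared = {"network:api.github.com", "filesystem:/workspace"}
observed = {"network:api.github.com", "network:evil.example"}
print(sorted(capability_violations(declared, observed)))
# ['network:evil.example']
```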
Observability
- OpenTelemetry. Distributed tracing with W3C trace context propagation across MCP servers.
- Prometheus metrics. MCP server state, tool calls, health checks, circuit breaker, concurrency, batch execution.
- Grafana dashboards. Pre-built overview and per-MCP server deep dive dashboards.
- Structured logging. Correlation IDs across parallel calls. JSON log format for production.
- Audit trail. Event-sourced audit log with OTLP export for security-relevant events.
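Correlation IDs across parallel calls are typically carried in a context variable and stamped onto every JSON log record. A minimal sketch of that pattern (the field names are assumptions, not Hangar's log schema):

```python
import contextvars
import json
import logging
import uuid

correlation_id = contextvars.ContextVar("correlation_id", default="-")

class JsonFormatter(logging.Formatter):
    """Emit each record as a JSON object carrying the current correlation ID."""
    def format(self, record):
        return json.dumps({
            "level": record.levelname,
            "msg": record.getMessage(),
            "correlation_id": correlation_id.get(),
        })

handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
log = logging.getLogger("hangar-demo")
log.addHandler(handler)
log.setLevel(logging.INFO)

correlation_id.set(str(uuid.uuid4()))
log.info("tool call dispatched")   # every record in this context shares the ID
```

Because the ID lives in a ContextVar, each async task in a parallel batch can carry its own value without threads or tasks interfering.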
Advanced Configuration
mcp_servers:
fast-mcp-server:
mode: subprocess
command: ["python", "fast.py"]
idle_ttl_s: 300 # Shutdown after 5min idle
health_check_interval_s: 60 # Check health every minute
max_consecutive_failures: 3 # Circuit breaker threshold
max_concurrency: 5 # Per-MCP server concurrency limit
tools:
deny_list: [delete_*] # Tool access filtering
execution:
max_concurrency: 50 # Global concurrency limit
default_mcp_server_concurrency: 10
truncation:
enabled: true
max_batch_size_bytes: 950000 # Under Claude's 1MB limit
config_reload:
enabled: true # Live config reload via file watch
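The deny_list entries above are glob-style patterns, so delete_* blocks every tool whose name starts with delete_. A sketch of how such a filter can be evaluated, using Python's fnmatch (illustrative, not Hangar's implementation):

```python
from fnmatch import fnmatch

def tool_allowed(tool: str, deny_list: list[str]) -> bool:
    """A tool is allowed unless its name matches any deny pattern."""
    return not any(fnmatch(tool, pattern) for pattern in deny_list)

deny = ["delete_*"]
print(tool_allowed("search_repos", deny))  # True
print(tool_allowed("delete_repo", deny))   # False
```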
Scales With You
- Home lab: 2 MCP servers, zero config complexity
- Team setup: Shared MCP servers, Docker containers, hot-reload
- Enterprise: 50+ MCP servers, behavioral profiling, RBAC, approval gates, Kubernetes operator
Same API. Same reliability. Different scale.
Documentation
- Getting Started
- Configuration Reference
- REST API Guide
- Observability Setup
- Authentication & RBAC
- Cookbook
License
Core (src/) is MIT licensed. Enterprise features (enterprise/) are BSL 1.1 licensed.
See LICENSE for MIT terms and enterprise/LICENSE.BSL for BSL terms.