A comprehensive Python client implementation for the Model Context Protocol (MCP) - the open standard for connecting AI assistants to external data and tools.
chuk-mcp is a complete Model Context Protocol (MCP) implementation providing both client and server capabilities with a modern, layered architecture. It supports multiple transport protocols, maintains backward compatibility, and implements cutting-edge features including browser-native operation and structured tool outputs.
The Model Context Protocol (MCP) is an open standard that enables AI applications to securely access external data and tools. Instead of every AI app building custom integrations, MCP provides a universal interface for reading resources, executing tools, and using prompt templates.
Key Benefits:
**chuk-mcp** is a production-ready Python implementation that provides:
- ✅ **Comprehensive MCP Protocol Support** - Core features including tools, resources, and prompts, plus advanced features like sampling and completion
- ✅ **Browser-Native Operation** - First-of-its-kind Pyodide/WebAssembly compatibility
- ✅ **Type Safety** - Full type annotations with optional Pydantic integration and graceful fallback
- ✅ **Robust Error Handling** - Automatic retries, connection recovery, and detailed error reporting
- ✅ **Multi-Transport Architecture** - stdio, HTTP, and SSE with an extensible interface
- ✅ **Version-Aware Features** - Automatic protocol negotiation and graceful degradation
- ✅ **Smart Fallback System** - Works with or without optional dependencies
- ✅ **Production Ready** - Battle-tested with proper logging, monitoring, and performance optimization
- ✅ **UV Optimized** - First-class support for modern Python packaging with UV
```
┌───────────────────────────┐
│  CLI & Demo Layer         │  ←  __main__.py, demos
├───────────────────────────┤
│  Client/Server API        │  ←  High-level abstractions
├───────────────────────────┤
│  Protocol Layer           │  ←  Messages, types, features
├───────────────────────────┤
│  Transport Layer          │  ←  stdio, HTTP, SSE
├───────────────────────────┤
│  Base Layer               │  ←  Pydantic fallback, config
└───────────────────────────┘
```
Benefits of This Architecture:
```
chuk_mcp/
├── protocol/                  # Shared protocol layer
│   ├── types/                 # Type definitions and validation
│   ├── messages/              # Feature-organized messaging
│   └── mcp_pydantic_base.py   # Type system foundation with fallback
├── transports/                # Transport implementations
│   ├── stdio/                 # Process-based communication
│   ├── http/                  # Modern streamable HTTP
│   └── sse/                   # Legacy Server-Sent Events
├── client/                    # High-level client API
└── server/                    # Server framework
```
UV is the fastest Python package manager. Choose your installation based on your needs:
```bash
# 🚀 Minimal installation (uses lightweight fallback validation)
uv add chuk-mcp

# 🔧 With Pydantic validation (recommended for production)
uv add chuk-mcp[pydantic]

# 🌟 Full features (Pydantic + HTTP transport + all extras)
uv add chuk-mcp[full]

# 🛠️ Development installation (includes testing and examples)
uv add chuk-mcp[dev]
```
```bash
# Using pip (if UV not available)
pip install chuk-mcp

# With Pydantic support
pip install chuk-mcp[pydantic]

# Full features
pip install chuk-mcp[full]
```
| Option | Dependencies | Use Case | Performance |
|---|---|---|---|
| `chuk-mcp` | Core only | Minimal deployments, testing | Fast startup, lightweight validation |
| `chuk-mcp[pydantic]` | + Pydantic | Production use, type safety | Enhanced validation, better errors |
| `chuk-mcp[full]` | + All features | Maximum functionality | Full feature set |
| `chuk-mcp[dev]` | + Dev tools | Development, testing | All tools included |
💡 **Performance Note**: The lightweight fallback validation is roughly 20x slower than Pydantic (about 0.010ms per operation, versus effectively zero in our benchmarks) but still fast enough for most use cases. Use `[pydantic]` for high-throughput applications.
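The per-operation cost is easy to sanity-check yourself. The sketch below times a hand-rolled validation function with `timeit`; the function is illustrative (not chuk-mcp's actual fallback code), but it shows the kind of pure-Python checking the fallback path performs.

```python
import timeit

def validate_message(msg: dict) -> dict:
    """Minimal hand-rolled check, similar in spirit to the fallback path."""
    if msg.get("jsonrpc") != "2.0":
        raise ValueError("invalid jsonrpc version")
    if "method" not in msg:
        raise ValueError("missing method")
    return msg

msg = {"jsonrpc": "2.0", "id": 1, "method": "ping"}

# Time 10,000 validations and report the per-operation cost in milliseconds
total = timeit.timeit(lambda: validate_message(msg), number=10_000)
print(f"{total / 10_000 * 1000:.4f} ms/op")
```

Even the pure-Python path stays well under a millisecond per message, which is why the fallback is acceptable outside high-throughput scenarios.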
```bash
# Quick test with UV
uv run python -c "import chuk_mcp; print('✅ chuk-mcp installed successfully')"

# Or test full functionality
uv run --with chuk-mcp[pydantic] python -c "
from chuk_mcp.protocol.mcp_pydantic_base import PYDANTIC_AVAILABLE
print(f'✅ Pydantic available: {PYDANTIC_AVAILABLE}')
"
```
`chuk-mcp` provides comprehensive compliance with the MCP specification across multiple protocol versions:
- `2025-06-18` (primary support)
- `2025-03-26` (full compatibility)
- `2024-11-05` (backward compatibility)

| Feature Category | 2024-11-05 | 2025-03-26 | 2025-06-18 | Implementation Status |
|---|---|---|---|---|
| **Core Operations** | | | | |
| Tools (list/call) | ✅ | ✅ | ✅ | ✅ Complete |
| Resources (list/read/subscribe) | ✅ | ✅ | ✅ | ✅ Complete |
| Prompts (list/get) | ✅ | ✅ | ✅ | ✅ Complete |
| **Transport** | | | | |
| Stdio | ✅ | ✅ | ✅ | ✅ Complete |
| SSE | ✅ | ⚠️ Deprecated | ❌ Removed | ✅ Legacy Support |
| HTTP Streaming | ❌ | ✅ | ✅ | ✅ Complete |
| **Advanced Features** | | | | |
| Sampling | ✅ | ✅ | ✅ | ✅ Complete |
| Completion | ✅ | ✅ | ✅ | ✅ Complete |
| Roots | ✅ | ✅ | ✅ | ✅ Complete |
| Elicitation | ❌ | ❌ | ✅ | ✅ Complete |
| **Quality Features** | | | | |
| Progress Tracking | ✅ | ✅ | ✅ | ✅ Complete |
| Cancellation | ✅ | ✅ | ✅ | ✅ Complete |
| Notifications | ✅ | ✅ | ✅ | ✅ Complete |
| Batching | ✅ | ✅ | ❌ Deprecated | ✅ Legacy Support |
```python
import anyio
from chuk_mcp import stdio_client, StdioServerParameters
from chuk_mcp.protocol.messages import send_initialize

async def main():
    # Demo with minimal echo server (no external dependencies)
    server_params = StdioServerParameters(
        command="python",
        args=["-c", """
import json, sys
init = json.loads(input())
response = {
    "jsonrpc": "2.0",
    "id": init["id"],
    "result": {
        "serverInfo": {"name": "Demo", "version": "1.0"},
        "protocolVersion": "2025-06-18",
        "capabilities": {}
    }
}
print(json.dumps(response))
"""]
    )

    async with stdio_client(server_params) as (read, write):
        result = await send_initialize(read, write)
        print(f"✅ Connected to {result.serverInfo.name}")

if __name__ == "__main__":
    anyio.run(main)
```
Run with UV:

```bash
uv run --with chuk-mcp[pydantic] python demo.py
```
```python
import anyio
from chuk_mcp import stdio_client, StdioServerParameters
from chuk_mcp.protocol.messages import send_initialize

async def main():
    # Configure connection to an MCP server
    server_params = StdioServerParameters(
        command="uvx",  # Use uvx to run Python tools
        args=["mcp-server-sqlite", "--db-path", "example.db"]
    )

    # Connect and initialize
    async with stdio_client(server_params) as (read_stream, write_stream):
        # Initialize the MCP session
        init_result = await send_initialize(read_stream, write_stream)

        if init_result:
            print(f"✅ Connected to {init_result.serverInfo.name}")
            print(f"📋 Protocol version: {init_result.protocolVersion}")
        else:
            print("❌ Failed to initialize connection")

anyio.run(main)
```
Test server connectivity instantly:
```bash
# Test with quickstart demo
uv run examples/quickstart.py

# Run comprehensive demos
uv run examples/e2e_smoke_test_example.py --demo all

# Test specific server configurations
uv run examples/e2e_smoke_test_example.py --smoke
```
```python
# Same API across all transports
async with stdio_client(stdio_params) as streams:
    response = await send_message(*streams, "ping")

async with http_client(http_params) as streams:
    response = await send_message(*streams, "ping")  # Same API!
```
Tools can now return both human-readable text and machine-processable structured data:
```python
# NEW in 2025-06-18: Tools return structured data + schemas
result = await tool_call("analyze_text", {"text": "Hello world"})

# Text summary for humans
print(result.content[0].text)  # "Analyzed 2 words, positive sentiment"

# Structured data for machines
data = result.structuredContent[0].data
sentiment_score = data["sentiment"]["score"]   # 0.85
word_count = data["statistics"]["word_count"]  # 2
```
This enables AI assistants to process tool outputs programmatically while still providing clear summaries for users.
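A client typically wants both views of a result: the text block for display and the structured block for logic. The sketch below models the result as a plain dict for illustration (the library's actual response objects may use attribute access, as shown above); the helper name and exact dict shape are assumptions, not chuk-mcp API.

```python
def summarize_tool_result(result: dict) -> tuple[str, dict]:
    """Split a tool result into (human-readable text, structured data).

    Assumes a 2025-06-18-style result dict; returns empty values when
    a block is missing so callers can handle older servers gracefully.
    """
    text = ""
    for block in result.get("content", []):
        if block.get("type") == "text":
            text = block["text"]
            break
    structured = result.get("structuredContent") or {}
    return text, structured

# Example result in the shape described above (values are illustrative)
result = {
    "content": [{"type": "text", "text": "Analyzed 2 words, positive sentiment"}],
    "structuredContent": {
        "sentiment": {"score": 0.85},
        "statistics": {"word_count": 2},
    },
}
text, data = summarize_tool_result(result)
print(text)                              # summary for the user
print(data["statistics"]["word_count"])  # value for program logic
```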
Tools are functions that AI can execute on your behalf. Examples include file operations, API calls, calculations, or any custom logic.
```python
from chuk_mcp.protocol.messages import send_tools_list, send_tools_call

async def explore_tools(read_stream, write_stream):
    # List available tools
    tools_response = await send_tools_list(read_stream, write_stream)

    for tool in tools_response.get("tools", []):
        print(f"🔧 {tool['name']}: {tool['description']}")

    # Call a specific tool
    result = await send_tools_call(
        read_stream, write_stream,
        name="execute_sql",
        arguments={"query": "SELECT COUNT(*) FROM users"}
    )
    print(f"📊 Query result: {result}")
```
Resources are data sources like files, database records, API responses, or any URI-addressable content.
```python
from chuk_mcp.protocol.messages import send_resources_list, send_resources_read

async def explore_resources(read_stream, write_stream):
    # Discover available resources
    resources_response = await send_resources_list(read_stream, write_stream)

    for resource in resources_response.get("resources", []):
        print(f"📄 {resource['name']} ({resource.get('mimeType', 'unknown')})")
        print(f"   URI: {resource['uri']}")

    # Read specific resource content
    if resources_response.get("resources"):
        first_resource = resources_response["resources"][0]
        content = await send_resources_read(read_stream, write_stream, first_resource["uri"])

        for item in content.get("contents", []):
            if "text" in item:
                print(f"📖 Content preview: {item['text'][:200]}...")
```
Prompts are parameterized templates that help generate consistent, high-quality AI interactions.
```python
from chuk_mcp.protocol.messages import send_prompts_list, send_prompts_get

async def use_prompts(read_stream, write_stream):
    # List available prompt templates
    prompts_response = await send_prompts_list(read_stream, write_stream)

    for prompt in prompts_response.get("prompts", []):
        print(f"💬 {prompt['name']}: {prompt['description']}")

    # Get a prompt with custom arguments
    prompt_result = await send_prompts_get(
        read_stream, write_stream,
        name="analyze_data",
        arguments={"dataset": "sales_2024", "metric": "revenue"}
    )

    # The result contains formatted messages ready for AI
    for message in prompt_result.get("messages", []):
        print(f"🤖 {message['role']}: {message['content']}")
```
Create a `server_config.json` file to define your MCP servers:
```json
{
  "mcpServers": {
    "sqlite": {
      "command": "uvx",
      "args": ["mcp-server-sqlite", "--db-path", "database.db"]
    },
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/files"]
    },
    "github": {
      "command": "uvx",
      "args": ["mcp-server-github"],
      "env": {
        "GITHUB_PERSONAL_ACCESS_TOKEN": "${GITHUB_TOKEN}"
      }
    },
    "python": {
      "command": "uv",
      "args": ["run", "--with", "mcp-server-python", "mcp-server-python"],
      "env": {
        "PYTHONPATH": "/custom/python/path"
      }
    }
  }
}
```
```python
from chuk_mcp.transports.stdio import stdio_client, StdioServerParameters
from chuk_mcp.protocol.messages import send_initialize

async def connect_configured_server():
    # Parameters matching the "sqlite" entry from server_config.json above
    server_params = StdioServerParameters(
        command="uvx",
        args=["mcp-server-sqlite", "--db-path", "database.db"]
    )

    async with stdio_client(server_params) as (read_stream, write_stream):
        init_result = await send_initialize(read_stream, write_stream)
        print(f"Connected to configured server: {init_result.serverInfo.name}")
```
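If you want to read the parameters from the config file instead of hardcoding them, a small loader works; the helper below is a hypothetical sketch (chuk-mcp does not ship it), including `${VAR}` expansion for env values like the `GITHUB_TOKEN` reference in the example config.

```python
import os
import re

def load_server_params(config: dict, name: str) -> dict:
    """Pick one server entry out of a parsed server_config.json,
    expanding ${VAR} references in env values from the environment."""
    entry = config["mcpServers"][name]
    env = {
        key: re.sub(r"\$\{(\w+)\}", lambda m: os.environ.get(m.group(1), ""), value)
        for key, value in entry.get("env", {}).items()
    }
    return {"command": entry["command"], "args": list(entry.get("args", [])), "env": env}

# Usage sketch (assumes the server_config.json shown above):
#   import json
#   config = json.load(open("server_config.json"))
#   server_params = StdioServerParameters(**load_server_params(config, "sqlite"))
```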
Let servers request AI to generate content on their behalf (with user approval):
```python
from chuk_mcp.protocol.messages.sampling import (
    send_sampling_create_message,
    create_sampling_message
)

async def ai_content_generation(read_stream, write_stream):
    # Server can request AI to generate content
    messages = [
        create_sampling_message("user", "Explain quantum computing in simple terms")
    ]

    result = await send_sampling_create_message(
        read_stream, write_stream,
        messages=messages,
        max_tokens=1000,
        temperature=0.7
    )
    print(f"🤖 AI Generated: {result['content']['text']}")
```
Provide intelligent autocompletion for tool arguments:
```python
from chuk_mcp.protocol.messages.completion import (
    send_completion_complete,
    create_resource_reference,
    create_argument_info
)

async def smart_completion(read_stream, write_stream):
    # Get completion suggestions for a resource argument
    response = await send_completion_complete(
        read_stream, write_stream,
        ref=create_resource_reference("file:///project/data/"),
        argument=create_argument_info("filename", "sales_202")
    )

    completions = response.get("completion", {}).get("values", [])
    print(f"💡 Suggestions: {completions}")
```
Connect to multiple servers simultaneously:
```python
from chuk_mcp.transports.stdio import stdio_client, StdioServerParameters
from chuk_mcp.protocol.messages import send_tools_list

async def multi_server_task():
    """Process data using multiple MCP servers."""
    servers = [
        StdioServerParameters(command="uvx", args=["mcp-server-sqlite", "--db-path", "data.db"]),
        StdioServerParameters(command="npx", args=["-y", "@modelcontextprotocol/server-filesystem", "/data"]),
    ]

    for i, server_params in enumerate(servers):
        async with stdio_client(server_params) as (read_stream, write_stream):
            print(f"Processing with server {i+1}")

            # Each server can have different capabilities
            tools = await send_tools_list(read_stream, write_stream)
            print(f"  Available tools: {len(tools.get('tools', []))}")
```
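The loop above visits servers one at a time; to work with them truly simultaneously, fan the sessions out as concurrent tasks. The sketch below uses stdlib `asyncio` so it is self-contained (chuk-mcp itself is anyio-based, and the same pattern works with an anyio task group); `query_server` is a stand-in for the per-server body above.

```python
import asyncio

async def query_server(name: str, delay: float) -> str:
    """Stand-in for one stdio_client session (connect, list tools, etc.)."""
    await asyncio.sleep(delay)  # simulates I/O with the server
    return f"{name}: done"

async def main() -> list[str]:
    # Run every server session concurrently instead of sequentially
    return await asyncio.gather(
        query_server("sqlite", 0.01),
        query_server("filesystem", 0.01),
    )

results = asyncio.run(main())
print(results)
```

With real connections, the total wall time approaches that of the slowest server rather than the sum of all of them.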
Subscribe to resource changes for live updates:
```python
from chuk_mcp.protocol.messages.resources import send_resources_subscribe

async def live_monitoring(read_stream, write_stream):
    # Subscribe to file changes
    success = await send_resources_subscribe(
        read_stream, write_stream,
        uri="file:///project/logs/app.log"
    )

    if success:
        print("📡 Subscribed to log file changes")

    # Handle notifications in your message loop
    # (implementation depends on your notification handling)
```
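One common shape for that notification handling is a small dispatch table keyed by JSON-RPC method name. The sketch below is a generic pattern, not chuk-mcp API; the method string `notifications/resources/updated` comes from the MCP specification, while the registry and decorator names are assumptions.

```python
from typing import Callable

Handler = Callable[[dict], None]
handlers: dict[str, Handler] = {}

def on(method: str) -> Callable[[Handler], Handler]:
    """Register a handler for one notification method."""
    def register(fn: Handler) -> Handler:
        handlers[method] = fn
        return fn
    return register

def dispatch(message: dict) -> bool:
    """Route a parsed JSON-RPC notification; returns True if handled."""
    handler = handlers.get(message.get("method", ""))
    if handler is None:
        return False
    handler(message.get("params", {}))
    return True

@on("notifications/resources/updated")
def resource_updated(params: dict) -> None:
    print(f"🔄 Resource changed: {params.get('uri')}")

# In your read loop, call dispatch() on each incoming notification:
dispatch({"jsonrpc": "2.0", "method": "notifications/resources/updated",
          "params": {"uri": "file:///project/logs/app.log"}})
```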
`chuk-mcp` provides robust error handling with automatic retries:
```python
from chuk_mcp.protocol.messages import RetryableError, NonRetryableError
from chuk_mcp.protocol.messages import send_tools_call

async def resilient_operations(read_stream, write_stream):
    try:
        # Operations automatically retry on transient failures
        result = await send_tools_call(
            read_stream, write_stream,
            name="network_operation",
            arguments={"url": "https://api.example.com/data"},
            timeout=30.0,  # Extended timeout for slow operations
            retries=5      # More retries for critical operations
        )
    except RetryableError as e:
        print(f"⚠️ Transient error after retries: {e}")
        # Handle gracefully - maybe try an alternative approach
    except NonRetryableError as e:
        print(f"❌ Permanent error: {e}")
        # Handle definitively - the operation cannot succeed
    except Exception as e:
        print(f"🚨 Unexpected error: {e}")
        # Handle unknown errors
```
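If you ever need to make the retryable/permanent split yourself, e.g. when handling raw JSON-RPC error responses, the usual heuristic is that protocol and usage errors are permanent while server-side failures may be transient. The error codes below are from the JSON-RPC 2.0 specification; the classification itself is an illustration, not chuk-mcp's internal policy.

```python
# JSON-RPC 2.0 spec-defined error codes
PARSE_ERROR = -32700
INVALID_REQUEST = -32600
METHOD_NOT_FOUND = -32601
INVALID_PARAMS = -32602
INTERNAL_ERROR = -32603

def is_retryable(code: int) -> bool:
    """Protocol/usage errors won't fix themselves on retry;
    internal or server-defined errors (-32000..-32099) might be transient."""
    permanent = {PARSE_ERROR, INVALID_REQUEST, METHOD_NOT_FOUND, INVALID_PARAMS}
    return code not in permanent
```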
```bash
# Quick validation
uv run examples/quickstart.py

# Run comprehensive tests
uv run examples/e2e_smoke_test_example.py --demo all

# Validate installation scenarios
uv run diagnostics/installation_scenarios_diagnostic.py

# Test specific functionality
uv run examples/e2e_smoke_test_example.py --smoke

# Performance benchmarks
uv run examples/e2e_smoke_test_example.py --performance
```
The MCP ecosystem includes servers for popular services:
```bash
# Popular Python servers
uv tool install mcp-server-sqlite
uv tool install mcp-server-github
uv tool install mcp-server-postgres

# Or run directly without installation
uv run --with mcp-server-sqlite mcp-server-sqlite --db-path data.db

# Use npx for Node.js servers
npx -y @modelcontextprotocol/server-filesystem /path/to/files
npx -y @modelcontextprotocol/server-brave-search
```
- `@modelcontextprotocol/server-filesystem`
- `mcp-server-sqlite`
- `mcp-server-github`
- `mcp-server-gdrive`
- `mcp-server-brave-search`
- `mcp-server-postgres`
Find more at: MCP Servers Directory
Want to create your own MCP server? Check out:

- The official `mcp` Python package
- The `@modelcontextprotocol/sdk` TypeScript package
`chuk-mcp` includes built-in performance monitoring:
```python
import logging

# Enable detailed logging for debugging
logging.basicConfig(level=logging.DEBUG)

# Performance is optimized for:
# - Concurrent server connections
# - Efficient message routing
# - Minimal memory allocation
# - Fast JSON serialization
```
Performance highlights (benchmarks from smoke tests):
| Installation | Startup Time | Validation Speed | Memory Usage | Dependencies |
|---|---|---|---|---|
| `chuk-mcp` | < 0.5s | 0.010ms/op | 15MB | Core only |
| `chuk-mcp[pydantic]` | < 1.0s | 0.000ms/op | 25MB | + Pydantic |
| `chuk-mcp[full]` | < 1.5s | 0.000ms/op | 35MB | All features |
`chuk-mcp` includes intelligent dependency handling with graceful fallbacks:
```python
# Check validation backend
from chuk_mcp.protocol.mcp_pydantic_base import PYDANTIC_AVAILABLE

if PYDANTIC_AVAILABLE:
    print("✅ Using Pydantic for enhanced validation")
    print("   • Better error messages")
    print("   • Faster validation (Rust-based)")
    print("   • Advanced type coercion")
else:
    print("📦 Using lightweight fallback validation")
    print("   • Pure Python implementation")
    print("   • No external dependencies")
    print("   • ~20x slower but still fast")

# Force fallback mode for testing
import os
os.environ["MCP_FORCE_FALLBACK"] = "1"
```
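The same graceful-degradation trick is easy to apply in your own code: try to import Pydantic, and fall back to a minimal stand-in when it is missing. This is a generic sketch of the pattern, not chuk-mcp's actual `mcp_pydantic_base` implementation.

```python
try:
    from pydantic import BaseModel  # enhanced validation when available
    PYDANTIC_AVAILABLE = True
except ImportError:
    PYDANTIC_AVAILABLE = False

    class BaseModel:  # minimal stand-in: attribute assignment, no validation
        def __init__(self, **kwargs):
            for key, value in kwargs.items():
                setattr(self, key, value)

class ServerInfo(BaseModel):
    name: str = "unknown"
    version: str = "0.0"

# Works identically whether or not Pydantic is installed
info = ServerInfo(name="Demo", version="1.0")
print(PYDANTIC_AVAILABLE, info.name)
```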
```bash
git clone https://github.com/chrishayuk/chuk-mcp
cd chuk-mcp

# Install with development dependencies
uv sync

# Activate the virtual environment
source .venv/bin/activate   # Linux/Mac
# or .venv\Scripts\activate  # Windows

# Alternative setup with pip
pip install -e ".[dev]"
```
```bash
# Test with fallback validation
UV_MCP_FORCE_FALLBACK=1 uv run examples/quickstart.py

# Test with different Python versions
uv run --python 3.11 examples/quickstart.py
uv run --python 3.12 examples/quickstart.py

uv run diagnostics/installation_scenarios_diagnostic.py
```
```bash
# Start a new MCP client project
uv init my-mcp-client
cd my-mcp-client

# Add chuk-mcp with dependencies
uv add chuk-mcp[pydantic]

# Add development tools
uv add --dev pytest black isort

# Create example
cat > main.py << 'EOF'
import anyio
from chuk_mcp import stdio_client, StdioServerParameters

async def main():
    # Your MCP client code here
    pass

if __name__ == "__main__":
    anyio.run(main)
EOF
```
Add to your `pyproject.toml`:
```toml
[tool.uv]
dev-dependencies = [
    "chuk-mcp[dev]",
]

[project.scripts]
mcp-client = "my_mcp_client:main"

[tool.uv.scripts]
test-mcp = "uv run examples/quickstart.py"
validate = "uv run diagnostics/installation_scenarios_diagnostic.py"
```
MIT License - see LICENSE file for details.
chuk-mcp represents a production-ready, comprehensive MCP implementation that:
This implementation sets a new standard for MCP libraries, being both immediately practical for production use and forward-looking for next-generation MCP applications.