# Mezmo MCP Server

A Model Context Protocol (MCP) server for retrieving logs from the Mezmo observability platform. Quota-conscious design with intelligent defaults: just add your API key and run!
## ⚡ Smart Defaults

- **Time Range:** last 6 hours when not specified, balancing quota use against actually finding logs
- **Log Count:** 10 logs per request
- **Log Levels:** all levels (you control filtering)
**Recommended Workflow:**

1. First, fetch 3-5 logs to discover available apps and log shape.
2. Then, filter by the specific app(s) you are debugging.
3. Add level filtering for ERROR/WARNING to reduce noise.
4. Increase the count only after filters are in place (e.g., 20-50).

This approach minimizes quota usage significantly!
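The escalating workflow above can be sketched as the argument payloads passed to `get_logs` at each step. The helper functions below are hypothetical; only the `count`, `apps`, and `levels` parameters come from this README:

```python
# Hypothetical helpers that build get_logs arguments for each stage
# of the quota-conscious workflow.

def discovery_args():
    # Step 1: tiny sample to learn which apps exist and what logs look like
    return {"count": 3, "levels": "ERROR,WARNING"}

def focused_args(app):
    # Steps 2-3: narrow to the app under investigation, keep level filtering
    return {"count": 10, "apps": app, "levels": "ERROR,WARNING"}

def scaled_args(app, count=50):
    # Step 4: raise the count only once the filters are in place
    return {"count": count, "apps": app, "levels": "ERROR,WARNING"}

print(scaled_args("app-a"))  # {'count': 50, 'apps': 'app-a', 'levels': 'ERROR,WARNING'}
```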
## 🚀 Quick Start

### 1. Get Your API Key

Get your Mezmo Service API key from the Mezmo dashboard.
### 2. Run with Docker

```bash
# Clone the repository (replace with your fork/clone URL)
git clone <your-repo-url>
cd <your-repo-dir>

# Create your local .env (never commit it),
# then edit .env and set MEZMO_API_KEY
cp env.example .env

# Build and run
docker-compose up -d
```
### 3. Configure Your MCP Client

For Cursor (add to `.cursor/mcp.json`):

```json
{
  "mcpServers": {
    "mezmo": {
      "url": "http://localhost:18080/mcp",
      "transport": "streamable-http",
      "description": "Mezmo log retrieval"
    }
  }
}
```
For Claude Desktop (add to MCP settings):

```json
{
  "mcpServers": {
    "mezmo": {
      "command": "docker",
      "args": ["exec", "mezmo-mcp-server", "python", "server.py"]
    }
  }
}
```
### 4. Start Using

Restart your MCP client and you'll have access to the `get_logs` tool!
## 📋 Usage

The `get_logs` tool automatically retrieves logs from the last 6 hours when no time range is specified: perfect for debugging while conserving quota.
**Step 1: Discover available apps (3-5 logs):**

```json
{
  "count": 3,
  "levels": "ERROR,WARNING"
}
```
**Step 2: Filter by specific app:**

```json
{
  "count": 10,
  "apps": "app-a",
  "levels": "ERROR,WARNING"
}
```
**Advanced filtering (scale up only after filters work):**

```json
{
  "count": 50,
  "apps": "app-a,app-b",
  "levels": "ERROR,WARNING",
  "query": "database connection"
}
```
**Custom time range (use sparingly; impacts quota):**

```json
{
  "count": 50,
  "apps": "app-a",
  "from_ts": "1640995200",
  "to_ts": "1640998800"
}
```
## 💡 Quota-Conscious Tips

- **Always filter by app** when possible; this drastically reduces results.
- **Start tiny:** use `count=3-5` for discovery, then increase if needed.
- **Add level filtering:** specify `levels="ERROR,WARNING"` to reduce noise.
- **Use the default 6-hour window** unless you need wider historical data.
## 🔐 Security / Secrets

- Never commit `.env` (it contains your `MEZMO_API_KEY`).
- Prefer using `.env.example` as a template and keep your real values local.
- If you enable MCP authentication (`MCP_ENABLE_AUTH=true`), keep `MCP_API_TOKEN` secret as well.
## 🛠️ Commands

```bash
docker-compose up -d    # Start the server
docker-compose down     # Stop the server
docker-compose logs -f  # View logs
```
## 🐛 Troubleshooting

**Container won't start?**

- Check that your `.env` file has `MEZMO_API_KEY=your_actual_key`
- View logs: `docker-compose logs`

**Can't connect from MCP client?**

- Ensure the container is running: `docker-compose ps`
- Restart your MCP client after configuration changes
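To check the first troubleshooting item programmatically, here is a small sketch that scans `.env` text for a usable key. It assumes the plain `KEY=value` format produced by copying the example file, and the placeholder check is illustrative:

```python
def env_has_key(env_text, key="MEZMO_API_KEY"):
    """Return True if env_text sets key to a non-empty, non-placeholder value."""
    for line in env_text.splitlines():
        line = line.strip()
        if line.startswith(key + "="):
            value = line.split("=", 1)[1].strip()
            # Reject an empty value or the untouched placeholder
            return bool(value) and value != "your_actual_key"
    return False

sample = "MEZMO_API_KEY=abc123\nMCP_ENABLE_AUTH=false\n"
print(env_has_key(sample))  # True
```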
That's it! The server runs on port 18080 and automatically handles time windows, retries, and errors.