Tempo MCP Server
A Go-based server implementation for the Model Context Protocol (MCP) with Grafana Tempo integration.
Overview
This MCP server allows AI assistants to query and analyze distributed tracing data from Grafana Tempo. It follows the Model Context Protocol to provide tool definitions that can be used by compatible AI clients such as Claude Desktop.
Getting Started
Prerequisites
- Go 1.21 or higher
- Docker and Docker Compose (for local testing)
Building and Running
Build and run the server:
# Build the server
go build -o tempo-mcp-server ./cmd/server
# Run the server
./tempo-mcp-server
Or run directly with Go:
go run ./cmd/server
The server now supports two modes of communication:
- Standard input/output (stdin/stdout) following the Model Context Protocol (MCP)
- HTTP Server with Server-Sent Events (SSE) endpoint for integration with tools like n8n
The default port for the HTTP server is 8080; it can be configured using the SSE_PORT environment variable.
Server Endpoints
When running in HTTP mode, the server exposes the following endpoints:
- SSE Endpoint: http://localhost:8080/sse - for real-time event streaming
- MCP Endpoint: http://localhost:8080/mcp - for MCP protocol messaging
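To verify that HTTP mode is up, you can subscribe to the SSE endpoint from a small Go program. The snippet below is just a minimal sketch that assumes the server is running locally on the default port and prints whatever event lines arrive:
package main

import (
	"bufio"
	"fmt"
	"log"
	"net/http"
)

func main() {
	// Open the SSE stream exposed by the server (default local address assumed).
	resp, err := http.Get("http://localhost:8080/sse")
	if err != nil {
		log.Fatalf("connect to SSE endpoint: %v", err)
	}
	defer resp.Body.Close()

	// SSE is a line-oriented protocol; print each line ("event:", "data:", ...) as it arrives.
	scanner := bufio.NewScanner(resp.Body)
	for scanner.Scan() {
		fmt.Println(scanner.Text())
	}
	if err := scanner.Err(); err != nil {
		log.Printf("stream closed: %v", err)
	}
}
A plain curl against the same URL works just as well if you only want to confirm that the endpoint responds.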
Docker Support
You can build and run the MCP server using Docker:
# Build the Docker image
docker build -t tempo-mcp-server .
# Run the server
docker run -p 8080:8080 --rm -i tempo-mcp-server
Alternatively, you can use Docker Compose for a complete test environment:
# Build and run with Docker Compose
docker-compose up --build
Project Structure
.
├── cmd/
│ ├── server/ # MCP server implementation
│ └── client/ # Client for testing the MCP server
├── internal/
│ └── handlers/ # Tool handlers
├── pkg/
│ └── utils/ # Utility functions and shared code
└── go.mod # Go module definition
MCP Server
The Tempo MCP Server implements the Model Context Protocol (MCP) and provides the following tools:
Tempo Query Tool
The tempo_query tool allows you to query Grafana Tempo trace data:
- Required parameters:
  - query: Tempo query string (e.g., {service.name="frontend"} or {duration>1s})
- Optional parameters:
  - url: The Tempo server URL (default: from the TEMPO_URL environment variable, or http://localhost:3200)
  - start: Start time for the query (default: 1h ago)
  - end: End time for the query (default: now)
  - limit: Maximum number of traces to return (default: 20)
  - username: Username for basic authentication (optional)
  - password: Password for basic authentication (optional)
  - token: Bearer token for authentication (optional)
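Conceptually, these parameters map onto Grafana Tempo's HTTP search API (GET /api/search). The Go sketch below is illustrative only, an assumption about how such a handler could build the request rather than the actual code in internal/handlers:
package tempoquery

import (
	"net/http"
	"net/url"
	"strconv"
	"time"
)

// searchTempo builds a Tempo search request from tempo_query-style parameters.
// Illustrative sketch; the real handler may construct the request differently.
func searchTempo(baseURL, query, username, password, token string, limit int) (*http.Response, error) {
	u, err := url.Parse(baseURL + "/api/search")
	if err != nil {
		return nil, err
	}
	q := u.Query()
	q.Set("q", query)                   // TraceQL query, e.g. {service.name="frontend"}
	q.Set("limit", strconv.Itoa(limit)) // maximum number of traces to return
	q.Set("start", strconv.FormatInt(time.Now().Add(-time.Hour).Unix(), 10)) // default: 1h ago
	q.Set("end", strconv.FormatInt(time.Now().Unix(), 10))                   // default: now
	u.RawQuery = q.Encode()

	req, err := http.NewRequest(http.MethodGet, u.String(), nil)
	if err != nil {
		return nil, err
	}
	// Apply whichever authentication option was provided.
	switch {
	case token != "":
		req.Header.Set("Authorization", "Bearer "+token)
	case username != "":
		req.SetBasicAuth(username, password)
	}
	return http.DefaultClient.Do(req)
}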
Environment Variables
The server supports the following environment variables:
- TEMPO_URL: Default Tempo server URL to use if not specified in the request
- SSE_PORT: Port for the HTTP/SSE server (default: 8080)
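These defaults can be resolved with a small helper like the one below; this is a generic pattern shown for illustration, not necessarily how the server's code is written:
package main

import (
	"fmt"
	"os"
)

// getenvDefault returns the named environment variable, or def when it is unset.
func getenvDefault(name, def string) string {
	if v := os.Getenv(name); v != "" {
		return v
	}
	return def
}

func main() {
	fmt.Println(getenvDefault("TEMPO_URL", "http://localhost:3200"))
	fmt.Println(getenvDefault("SSE_PORT", "8080"))
}
For example, starting the binary with SSE_PORT=9090 set makes the HTTP/SSE server listen on port 9090 instead of 8080.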
Testing
You can exercise the server with the bundled test client (see cmd/client), for example:
./run-client.sh tempo_query "{resource.service.name=\\\"example-service\\\"}"
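Under the hood, the Model Context Protocol carries such a call as a JSON-RPC tools/call request over the server's stdin/stdout (or the /mcp endpoint in HTTP mode). The Go sketch below only shows how that payload can be assembled; a real MCP client also performs the initialize handshake first:
package main

import (
	"encoding/json"
	"fmt"
)

func main() {
	// Build an MCP tools/call request for the tempo_query tool.
	req := map[string]any{
		"jsonrpc": "2.0",
		"id":      1,
		"method":  "tools/call",
		"params": map[string]any{
			"name": "tempo_query",
			"arguments": map[string]any{
				"query": `{resource.service.name="example-service"}`,
				"limit": 20,
			},
		},
	}
	out, _ := json.Marshal(req)
	fmt.Println(string(out)) // this line is what gets written to the server's stdin
}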
Using with Claude Desktop
You can use this MCP server with Claude Desktop to add Tempo query tools. Follow these steps:
- Build the server or Docker image
- Configure Claude Desktop to use the server by adding it to your Claude Desktop configuration file
Example Claude Desktop configuration:
{
"mcpServers": {
"temposerver": {
"command": "path/to/tempo-mcp-server",
"args": [],
"env": {
"TEMPO_URL": "http://localhost:3200"
},
"disabled": false,
"autoApprove": ["tempo_query"]
}
}
}
For Docker:
{
"mcpServers": {
"temposerver": {
"command": "docker",
"args": ["run", "--rm", "-i", "-e", "TEMPO_URL=http://host.docker.internal:3200", "tempo-mcp-server"],
"disabled": false,
"autoApprove": ["tempo_query"]
}
}
}
The Claude Desktop configuration file is located at:
- On macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
- On Windows: %APPDATA%\Claude\claude_desktop_config.json
- On Linux: ~/.config/Claude/claude_desktop_config.json
Using with Cursor
You can also integrate the Tempo MCP server with the Cursor editor. To do this, add the following configuration to your Cursor settings:
{
"mcpServers": {
"tempo-mcp-server": {
"command": "docker",
"args": ["run", "--rm", "-i", "-e", "TEMPO_URL=http://host.docker.internal:3200", "tempo-mcp-server:latest"]
}
}
}
Using with n8n
To use the Tempo MCP server with n8n, you can connect to it using the MCP Client Tool node:
- Add an MCP Client Tool node to your n8n workflow
- Configure the node with these parameters:
  - SSE Endpoint: http://your-server-address:8080/sse (replace with your actual server address)
  - Authentication: Choose appropriate authentication if needed
  - Tools to Include: Choose which Tempo tools to expose to the AI Agent
- Connect the MCP Client Tool node to an AI Agent node that will use the Tempo querying capabilities
Example workflow: Trigger → MCP Client Tool (Tempo server) → AI Agent (Claude)
Example Usage
Once configured, you can use the tools in Claude with queries like:
- "Query Tempo for traces with the query
{duration>1s}" - "Find traces from the frontend service in Tempo using query
{service.name=\"frontend\"}" - "Show me the most recent 50 traces from Tempo with
{http.status_code=500}"
License
This project is licensed under the MIT License.