# Honeybadger MCP Server
A Model Context Protocol (MCP) server implementation for interacting with the Honeybadger API. This server allows AI agents to fetch and analyze error data from your Honeybadger projects.
## Overview
This MCP server provides a bridge between AI agents and the Honeybadger error monitoring service. It follows the best practices laid out by Anthropic for building MCP servers, allowing seamless integration with any MCP-compatible client.
## Features
The server provides two essential tools for interacting with Honeybadger:
- `list_faults`: List and filter faults from your Honeybadger project
  - Search by text query
  - Filter by creation or occurrence timestamps
  - Sort by frequency or recency
  - Paginate results
- `get_fault_details`: Get detailed information about specific faults
  - Filter notices by creation time
  - Paginate through notices
  - Results ordered by creation time, descending
## Prerequisites

- Python 3.10+
- Honeybadger API key and Project ID
- Docker, if running the MCP server as a container (recommended)
## Installation

### Using uv

1. Install uv if you don't have it:

   ```bash
   pip install uv
   ```

2. Clone this repository:

   ```bash
   git clone https://github.com/bobtista/honeybadger-mcp.git
   cd honeybadger-mcp
   ```

3. Install dependencies:

   ```bash
   uv pip install -e .
   ```

4. Install development dependencies (optional):

   ```bash
   uv pip install -e ".[dev]"
   ```

5. Create your environment file:

   ```bash
   cp .env.example .env  # Edit .env with your configuration
   ```
### Using Docker (Recommended)

1. Build the Docker image:

   ```bash
   docker build -t honeybadger/mcp --build-arg PORT=8050 .
   ```

2. Create a `.env` file and configure your environment variables.
## Configuration
You can configure the server using either environment variables or command-line arguments:
| Option | Env Variable | CLI Argument | Default | Description |
|---|---|---|---|---|
| API Key | `HONEYBADGER_API_KEY` | `--api-key` | Required | Your Honeybadger API key |
| Project ID | `HONEYBADGER_PROJECT_ID` | `--project-id` | Required | Your Honeybadger project ID |
| Transport | `TRANSPORT` | `--transport` | `sse` | Transport protocol (`sse` or `stdio`) |
| Host | `HOST` | `--host` | `127.0.0.1` | Host to bind to when using SSE transport |
| Port | `PORT` | `--port` | `8050` | Port to listen on when using SSE transport |
| Log Level | `LOG_LEVEL` | `--log-level` | `INFO` | Logging level (`INFO`, `DEBUG`, etc.) |
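For reference, a minimal `.env` covering the options above might look like this (all values are placeholders; only the first two are required):

```shell
HONEYBADGER_API_KEY=your-api-key
HONEYBADGER_PROJECT_ID=your-project-id
TRANSPORT=sse
HOST=127.0.0.1
PORT=8050
LOG_LEVEL=INFO
```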
## Running the Server

### Running with uv (Development)

#### SSE Transport (Default)

```bash
# Using environment variables:
HONEYBADGER_API_KEY=your-key HONEYBADGER_PROJECT_ID=your-project uv run src/honeybadger_mcp_server/server.py

# Using CLI arguments:
uv run src/honeybadger_mcp_server/server.py --api-key your-key --project-id your-project
```

#### Using Stdio

```bash
uv run src/honeybadger_mcp_server/server.py --transport stdio --api-key your-key --project-id your-project
```
### Running the Installed Package

#### SSE Transport (Default)

```bash
# Using environment variables:
HONEYBADGER_API_KEY=your-key HONEYBADGER_PROJECT_ID=your-project honeybadger-mcp-server

# Using CLI arguments:
honeybadger-mcp-server --api-key your-key --project-id your-project
```

#### Using Stdio

```bash
honeybadger-mcp-server --transport stdio --api-key your-key --project-id your-project
```
### Using Docker

#### Run with SSE

```bash
docker run --env-file .env -p 8050:8050 honeybadger/mcp
```

#### Using Stdio

With stdio, the MCP client launches the MCP server container itself, so there is nothing to start manually (see the Docker configuration below).
## Integration with MCP Clients

### SSE Configuration
Once you have the server running with SSE transport, you can connect to it using this configuration:
```json
{
  "mcpServers": {
    "honeybadger": {
      "transport": "sse",
      "url": "http://localhost:8050/sse"
    }
  }
}
```
### Claude Desktop Configuration

#### Using SSE Transport (Recommended)

First, start the server:

```bash
honeybadger-mcp-server --api-key your-key --project-id your-project
```
Then add to your Claude Desktop config:
```json
{
  "mcpServers": {
    "honeybadger": {
      "transport": "sse",
      "url": "http://localhost:8050/sse"
    }
  }
}
```
#### Using Stdio Transport
Add to your Claude Desktop config:
```json
{
  "mcpServers": {
    "honeybadger": {
      "command": "uv",
      "args": [
        "run",
        "--project",
        "/path/to/honeybadger-mcp",
        "src/honeybadger_mcp_server/server.py",
        "--transport",
        "stdio",
        "--api-key",
        "YOUR-API-KEY",
        "--project-id",
        "YOUR-PROJECT-ID"
      ]
    }
  }
}
```
#### Docker Configuration
```json
{
  "mcpServers": {
    "honeybadger": {
      "command": "docker",
      "args": [
        "run",
        "--rm",
        "-i",
        "honeybadger/mcp",
        "--transport",
        "stdio",
        "--api-key",
        "YOUR-API-KEY",
        "--project-id",
        "YOUR-PROJECT-ID"
      ]
    }
  }
}
```
## Tool Usage Examples

### List Faults
```python
result = await client.call_tool("list_faults", {
    "q": "RuntimeError",          # Optional search term
    "created_after": 1710806400,  # Unix timestamp (2024-03-19T00:00:00Z)
    "occurred_after": 1710806400, # Filter by occurrence time
    "limit": 10,                  # Max 25 results
    "order": "recent"             # 'recent' or 'frequent'
})
```
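The `created_after` and `occurred_after` filters take Unix timestamps (seconds since the epoch, UTC). A quick way to produce one in plain Python (a convenience on the caller's side, not part of the server API):

```python
from datetime import datetime, timezone

# Build the 2024-03-19T00:00:00Z timestamp used in the example above
created_after = int(datetime(2024, 3, 19, tzinfo=timezone.utc).timestamp())
print(created_after)  # 1710806400
```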
### Get Fault Details
```python
result = await client.call_tool("get_fault_details", {
    "fault_id": "abc123",
    "created_after": 1710806400,  # Unix timestamp
    "created_before": 1710892800, # Optional end time
    "limit": 5                    # Number of notices (max 25)
})
```
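Because notices come back newest-first, one way to page through all of them is to walk the `created_before` cursor backwards after each call. The sketch below runs against a stub client so it is self-contained; the `created_at` field on each notice is an assumed result shape for illustration, not a documented contract:

```python
import asyncio

# StubClient stands in for a real MCP client here; a real call_tool
# returns whatever the server emits, so the "created_at" field is an
# assumption made for this sketch.
class StubClient:
    def __init__(self, notices):
        self._notices = notices  # newest first, matching the server's ordering

    async def call_tool(self, name, args):
        before = args.get("created_before", float("inf"))
        page = [n for n in self._notices if n["created_at"] < before]
        return page[: args.get("limit", 25)]

async def fetch_all_notices(client, fault_id, page_size=5):
    """Collect every notice by moving created_before past each page."""
    notices, before = [], None
    while True:
        args = {"fault_id": fault_id, "limit": page_size}
        if before is not None:
            args["created_before"] = before
        page = await client.call_tool("get_fault_details", args)
        if not page:
            return notices
        notices.extend(page)
        before = page[-1]["created_at"]  # oldest timestamp seen so far

client = StubClient([{"id": i, "created_at": 1710806400 - i} for i in range(12)])
all_notices = asyncio.run(fetch_all_notices(client, "abc123"))
print(len(all_notices))  # 12
```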
## Development

### Running Tests

```bash
# Install dev dependencies
uv pip install -e ".[dev]"

# Run tests
pytest
```
### Code Quality

```bash
# Run type checker
pyright

# Run linter
ruff check .
```
## Contributing
Contributions are welcome! Please feel free to submit a Pull Request.