TransformerBee.MCP
An MCP server for the transformer.bee service, configurable via environment variables.
This is a simple PoC of a Model Context Protocol (MCP) server for transformer.bee, written in Python.
Under the hood it uses python-mcp and transformerbeeclient.py.
Installation
You can install the MCP server as a Python package or pull the Docker image.
Install as Python Package
uv add transformerbeemcp
or if you are using pip:
pip install transformerbeemcp
Install as Docker Image
docker pull ghcr.io/hochfrequenz/transformerbee.mcp:latest
Start the Server via CLI
Python
The package ships a simple CLI command to start the server.
In a terminal inside the virtual environment in which you installed the package (here myvenv), call:
(myvenv) run-transformerbee-mcp-server
Docker
docker run --network host -i --rm -e TRANSFORMERBEE_HOST=http://localhost:5021 ghcr.io/hochfrequenz/transformerbee.mcp:latest
(For the environment variables -e ..., see below or the transformerbeeclient.py docs.)
Register MCP Server in Claude Desktop
If you checked out this repository
cd path/to/reporoot/src/transformerbeemcp
mcp install server.py
If you installed the package via pip/uv
Modify your claude_desktop_config.json (which can be found in the Claude Desktop menu via "File > Settings > Developer > Edit Configuration"):
{
  "mcpServers": {
    "TransformerBee.mcp": {
      "command": "C:\\github\\MyProject\\.myvenv\\Scripts\\run-transformerbee-mcp-server.exe",
      "args": [],
      "env": {
        "TRANSFORMERBEE_HOST": "http://localhost:5021",
        "TRANSFORMERBEE_CLIENT_ID": "",
        "TRANSFORMERBEE_CLIENT_SECRET": ""
      }
    }
  }
}
where C:\github\MyProject\.myvenv is the path to the virtual environment in which you installed the package, and localhost:5021 exposes transformer.bee running in a Docker container.
Alternatively, if you haven't configured this handy CLI command
https://github.com/Hochfrequenz/TransformerBee.mcp/blob/c0898769670469df13f23b57a55fe4b71ed9795b/pyproject.toml#L101-L102
you can set the Python interpreter as the command and pass the server as a non-empty argument list.
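A sketch of such a configuration, assuming the same virtual environment path as above; the module invocation `-m transformerbeemcp` is an assumption on my part — check the entry point in the pyproject.toml linked above for the actual target:

```json
{
  "mcpServers": {
    "TransformerBee.mcp": {
      "command": "C:\\github\\MyProject\\.myvenv\\Scripts\\python.exe",
      "args": ["-m", "transformerbeemcp"],
      "env": {
        "TRANSFORMERBEE_HOST": "http://localhost:5021"
      }
    }
  }
}
```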
Note that this package marks uv as a dev dependency, so you might need to install it via pip install transformerbeemcp[dev] in your virtual environment, because a lot of MCP tooling assumes you have uv installed.
For details about the environment variables and/or starting transformer.bee locally, check transformerbeeclient.py docs.
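If the server fails to start silently from Claude Desktop, a misconfigured environment is a common culprit. The following standalone snippet (not part of the package, just a sanity check under the assumption that the three variables above are the relevant ones) reports obvious problems with the current environment:

```python
import os

# Variables used by the transformer.bee client: the host is always needed;
# client ID/secret are only required when authenticating against a hosted instance,
# but setting only one of the pair is almost certainly a mistake.
REQUIRED = ("TRANSFORMERBEE_HOST",)


def check_transformerbee_env() -> list[str]:
    """Return a list of human-readable problems with the current env setup."""
    problems = []
    for name in REQUIRED:
        if not os.environ.get(name):
            problems.append(f"{name} is not set")
    client_id = os.environ.get("TRANSFORMERBEE_CLIENT_ID", "")
    client_secret = os.environ.get("TRANSFORMERBEE_CLIENT_SECRET", "")
    if bool(client_id) != bool(client_secret):
        problems.append(
            "set both TRANSFORMERBEE_CLIENT_ID and TRANSFORMERBEE_CLIENT_SECRET, or neither"
        )
    return problems


if __name__ == "__main__":
    for problem in check_transformerbee_env():
        print("WARNING:", problem)
```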
If you installed the package via Docker
{
  "mcpServers": {
    "TransformerBee.mcp": {
      "command": "docker",
      "args": [
        "run",
        "--network",
        "host",
        "-i",
        "--rm",
        "-e",
        "TRANSFORMERBEE_HOST=http://localhost:5021",
        "ghcr.io/hochfrequenz/transformerbee.mcp:latest"
      ],
      "env": {
        "TRANSFORMERBEE_HOST": "http://localhost:5021",
        "TRANSFORMERBEE_CLIENT_ID": "",
        "TRANSFORMERBEE_CLIENT_SECRET": ""
      }
    }
  }
}
I'm aware that using the --network host option is a bit hacky and not best practice.
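A less intrusive alternative (a sketch, assuming transformer.bee publishes port 5021 on the host and you run Docker Engine 20.10+ or Docker Desktop) is the `host-gateway` alias, which lets the MCP container reach the host without sharing its network namespace:

```json
{
  "mcpServers": {
    "TransformerBee.mcp": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "--add-host=host.docker.internal:host-gateway",
        "-e",
        "TRANSFORMERBEE_HOST=http://host.docker.internal:5021",
        "ghcr.io/hochfrequenz/transformerbee.mcp:latest"
      ]
    }
  }
}
```

On Docker Desktop (Windows/macOS), `host.docker.internal` resolves out of the box and the `--add-host` flag is redundant but harmless; on Linux it is required.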