# Langfuse Prompt Management

Open-source tool for collaborative editing, versioning, evaluating, and releasing prompts.
## Langfuse Prompt Management MCP Server

Model Context Protocol (MCP) server for Langfuse Prompt Management. This server allows you to access and manage your Langfuse prompts through the Model Context Protocol.
## Demo

Quick demo of Langfuse Prompts MCP in Claude Desktop (unmute for voice-over explanations):
https://github.com/user-attachments/assets/61da79af-07c2-4f69-b28c-ca7c6e606405
## Features

### MCP Prompt

This server implements the MCP Prompts specification for prompt discovery and retrieval.
- `prompts/list`: List all available prompts
  - Optional cursor-based pagination
  - Returns prompt names and their required arguments. Limitation: all arguments are assumed to be optional and do not include descriptions, as variables have no specification in Langfuse
  - Includes a next cursor for pagination if there is more than one page of prompts
- `prompts/get`: Get a specific prompt
  - Transforms Langfuse prompts (text and chat) into MCP prompt objects
  - Compiles the prompt with the provided variables
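For illustration, the compilation step can be sketched as simple `{{variable}}` substitution, which is the placeholder syntax Langfuse prompts use. This is a hypothetical stand-in, not the server's actual code; `compilePrompt` is a made-up helper:

```typescript
// Hypothetical sketch of the "compile prompt with provided variables" step.
// Langfuse text prompts use {{variable}} placeholders; this stand-in helper
// substitutes provided arguments and leaves unknown placeholders untouched.
function compilePrompt(template: string, args: Record<string, string>): string {
  return template.replace(/\{\{\s*(\w+)\s*\}\}/g, (match, name) =>
    name in args ? args[name] : match
  );
}

const compiled = compilePrompt(
  "Summarize the following text for {{audience}}: {{text}}",
  { audience: "executives", text: "Q3 results..." }
);
console.log(compiled);
// → "Summarize the following text for executives: Q3 results..."
```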
### Tools

To increase compatibility with other MCP clients that do not support the prompt capability, the server also exports tools that replicate the functionality of the MCP Prompts.
- `get-prompts`: List available prompts
  - Optional `cursor` parameter for pagination
  - Returns a list of prompts with their arguments
- `get-prompt`: Retrieve and compile a specific prompt
  - Required `name` parameter: name of the prompt to retrieve
  - Optional `arguments` parameter: JSON object with prompt variables
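As a sketch of how a client without prompt support would invoke these tools, the JSON-RPC `tools/call` request for `get-prompt` might look like the following. The prompt name `movie-critic` and its variables are invented examples, and the exact shape of the nested `arguments` object is an assumption:

```typescript
// Sketch of the MCP JSON-RPC request a client sends to call the
// get-prompt tool. The prompt name and variables are invented examples.
const getPromptRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "get-prompt",                  // the tool to invoke
    arguments: {
      name: "movie-critic",              // required: Langfuse prompt name
      arguments: { movie: "Inception" }, // optional: prompt variables
    },
  },
};
console.log(JSON.stringify(getPromptRequest, null, 2));
```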
## Development

```bash
npm install

# build current file
npm run build

# test in mcp inspector
npx @modelcontextprotocol/inspector node ./build/index.js
```
## Usage

### Step 1: Build

```bash
npm install
npm run build
```

### Step 2: Add the server to your MCP servers
#### Claude Desktop

Configure Claude for Desktop by editing `claude_desktop_config.json`:
```json
{
  "mcpServers": {
    "langfuse": {
      "command": "node",
      "args": ["<absolute-path>/build/index.js"],
      "env": {
        "LANGFUSE_PUBLIC_KEY": "your-public-key",
        "LANGFUSE_SECRET_KEY": "your-secret-key",
        "LANGFUSE_BASEURL": "https://cloud.langfuse.com"
      }
    }
  }
}
```
Make sure to replace the environment variables with your actual Langfuse API keys. The server will now be available to use in Claude Desktop.
#### Cursor

Add a new server to Cursor:

- Name: `Langfuse Prompts`
- Type: `command`
- Command:

```bash
LANGFUSE_PUBLIC_KEY="your-public-key" LANGFUSE_SECRET_KEY="your-secret-key" LANGFUSE_BASEURL="https://cloud.langfuse.com" node absolute-path/build/index.js
```
## Limitations

The MCP server is a work in progress and has some limitations:

- Only prompts with a `production` label in Langfuse are returned
- All arguments are assumed to be optional and do not include descriptions, as variables have no specification in Langfuse
- List operations require fetching each prompt individually in the background to extract the arguments; this works but is not efficient
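The inefficiency in the last point follows from the lack of a variable specification: the only way to discover a prompt's arguments is to fetch its text and scan it for `{{variable}}` placeholders. A hypothetical sketch of that extraction step (`extractVariables` is a made-up helper, not the server's actual code):

```typescript
// Hypothetical sketch of extracting argument names from a fetched prompt.
// Each {{variable}} placeholder becomes one (optional, undescribed) argument,
// which is why every prompt must be fetched before the list of prompts
// and their arguments can be returned.
function extractVariables(template: string): string[] {
  const names = new Set<string>();
  for (const match of template.matchAll(/\{\{\s*(\w+)\s*\}\}/g)) {
    names.add(match[1]);
  }
  return [...names]; // deduplicated, in order of first appearance
}

console.log(extractVariables("Hello {{name}}, your order {{orderId}} ({{name}}) shipped."));
// → ["name", "orderId"]
```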
Contributions are welcome! Please open an issue or a PR (repo) if you have any suggestions or feedback.