# LynxPrompt-MCP

A tiny bridge that exposes any LynxPrompt instance as an MCP server, enabling LLMs to browse, search, and manage AI configuration blueprints and prompt hierarchies.
## What you get
| Type | What for | MCP URI / Tool id |
|---|---|---|
| Resources | Browse blueprints, hierarchies, and user info (read-only) | `lynxprompt://blueprints`, `lynxprompt://blueprint/{id}`, `lynxprompt://hierarchies`, `lynxprompt://hierarchy/{id}`, `lynxprompt://user` |
| Tools | Create, update, delete blueprints and manage hierarchies | `search_blueprints`, `create_blueprint`, `update_blueprint`, `delete_blueprint`, `create_hierarchy`, `delete_hierarchy` |
Everything is exposed over a single JSON-RPC endpoint (`/mcp`). LLMs and agents can `initialize` -> `readResource` -> `listTools` -> `callTool`, and so on.
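As an illustration, that request sequence can be sketched as plain JSON-RPC 2.0 payloads. This is a minimal sketch, not a full client: the method names (`tools/list`, `tools/call`, `resources/read`) follow the MCP specification's naming rather than the shorthand above, and the `protocolVersion` string and search query are assumptions.

```python
import json

def rpc(method, params, id_):
    """Build a JSON-RPC 2.0 request envelope as used by MCP."""
    return {"jsonrpc": "2.0", "id": id_, "method": method, "params": params}

# 1. Handshake: the client announces itself and a protocol version (assumed value).
init = rpc("initialize", {
    "protocolVersion": "2024-11-05",
    "clientInfo": {"name": "example-client", "version": "0.1"},
    "capabilities": {},
}, 1)

# 2. Read one of the read-only resources exposed by LynxPrompt-MCP.
read = rpc("resources/read", {"uri": "lynxprompt://blueprints"}, 2)

# 3. Discover the available tools, then call one (hypothetical search term).
list_tools = rpc("tools/list", {}, 3)
call = rpc("tools/call", {
    "name": "search_blueprints",
    "arguments": {"query": "python"},
}, 4)

for msg in (init, read, list_tools, call):
    print(json.dumps(msg))
```

Each payload would be POSTed to `/mcp`, reusing the session id returned by `initialize` in the `Mcp-Session-Id` header on subsequent calls.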
## Quick-start (Docker Compose)

```yaml
services:
  lynxprompt-mcp:
    image: drumsergio/lynxprompt-mcp:latest
    ports:
      - "127.0.0.1:8080:8080"
    environment:
      - LYNXPROMPT_URL=https://lynxprompt.com
      - LYNXPROMPT_TOKEN=lp_xxx
```
**Security note:** The HTTP transport listens on `127.0.0.1:8080` by default. If you need to expose it on a network, place it behind a reverse proxy with authentication.
## Install via npm (stdio transport)

```shell
npx lynxprompt-mcp
```

Or install globally:

```shell
npm install -g lynxprompt-mcp
lynxprompt-mcp
```
This downloads the pre-built Go binary from GitHub Releases for your platform and runs it with stdio transport. Requires at least one published release.
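For stdio clients, an entry in the client's MCP configuration might look like the following. This is a sketch: the `mcpServers` key is the Claude Desktop convention, the token is a placeholder, and `TRANSPORT=stdio` mirrors the configuration table below.

```json
{
  "mcpServers": {
    "lynxprompt": {
      "command": "npx",
      "args": ["lynxprompt-mcp"],
      "env": {
        "LYNXPROMPT_URL": "https://lynxprompt.com",
        "LYNXPROMPT_TOKEN": "lp_xxx",
        "TRANSPORT": "stdio"
      }
    }
  }
}
```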
## Local build

```shell
git clone https://github.com/GeiserX/lynxprompt-mcp
cd lynxprompt-mcp
# (optional) create .env from the sample
cp .env.example .env && $EDITOR .env
go run ./cmd/server
```
## Configuration

| Variable | Default | Description |
|---|---|---|
| `LYNXPROMPT_URL` | `https://lynxprompt.com` | LynxPrompt instance URL (without trailing `/`) |
| `LYNXPROMPT_TOKEN` | (required) | API token in `lp_xxx` format |
| `LISTEN_ADDR` | `127.0.0.1:8080` | HTTP listen address (Docker sets `0.0.0.0:8080`) |
| `TRANSPORT` | (empty = HTTP) | Set to `stdio` for stdio transport |

Put them in a `.env` file (from `.env.example`) or set them in the environment.
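For example, a minimal `.env` mirroring the table above (the token is a placeholder; the commented lines show the defaults):

```env
LYNXPROMPT_URL=https://lynxprompt.com
LYNXPROMPT_TOKEN=lp_xxx
# LISTEN_ADDR=127.0.0.1:8080
# TRANSPORT=stdio
```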
## Testing

Tested with the MCP Inspector; all resources and tools currently work. Before opening a PR, verify that the server still behaves correctly through the Inspector.
## Example configuration for client LLMs

```json
{
  "schema_version": "v1",
  "name_for_human": "LynxPrompt-MCP",
  "name_for_model": "lynxprompt_mcp",
  "description_for_human": "Browse, search, and manage AI configuration blueprints from LynxPrompt.",
  "description_for_model": "Interact with a LynxPrompt instance that stores AI configuration blueprints. First call initialize, then reuse the returned session id in header \"Mcp-Session-Id\" for every other call. Use readResource to fetch URIs that begin with lynxprompt://. Use listTools to discover available actions and callTool to execute them.",
  "auth": { "type": "none" },
  "api": {
    "type": "jsonrpc-mcp",
    "url": "http://localhost:8080/mcp",
    "init_method": "initialize",
    "session_header": "Mcp-Session-Id"
  },
  "logo_url": "https://lynxprompt.com/logo.png",
  "contact_email": "[email protected]",
  "legal_info_url": "https://github.com/GeiserX/lynxprompt-mcp/blob/main/LICENSE"
}
```
## Credits

- LynxPrompt -- AI configuration blueprint management
- MCP-GO -- modern MCP implementation
- GoReleaser -- painless multi-arch releases
## Maintainers
## Contributing

Feel free to dive in! Open an issue or submit PRs.

LynxPrompt-MCP follows the Contributor Covenant Code of Conduct.
## Other MCP Servers by GeiserX
- cashpilot-mcp — Passive income monitoring
- duplicacy-mcp — Backup health monitoring
- genieacs-mcp — TR-069 device management
- pumperly-mcp — Fuel and EV charging prices
- telegram-archive-mcp — Telegram message archive