mcp-server-llmling
An MCP server with an LLMling backend that uses YAML files to configure LLM applications.
LLMling Server Manual
Overview
mcp-server-llmling is a server for the Model Context Protocol (MCP) that provides a YAML-based configuration system for LLM applications.
LLMLing, the backend, supplies that configuration system and lets you set up custom MCP servers that serve content defined in YAML files.
- Static Declaration: Define your LLM's environment in YAML - no code required
- MCP Protocol: Built on the Model Context Protocol (MCP) for standardized LLM interaction
- Component Types:
- Resources: Content providers (files, text, CLI output, etc.)
- Prompts: Message templates with arguments
- Tools: Python functions callable by the LLM
The YAML configuration creates a complete environment that provides the LLM with:
- Access to content via resources
- Structured prompts for consistent interaction
- Tools for extending capabilities
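For example, a single YAML file can declare all three component types. The snippet below is a minimal sketch assembled from the resource and tool examples later in this manual; the mymodule.tools.greet import path is a placeholder, and prompts are elided here (see Prompt Management below).

resources:
  project_readme:
    type: path
    path: "./README.md"

tools:
  greet:
    import_path: "mymodule.tools.greet"
    description: "Greet a user by name"

prompts:
  # Prompt definitions...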
Key Features
1. Resource Management
- Load and manage different types of resources:
  - Text files (PathResource)
  - Raw text content (TextResource)
  - CLI command output (CLIResource)
  - Python source code (SourceResource)
  - Python callable results (CallableResource)
  - Images (ImageResource)
- Support for resource watching/hot-reload
- Resource processing pipelines
- URI-based resource access
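For instance, a watched path resource and a CLI-output resource might be declared along these lines. The watch block mirrors the Resource Configuration example later in this manual, while the type: cli entry and its command key are assumptions about LLMling's schema rather than confirmed keys, so check the LLMling documentation before relying on them.

resources:
  docs:
    type: path
    path: "./docs"
    watch:
      enabled: true
  git_status:
    type: cli            # assumed type/key names for a CLIResource
    command: "git status --short"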
2. Tool System
- Register and execute Python functions as LLM tools
- Support for OpenAPI-based tools
- Entry point-based tool discovery
- Tool validation and parameter checking
- Structured tool responses
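Since tools are referenced by import path (see Tool Configuration below), a tool is typically just a Python function. A hypothetical mymodule/tools.py backing the analyze_code example later in this manual might look like this; the assumption is that LLMling builds the tool's parameter schema from the signature, type hints, and docstring.

# mymodule/tools.py -- hypothetical module matching the import_path used below
import ast


def analyze_code(code: str) -> dict[str, int]:
    """Analyze Python code structure.

    Args:
        code: Python source code to analyze.
    """
    tree = ast.parse(code)
    return {
        "functions": sum(isinstance(node, ast.FunctionDef) for node in ast.walk(tree)),
        "classes": sum(isinstance(node, ast.ClassDef) for node in ast.walk(tree)),
        "lines": len(code.splitlines()),
    }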
3. Prompt Management
- Static prompts with template support
- Dynamic prompts from Python functions
- File-based prompts
- Prompt argument validation
- Completion suggestions for prompt arguments
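A static prompt with a validated argument could be declared roughly like this. The messages/arguments layout below is modelled on MCP prompt definitions and is an assumption rather than confirmed LLMling schema; consult the LLMling documentation for the exact keys.

prompts:
  code_review:
    description: "Review a piece of Python code"
    messages:
      - role: system
        content: "You are a careful Python reviewer."
      - role: user
        content: "Please review this code: {code}"
    arguments:
      - name: code
        description: "The code to review"
        required: true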
4. Multiple Transport Options
- Stdio-based communication (default)
- Server-Sent Events (SSE) / Streamable HTTP for web clients
- Support for custom transport implementations
Usage
With Zed Editor
Add LLMLing as a context server in your settings.json:
{
  "context_servers": {
    "llmling": {
      "command": {
        "env": {},
        "label": "llmling",
        "path": "uvx",
        "args": [
          "mcp-server-llmling",
          "start",
          "path/to/your/config.yml"
        ]
      },
      "settings": {}
    }
  }
}
With Claude Desktop
Configure LLMLing in your claude_desktop_config.json:
{
  "mcpServers": {
    "llmling": {
      "command": "uvx",
      "args": [
        "mcp-server-llmling",
        "start",
        "path/to/your/config.yml"
      ],
      "env": {}
    }
  }
}
Manual Server Start
Start the server directly from command line:
# Latest version
uvx mcp-server-llmling@latest start path/to/your/config.yml
1. Programmatic usage
import asyncio

from llmling import RuntimeConfig
from mcp_server_llmling import LLMLingServer

config = "path/to/your/config.yml"  # your LLMling YAML configuration


async def main() -> None:
    async with RuntimeConfig.open(config) as runtime:
        server = LLMLingServer(runtime, enable_injection=True)
        await server.start()


asyncio.run(main())
2. Using Custom Transport
import asyncio

from llmling import RuntimeConfig
from mcp_server_llmling import LLMLingServer

config = "path/to/your/config.yml"  # your LLMling YAML configuration


async def main() -> None:
    async with RuntimeConfig.open(config) as runtime:
        server = LLMLingServer(
            runtime,
            transport="sse",
            transport_options={
                "host": "localhost",
                "port": 3001,
                "cors_origins": ["http://localhost:3000"],
            },
        )
        await server.start()


asyncio.run(main())
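Once the server is running with the SSE transport, a client can connect over HTTP using the MCP Python SDK's SSE client. This sketch assumes the SDK's default /sse endpoint on the host and port configured above; the endpoint path is an assumption, not something the LLMling docs specify.

import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client


async def main() -> None:
    # Assumed endpoint: host/port from transport_options plus the SDK's default /sse path
    async with sse_client("http://localhost:3001/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])


asyncio.run(main())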
3. Resource Configuration
resources:
  python_code:
    type: path
    path: "./src/**/*.py"
    watch:
      enabled: true
      patterns:
        - "*.py"
        - "!**/__pycache__/**"

  api_docs:
    type: text
    content: |
      API Documentation
      ================
      ...
4. Tool Configuration
tools:
  analyze_code:
    import_path: "mymodule.tools.analyze_code"
    description: "Analyze Python code structure"

toolsets:
  api:
    type: openapi
    spec: "https://api.example.com/openapi.json"
[!TIP] For OpenAPI schemas, you can install Redocly CLI to bundle and resolve OpenAPI specifications before using them with LLMLing. This helps ensure your schema references are properly resolved and the specification is correctly formatted. If redocly is installed, it will be used automatically.
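To pre-bundle a spec manually, the Redocly CLI's bundle command resolves $ref references into a single file (file names here are placeholders):

npx @redocly/cli bundle api.yaml -o bundled.json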
Server Configuration
The server is configured through a YAML file with the following sections:
global_settings:
  timeout: 30
  max_retries: 3
  log_level: "INFO"
  requirements: []
  pip_index_url: null
  extra_paths: []

resources:
  # Resource definitions...

tools:
  # Tool definitions...

toolsets:
  # Toolset definitions...

prompts:
  # Prompt definitions...
MCP Protocol
The server implements the MCP protocol which supports:
- Resource Operations
  - List available resources
  - Read resource content
  - Watch for resource changes
- Tool Operations
  - List available tools
  - Execute tools with parameters
  - Get tool schemas
- Prompt Operations
  - List available prompts
  - Get formatted prompts
  - Get completions for prompt arguments
- Notifications
  - Resource changes
  - Tool/prompt list updates
  - Progress updates
  - Log messages
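To exercise these operations from the client side, the official MCP Python SDK can launch the server over stdio and list its resources, tools, and prompts. The command and config path below simply mirror the Claude Desktop configuration shown earlier; this is a generic MCP client sketch, not an LLMling-specific API.

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the server the same way the Claude Desktop config does
params = StdioServerParameters(
    command="uvx",
    args=["mcp-server-llmling", "start", "path/to/your/config.yml"],
)


async def main() -> None:
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            resources = await session.list_resources()  # Resource Operations
            tools = await session.list_tools()          # Tool Operations
            prompts = await session.list_prompts()      # Prompt Operations
            print(len(resources.resources), len(tools.tools), len(prompts.prompts))


asyncio.run(main())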