A clean (and possibly naive) implementation of the Model Context Protocol (MCP) for tool integration, configured with a single tools.yaml file.
Motivation: as few dependencies as possible, and a configuration that is as simple and auditable as possible.
MCP is early technology. Allowing LLMs to execute system commands is inherently risky. This implementation prioritizes auditability over features - you can read every line that processes LLM requests. Even so, proceed with caution. Only time will tell if MCP's approach is sound.
Tools defined in tools.yaml are exposed directly via MCP, not through meta-tools.

This workspace contains two crates:
- mcp-server - The MCP server that loads tools from tools.yaml and exposes them via the protocol
- mcp-client - A client library for testing and integration

Build and run:

```bash
# Build the server
cargo build --release --bin gamecode-mcp2
# Create a tools.yaml file (see examples/tools.yaml)
cp examples/tools.yaml .
# Run the server (it communicates via stdio)
./target/release/gamecode-mcp2
```
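As a quick sanity check, you can pipe a single JSON-RPC initialize request into the server over stdio. This is only a sketch: the exact protocolVersion string and capability fields depend on the MCP revision the server targets.

```bash
# Illustrative stdio smoke test: send one initialize request and read the reply.
echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"smoke-test","version":"0.1.0"}}}' \
  | ./target/release/gamecode-mcp2
```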
Add to your Claude Desktop configuration:
```json
{
  "mcpServers": {
    "gamecode": {
      "command": "/path/to/gamecode-mcp2"
    }
  }
}
```
The mcp-client crate can be used as a dependency in gamecode-cli for MCP integration.
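A minimal sketch of wiring that up, assuming the crate is consumed under its workspace name mcp-client from a sibling checkout; the relative path is hypothetical, so adjust it (or switch to a git dependency) to match your layout:

```toml
# Hypothetical Cargo.toml entry for gamecode-cli; the path below is illustrative.
[dependencies]
mcp-client = { path = "../gamecode-mcp2/mcp-client" }
```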
Tools are defined in tools.yaml:
```yaml
tools:
  - name: my_tool
    description: Description for the LLM
    command: /path/to/command  # or "internal" for built-in
    args:
      - name: param1
        description: Parameter description
        required: true
        type: string
        cli_flag: --param  # null for positional
    internal_handler: handler_name  # for internal tools
```
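To show how these fields combine in practice, here is a hypothetical entry wrapping one external command; the tool name, description, and argument are invented for this sketch, and examples/tools.yaml remains the authoritative reference:

```yaml
tools:
  - name: word_count
    description: Count lines, words, and bytes in a file
    command: /usr/bin/wc       # external command, not an internal handler
    args:
      - name: path
        description: File to inspect
        required: true
        type: string
        cli_flag: null         # passed as a positional argument
```

Under these assumptions, a call with arguments {"path": "/etc/hosts"} would run /usr/bin/wc /etc/hosts and return its output to the client.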
This implementation follows the MCP specification:
- initialize - Handshake with client
- tools/list - Returns all available tools
- tools/call - Execute a specific tool

Tools are exposed directly, not through meta-tools like "run".
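To make the wire format concrete, a tools/call exchange over the stdio transport looks roughly like this, one JSON-RPC 2.0 message per line; the tool name, argument, and result text are illustrative:

```json
{"jsonrpc": "2.0", "id": 2, "method": "tools/call", "params": {"name": "my_tool", "arguments": {"param1": "example value"}}}
{"jsonrpc": "2.0", "id": 2, "result": {"content": [{"type": "text", "text": "output of /path/to/command"}]}}
```

The first line is the client's request and the second is the server's response.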