The MCP Server provides a comprehensive framework for AI-powered command execution, plugin-based tools, and advanced features including synchronous script execution, secure Python evaluation, and knowledge management. It can be run as a standalone service or embedded in other projects to expose a consistent API for invoking tools and managing tasks.
The project uses uv for dependency management. Install dependencies with:
uv sync
Or install in development mode using pip:
pip install -e .
Configuration is controlled by .env files. Create one from the template and edit it with your settings:
cp config/templates/env.template .env
Important variables include repository paths (GIT_ROOT), Azure Repo details (AZREPO_ORG, AZREPO_PROJECT, AZREPO_REPO), and an optional PRIVATE_TOOL_ROOT for external tool configuration. The environment manager automatically loads .env files from the repository root, the current directory, and your home directory.
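A minimal .env sketch using the variables above (every value is a placeholder; substitute your own paths and Azure DevOps names):
# placeholder values – replace with your own settings
GIT_ROOT=/path/to/your/repos
AZREPO_ORG=your-azure-devops-org
AZREPO_PROJECT=your-project
AZREPO_REPO=your-repo
PRIVATE_TOOL_ROOT=/path/to/private/tool/configs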
Access settings in code via:
from config import env_manager
env_manager.load()
root = env_manager.get_git_root()
See docs/config_overview.md for more information.
After installing dependencies and configuring .env, start the server with:
uv run server/main.py
Connect to the SSE endpoint at http://0.0.0.0:8000/sse or use the additional routes in server/api.py.
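As a quick smoke test, you can stream the endpoint with curl (the exact events you see will depend on your configuration):
curl -N http://0.0.0.0:8000/sse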
Background job endpoints are documented in docs/background_jobs_api.md.
A Dockerfile is included for running the server in a container.
Build the image with:
docker build -t mcp-server .
Then start the container, exposing port 8000:
docker run -p 8000:8000 mcp-server
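If your settings live in a local .env file, you can also pass it into the container with Docker's standard --env-file flag (an optional variant of the command above):
docker run -p 8000:8000 --env-file .env mcp-server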
See docs/docker.md for more details.
The server loads prompts and tool definitions from YAML files:
server/prompts.yaml
server/tools.yaml
Private overrides can be placed in server/.private/ or in a folder pointed to by PRIVATE_TOOL_ROOT. Files are resolved in this order:
PRIVATE_TOOL_ROOT
server/.private/
server/
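The lookup order can be pictured with a short Python sketch (the function name and pathlib-based checks are illustrative, not the server's actual implementation):
import os
from pathlib import Path

def resolve_config(filename: str):
    """Return the first existing copy of filename, following the documented precedence."""
    candidates = []
    private_root = os.environ.get("PRIVATE_TOOL_ROOT")
    if private_root:
        candidates.append(Path(private_root) / filename)
    candidates.append(Path("server/.private") / filename)
    candidates.append(Path("server") / filename)
    for path in candidates:
        if path.exists():
            return path
    return None

resolve_config("tools.yaml")  # checks PRIVATE_TOOL_ROOT, then server/.private/, then server/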
Tools are modular plugins registered through mcp_tools, which also ships a set of built-in utilities. Additional plugins in the plugins/ directory include Azure DevOps integration, Git operations, knowledge indexing, Kusto queries, and CircleCI workflows. See mcp_tools/docs/creating_tools.md for details on building custom tools.
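The actual plugin interface is described in mcp_tools/docs/creating_tools.md; purely as an illustration of the general shape, a tool typically bundles a name, a description, and an execution entry point (this class is hypothetical and does not use the real mcp_tools API):
class EchoTool:
    """Hypothetical tool shape: a name, a description, and an execute entry point."""

    name = "echo"
    description = "Return the input text unchanged."

    def execute(self, arguments: dict) -> dict:
        return {"text": arguments.get("text", "")}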
The web interface offers comprehensive dashboards:
/tools – Browse all registered tools and view their details
/dataframes – Interactive DataFrame management and visualization
/knowledge – Knowledge graph exploration and management
/pyeval – Secure Python evaluation interface
External plugins can be installed by declaring them in plugin_config.yaml. Each entry should specify a plugin_repo in the form owner/repository and an optional sub_dir if the plugin lives in a subfolder. Example:
plugins:
- plugin_repo: "github_owner/repo"
sub_dir: "path/to/plugin"
type: "python"
Run the mcp_admin tool with the refresh_plugins operation to clone or update plugins based on this configuration. Pass force=true to remove all installed plugins before reinstalling.
Execute all test suites with:
scripts/run_tests.sh
Or run pytest directly on mcp_core/tests, mcp_tools/tests, or server/tests.
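For example, to run all three suites in one invocation (assuming uv manages the environment, as in the install step above):
uv run pytest mcp_core/tests mcp_tools/tests server/tests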
Read mcp_tools/docs/ and docs/ to learn about tool creation, dependency injection, and advanced features.
Explore server/ and try adding your own tools.
Check the plugins/ directory for concrete implementations.
Browse the utils/ directory for advanced utilities like vector stores, graph interfaces, and memory management.
See CHANGELOG.md for detailed release notes and recent updates.
Editors like Cursor and VS Code can use the SSE endpoint by adding the following to your settings:
{
"mcpServers": {
"mymcp-sse": { "url": "http://0.0.0.0:8000/sse" }
}
}