A lightweight Model Context Protocol (MCP) server that provides direct access to local documentation files - a simple alternative to complex RAG pipelines for project-specific context.
This MCP server reads a single markdown file (`context.md`) and exposes its contents through two simple tools:

- `get_context_overview()`: Lists all section titles
- `search_context(query)`: Searches content across all sections

Perfect for giving LLMs access to project documentation without the overhead of vector databases or embedding models.
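The real implementation lives in `mcp_context_server.py`; as a rough sketch of what the two tools need to do, assuming the server splits `context.md` on markdown headings (the heading-splitting details here are illustrative, not the repo's exact logic):

```python
import re

def parse_sections(markdown_text: str) -> dict[str, str]:
    """Split a markdown document into {title: body} on #/## headings."""
    sections: dict[str, list[str]] = {}
    current = None
    for line in markdown_text.splitlines():
        match = re.match(r"^#{1,2}\s+(.*)", line)
        if match:
            current = match.group(1).strip()
            sections[current] = []
        elif current is not None:
            sections[current].append(line)
    return {title: "\n".join(body).strip() for title, body in sections.items()}

def get_context_overview(sections: dict[str, str]) -> list[str]:
    """List all section titles."""
    return list(sections)

def search_context(sections: dict[str, str], query: str) -> dict[str, str]:
    """Return sections whose title or body contains the query (case-insensitive)."""
    q = query.lower()
    return {t: b for t, b in sections.items() if q in t.lower() or q in b.lower()}
```

In the actual server these functions would be registered as MCP tools and read `context.md` from disk.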
```bash
git clone https://github.com/unlock-mcp/mcp-docs-server.git
cd mcp-docs-server
pip install -r requirements.txt

# Create a context.md file in the project root with your documentation
# (printf, unlike bash's echo, reliably interprets \n escapes)
printf "# My Project Docs\n\nThis is my documentation.\n" > context.md
```
Use the shell wrapper for reliable execution:

```bash
chmod +x run_context_server.sh
mcp install ./run_context_server.sh --name "docs-server"
```
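The repository ships its own `run_context_server.sh`; a typical wrapper of this kind pins the working directory and interpreter so GUI clients, which launch servers with a minimal environment, can find both the server script and `context.md`. A hypothetical sketch (paths are illustrative):

```shell
#!/bin/bash
# Hypothetical wrapper -- the repo's actual script may differ.
# Resolve the repo directory so the server finds context.md
# no matter where the MCP client launches the process from.
cd "$(dirname "$0")" || exit 1

# Use an absolute interpreter path: GUI clients often lack your shell's PATH.
exec /opt/homebrew/bin/python3.11 mcp_context_server.py
```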
Add to your MCP configuration file:

```json
{
  "mcpServers": {
    "docs-server": {
      "timeout": 60,
      "type": "stdio",
      "command": "/opt/homebrew/bin/python3.11",
      "args": [
        "/path/to/mcp-docs-server/mcp_context_server.py"
      ],
      "env": {}
    }
  }
}
```

Note: Update the Python path to match your system (`which python3.11`).
Use the MCP development tools for easy testing:

```bash
mcp dev ./run_context_server.sh
```

This launches a web-based inspector for testing your server.
The server logs to stderr for debugging. Check your MCP client's logs if you encounter issues.
Common issues:

- Ensure `mcp[cli]>=1.2.0` is installed
- Ensure `context.md` exists in the project root

```
mcp-docs-server/
├── mcp_context_server.py   # Main server implementation
├── run_context_server.sh   # Shell wrapper for GUI clients
├── requirements.txt        # Python dependencies
├── context.md              # Your documentation (create this)
└── README.md               # This file
```
Once configured, you can call `get_context_overview()` and `search_context(query)` from your MCP client.
For a complete walkthrough of building this server from scratch, including common pitfalls and solutions, see the full tutorial: Ditching RAG: Building a Local MCP Server for Your Docs
Contributions welcome! Please feel free to submit issues and pull requests.
MIT License - see LICENSE file for details.
Built with ❤️ by the UnlockMCP team.