MCP-RAGNAR
A local MCP server implementing Retrieval-Augmented Generation (RAG) with sentence window retrieval and support for multiple file types.
Features
- Document indexing with support for multiple file types (txt, md, pdf, doc, docx)
- Sentence window retrieval for better context understanding
- Configurable embedding models (OpenAI or a local Hugging Face model, e.g. BAAI/bge-large-en-v1.5)
- MCP server integration for easy querying
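To illustrate the "sentence window" idea named in the features above, here is a minimal, self-contained sketch. It is not the server's actual code: the `sentence_window_retrieve` function and its word-overlap scoring are illustrative stand-ins (a real implementation compares embedding vectors), but the windowing step is the same: match a single sentence, then return its neighbours as extra context.

```python
def sentence_window_retrieve(sentences, query, window=1):
    """Return the best-matching sentence padded with `window` neighbours."""

    def score(sentence):
        # Toy relevance score: word overlap with the query.
        # A real implementation would compare embedding vectors instead.
        q = set(query.lower().split())
        s = set(sentence.lower().split())
        return len(q & s)

    # Find the single best-matching sentence...
    best = max(range(len(sentences)), key=lambda i: score(sentences[i]))
    # ...then widen the result to include surrounding sentences.
    lo = max(0, best - window)
    hi = min(len(sentences), best + window + 1)
    return " ".join(sentences[lo:hi])


docs = [
    "RAG systems index documents ahead of time.",
    "Sentence window retrieval matches a single sentence.",
    "The neighbours of that sentence are returned as extra context.",
    "This keeps embeddings precise while answers stay readable.",
]
print(sentence_window_retrieve(docs, "sentence window retrieval", window=1))
```

The point of the design is that small chunks (single sentences) embed precisely, while the window restores enough surrounding text for the LLM to understand the match.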
Requirements
- Python 3.10+
- UV package manager
Installation
- Clone the repository:
git clone <repository-url>
cd mcp-ragnar
- Install dependencies using UV:
uv pip install -e .
Usage
Indexing Documents
You can index documents either programmatically or via the command line.
Command line
python -m indexer.index /path/to/documents /path/to/index
# to change the default local embedding model and chunk size
python -m indexer.index /path/to/documents /path/to/index --chunk-size=512 --embed-model BAAI/bge-small-en-v1.5
# With OpenAI embedding endpoint (put your OPENAI_API_KEY in env)
python -m indexer.index /path/to/documents /path/to/index --embed-endpoint https://api.openai.com/v1 --embed-model text-embedding-3-small --tokenizer-model o200k_base
# Get help
python -m indexer.index --help
Running the MCP Server
Configuration
Settings can be supplied as environment variables or in a .env file:
- EMBED_ENDPOINT (optional): URL of an OpenAI-compatible embedding endpoint (ends with /v1). If not set, a local Hugging Face model is used by default.
- EMBED_MODEL (optional): Name of the embedding model to use. Defaults to BAAI/bge-large-en-v1.5.
- INDEX_ROOT: The root directory of the index, used by the retriever. Mandatory for MCP querying.
- MCP_DESCRIPTION: The exposed name and description for the MCP server, used for MCP querying only. Mandatory for MCP querying. For example: "RAG to my local personal documents"
In SSE mode, the server listens on http://localhost:8001/ragnar:
python server/sse.py
In stdio mode, install the server locally as a uv tool:
uv tool install .
Claude Desktop:
Update the following file:
On macOS: ~/Library/Application\ Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json
Example:
{
"mcpServers": {
"mcp-ragnar": {
"command": "uvx",
"args": [
"mcp-ragnar"
],
"env": {
"OPENAI_API_KEY": "",
"EMBED_ENDPOINT": "https://api.openai.com/v1",
"EMBED_MODEL": "text-embedding-3-small",
"MCP_DESCRIPTION": "My local Rust documentation",
"INDEX_ROOT": "/tmp/index"
}
}
}
}
License
GNU General Public License v3.0