# Ollama MCP Server

🚀 A bridge between Ollama and the Model Context Protocol (MCP), enabling seamless use of Ollama's local LLM capabilities from your MCP-powered applications.
## Features

- 🔄 Model Management
- 🤖 Model Execution
- 🛠 Server Control
## Installation

```shell
pnpm install
pnpm run build
```
## Configuration

Add the server to your MCP configuration file:

- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%/Claude/claude_desktop_config.json`
```json
{
  "mcpServers": {
    "ollama": {
      "command": "node",
      "args": ["/path/to/ollama-server/build/index.js"],
      "env": {
        "OLLAMA_HOST": "http://127.0.0.1:11434" // Optional: customize Ollama API endpoint
      }
    }
  }
}
```
## Usage

### Pull and run a model

```javascript
// Pull a model
await mcp.use_mcp_tool({
  server_name: "ollama",
  tool_name: "pull",
  arguments: {
    name: "llama2"
  }
});

// Run the model
await mcp.use_mcp_tool({
  server_name: "ollama",
  tool_name: "run",
  arguments: {
    name: "llama2",
    prompt: "Explain quantum computing in simple terms"
  }
});
```
### Chat completion

```javascript
await mcp.use_mcp_tool({
  server_name: "ollama",
  tool_name: "chat_completion",
  arguments: {
    model: "llama2",
    messages: [
      {
        role: "system",
        content: "You are a helpful assistant."
      },
      {
        role: "user",
        content: "What is the meaning of life?"
      }
    ],
    temperature: 0.7
  }
});
```
### Create a custom model

```javascript
await mcp.use_mcp_tool({
  server_name: "ollama",
  tool_name: "create",
  arguments: {
    name: "custom-model",
    modelfile: "./path/to/Modelfile"
  }
});
```
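The `modelfile` argument points at an Ollama Modelfile on disk. A minimal sketch of what such a file might contain (the base model and system prompt here are illustrative):

```modelfile
# Build on an existing local model
FROM llama2

# Bake in a default system prompt
SYSTEM "You are a concise assistant that answers in plain language."
```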
## Environment Variables

- `OLLAMA_HOST`: Configure a custom Ollama API endpoint (default: `http://127.0.0.1:11434`)

## Contributing

Contributions are welcome!
## License

MIT License - feel free to use in your own projects!

Built with ❤️ for the MCP ecosystem