mcp-server-ollama-bridge

Bridge to a local Ollama LLM server. Run Llama, Mistral, Qwen, and other local models through MCP.

Part of the HumoticaOS / SymbAIon ecosystem.

Installation

pip install mcp-server-ollama-bridge

Usage

With Claude Desktop

Add to your claude_desktop_config.json:

{
  "mcpServers": {
    "ollama": {
      "command": "mcp-server-ollama-bridge",
      "env": {
        "OLLAMA_HOST": "http://localhost:11434"
      }
    }
  }
}

With Docker

docker build -t mcp-server-ollama-bridge .
docker run -i -e OLLAMA_HOST=http://host.docker.internal:11434 mcp-server-ollama-bridge

Environment Variables

Variable       Default                    Description
OLLAMA_HOST    http://localhost:11434     Ollama server URL
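The bridge is configured through OLLAMA_HOST, which can also point at a remote Ollama instance. Before wiring it into an MCP client, you can confirm the configured host is reachable with a quick check against Ollama's /api/tags endpoint (which lists locally available models). The snippet below is only a sketch; it assumes the same OLLAMA_HOST variable and default port shown above.

# Sanity check: is the configured Ollama server reachable?
import json
import os
import urllib.request

host = os.environ.get("OLLAMA_HOST", "http://localhost:11434")

# /api/tags lists the models installed on the Ollama server
with urllib.request.urlopen(f"{host}/api/tags", timeout=5) as resp:
    data = json.load(resp)

print(f"Ollama reachable at {host}")
for model in data.get("models", []):
    print(" -", model.get("name"))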

Features

  • Connect MCP clients to a local Ollama LLM (see the client sketch below)
  • Support for all Ollama models
  • Streaming responses
  • Simple configuration
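
As a rough illustration of the first point, the sketch below uses the MCP Python SDK (the mcp package) to launch the bridge over stdio and list the tools it exposes. The tool names and arguments offered by this bridge are not documented here, so the example stops at discovery; treat it as a starting point rather than a reference client.

# Connect to the bridge over stdio and list the tools it exposes
# (illustrative sketch using the MCP Python SDK; no specific tool names are assumed)
import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the installed console script; merge with the current environment so PATH survives.
server = StdioServerParameters(
    command="mcp-server-ollama-bridge",
    env={**os.environ, "OLLAMA_HOST": "http://localhost:11434"},
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.list_tools()
            for tool in result.tools:
                print(f"{tool.name}: {tool.description}")

asyncio.run(main())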

Authors

License

MIT


One Love, One fAmIly!

Related Servers