mcp-server-ollama-bridge

Bridge to a local Ollama LLM server. Run Llama, Mistral, Qwen, and other local models through MCP.

Part of the HumoticaOS / SymbAIon ecosystem.

Installation

pip install mcp-server-ollama-bridge
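
Before wiring up the bridge, you can confirm that a local Ollama server is reachable by querying Ollama's /api/tags endpoint, which lists the models it has pulled. This is a minimal standalone check, not part of the package itself:

import json
import urllib.request

# List the models the local Ollama server has pulled.
# Assumes Ollama is running on its default port, 11434.
with urllib.request.urlopen("http://localhost:11434/api/tags") as resp:
    tags = json.load(resp)

print([model["name"] for model in tags.get("models", [])])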

Usage

With Claude Desktop

Add to your claude_desktop_config.json:

{
  "mcpServers": {
    "ollama": {
      "command": "mcp-server-ollama-bridge",
      "env": {
        "OLLAMA_HOST": "http://localhost:11434"
      }
    }
  }
}
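
The config file's location depends on your platform: on macOS it is typically ~/Library/Application Support/Claude/claude_desktop_config.json, and on Windows %APPDATA%\Claude\claude_desktop_config.json. Restart Claude Desktop after editing it.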

With Docker

docker build -t mcp-server-ollama-bridge .
docker run -i -e OLLAMA_HOST=http://host.docker.internal:11434 mcp-server-ollama-bridge
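
Note that host.docker.internal resolves to the host machine on Docker Desktop for macOS and Windows; on Linux you may need to pass --add-host=host.docker.internal:host-gateway to docker run, or point OLLAMA_HOST at the host's address directly.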

Environment Variables

Variable      Default                  Description
OLLAMA_HOST   http://localhost:11434   Ollama server URL
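
If you are scripting around the bridge, the same resolution order can be mirrored in Python: the environment variable wins, with the documented default as the fallback. A minimal sketch, not the package's actual code:

import os

# OLLAMA_HOST overrides the default Ollama address when set.
ollama_host = os.environ.get("OLLAMA_HOST", "http://localhost:11434")
print(ollama_host)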

Features

  • Connect MCP clients to a local Ollama LLM server
  • Support for all Ollama models
  • Streaming responses (see the sketch after this list)
  • Simple configuration
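
The streaming behavior mirrors Ollama's own HTTP API, which returns newline-delimited JSON chunks. The following sketch shows that underlying API directly rather than this package's source; it uses the third-party requests library, and the model name llama3 is only an example, so substitute any model you have pulled:

import json
import os

import requests  # third-party: pip install requests

host = os.environ.get("OLLAMA_HOST", "http://localhost:11434")

# POST /api/generate with "stream": True yields one JSON object per
# line, each carrying a "response" text fragment until "done" is true.
with requests.post(
    f"{host}/api/generate",
    json={"model": "llama3", "prompt": "Why is the sky blue?", "stream": True},
    stream=True,
) as resp:
    resp.raise_for_status()
    for line in resp.iter_lines():
        if not line:
            continue
        chunk = json.loads(line)
        print(chunk.get("response", ""), end="", flush=True)
        if chunk.get("done"):
            break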

Authors

License

MIT


One Love, One fAmIly!

Official Distribution

This package is officially distributed via PyPI.

Note: Third-party directories may list this package but are not official or verified distribution channels for Humotica software.
