Ollama MCP Server
A bridge to use local LLMs from Ollama within the Model Context Protocol.
A powerful bridge between Ollama and the Model Context Protocol (MCP), enabling seamless integration of Ollama's local LLM capabilities into your MCP-powered applications.
Features
Complete Ollama Integration
- Full API Coverage: Access all essential Ollama functionality through a clean MCP interface
- OpenAI-Compatible Chat: Drop-in replacement for OpenAI's chat completion API
- Local LLM Power: Run AI models locally with full control and privacy
Core Capabilities
Model Management
- Pull models from registries
- Push models to registries
- List available models
- Create custom models from Modelfiles
- Copy and remove models
Model Execution
- Run models with customizable prompts
- Chat completion API with system/user/assistant roles
- Configurable parameters (temperature, timeout)
- Raw mode support for direct responses
Server Control
- Start and manage Ollama server
- View detailed model information
- Error handling and timeout management
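All of these capabilities are exposed as MCP tools and called through your client's tool interface, as shown in the Usage Examples below. For instance, listing the locally available models might look like the following sketch; the tool name "list" is an assumption based on the capability list above, so confirm it against the server's advertised tools:
```typescript
// Minimal sketch: list the models available locally through the MCP tool interface.
// The tool name "list" is an assumption based on the capability list above; check
// the server's advertised tools for the exact name and input schema.
const models = await mcp.use_mcp_tool({
  server_name: "ollama",
  tool_name: "list",
  arguments: {}
});
console.log(models); // e.g. installed model names, sizes, and digests
```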
Getting Started
Prerequisites
- Ollama installed on your system
- Node.js and npm/pnpm
Installation
- Install dependencies:
```bash
pnpm install
```
- Build the server:
```bash
pnpm run build
```
Configuration
Add the server to your MCP configuration:
For Claude Desktop:
- macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
- Windows: %APPDATA%/Claude/claude_desktop_config.json
```json
{
  "mcpServers": {
    "ollama": {
      "command": "node",
      "args": ["/path/to/ollama-server/build/index.js"],
      "env": {
        "OLLAMA_HOST": "http://127.0.0.1:11434"
      }
    }
  }
}
```
The OLLAMA_HOST entry is optional and defaults to http://127.0.0.1:11434. Note that claude_desktop_config.json must be plain JSON, so comments are not allowed in the file itself.
Usage Examples
Pull and Run a Model
```typescript
// Pull a model
await mcp.use_mcp_tool({
  server_name: "ollama",
  tool_name: "pull",
  arguments: {
    name: "llama2"
  }
});

// Run the model
await mcp.use_mcp_tool({
  server_name: "ollama",
  tool_name: "run",
  arguments: {
    name: "llama2",
    prompt: "Explain quantum computing in simple terms"
  }
});
```
Chat Completion (OpenAI-compatible)
```typescript
await mcp.use_mcp_tool({
  server_name: "ollama",
  tool_name: "chat_completion",
  arguments: {
    model: "llama2",
    messages: [
      {
        role: "system",
        content: "You are a helpful assistant."
      },
      {
        role: "user",
        content: "What is the meaning of life?"
      }
    ],
    temperature: 0.7
  }
});
```
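For multi-turn conversations, the usual OpenAI-style pattern applies: append the assistant's reply to the messages array and resend the full history on the next call. A minimal sketch follows; how the assistant's text is extracted from the tool result depends on your MCP client, so the extraction step below is illustrative only:
```typescript
// Illustrative multi-turn follow-up. The shape of the tool result depends on
// your MCP client; adjust how the assistant's text is extracted accordingly.
const history = [
  { role: "system", content: "You are a helpful assistant." },
  { role: "user", content: "What is the meaning of life?" }
];

const reply = await mcp.use_mcp_tool({
  server_name: "ollama",
  tool_name: "chat_completion",
  arguments: { model: "llama2", messages: history, temperature: 0.7 }
});

// Assumption: the reply is usable as a string here.
history.push({ role: "assistant", content: String(reply) });
history.push({ role: "user", content: "Can you give a shorter answer?" });

await mcp.use_mcp_tool({
  server_name: "ollama",
  tool_name: "chat_completion",
  arguments: { model: "llama2", messages: history, temperature: 0.7 }
});
```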
Create Custom Model
```typescript
await mcp.use_mcp_tool({
  server_name: "ollama",
  tool_name: "create",
  arguments: {
    name: "custom-model",
    modelfile: "./path/to/Modelfile"
  }
});
```
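The copy and remove operations listed under Model Management follow the same calling pattern. The tool names and argument keys below are assumptions modelled on the examples above, not confirmed names; check the server's advertised tools before relying on them:
```typescript
// Assumed tool names ("copy", "remove") and argument keys; verify them against
// the server's actual tool listing before use.
await mcp.use_mcp_tool({
  server_name: "ollama",
  tool_name: "copy",
  arguments: {
    source: "custom-model",
    destination: "custom-model-backup"
  }
});

await mcp.use_mcp_tool({
  server_name: "ollama",
  tool_name: "remove",
  arguments: {
    name: "custom-model-backup"
  }
});
```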
Advanced Configuration
- OLLAMA_HOST: Configure a custom Ollama API endpoint (default: http://127.0.0.1:11434)
- Timeout settings for model execution (default: 60 seconds)
- Temperature control for response randomness (0-2 range)
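As a rough sketch of combining these settings in a single call, the example below passes both overrides to the run tool. The parameter names (timeout, temperature) and the timeout unit are assumptions inferred from the option list above; check the tool's input schema for the exact keys:
```typescript
// Assumed parameter names "timeout" and "temperature"; confirm the exact keys
// and the timeout unit (seconds vs. milliseconds) in the tool's input schema.
await mcp.use_mcp_tool({
  server_name: "ollama",
  tool_name: "run",
  arguments: {
    name: "llama2",
    prompt: "Summarize the plot of Hamlet in three sentences.",
    temperature: 0.2, // lower randomness for a focused summary
    timeout: 120      // allow more time than the 60-second default
  }
});
```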
Contributing
Contributions are welcome! Feel free to:
- Report bugs
- Suggest new features
- Submit pull requests
License
MIT License - feel free to use in your own projects!
Built with ❤️ for the MCP ecosystem