An MCP (Model Context Protocol) server for Ollama that enables seamless integration between Ollama's local LLM models and MCP-compatible applications such as Claude Desktop. Requires a local Ollama installation.
Install globally via npm:
npm install -g @rawveg/ollama-mcp
To install the Ollama MCP Server in other MCP-compatible applications (like Cline or Claude Desktop), add the following configuration to your application's MCP settings file:
{
  "mcpServers": {
    "@rawveg/ollama-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "@rawveg/ollama-mcp"
      ]
    }
  }
}
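If the package is installed globally (as shown above), the configuration can also invoke the binary directly rather than going through npx. This variant is an assumption based on the `ollama-mcp` command the package provides; verify it against your application's MCP settings format:

```json
{
  "mcpServers": {
    "@rawveg/ollama-mcp": {
      "command": "ollama-mcp"
    }
  }
}
```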
The settings file location varies by application:

- claude_desktop_config.json in the Claude app data directory
- cline_mcp_settings.json in the VS Code global storage

To start the server, simply run:
ollama-mcp
The server will start on port 3456 by default. You can specify a different port using the PORT environment variable:
PORT=3457 ollama-mcp
PORT: Server port (default: 3456). Can be used when running directly:
# When running directly
PORT=3457 ollama-mcp
OLLAMA_API: Ollama API endpoint (default: http://localhost:11434)

The server exposes the following HTTP endpoints:

- GET /models - List available models
- POST /models/pull - Pull a new model
- POST /chat - Chat with a model
- GET /models/:name - Get model details

To build and run from source:

git clone https://github.com/rawveg/ollama-mcp.git
cd ollama-mcp
npm install
npm run build
npm start
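As a sketch of how a client might call the HTTP endpoints above, the helper below builds a request for POST /chat against the default port. The request-body shape ({ model, messages }) and the model name are assumptions for illustration; the actual schema is not documented here, so check the server source before relying on it.

```typescript
// Hypothetical helper for calling the server's POST /chat endpoint.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface ChatRequest {
  url: string;
  method: string;
  headers: Record<string, string>;
  body: string;
}

function buildChatRequest(
  model: string,
  messages: ChatMessage[],
  baseUrl = "http://localhost:3456" // default port per the docs above
): ChatRequest {
  return {
    url: `${baseUrl}/chat`,
    method: "POST",
    headers: { "Content-Type": "application/json" },
    // Payload shape is an assumption, not a documented schema.
    body: JSON.stringify({ model, messages }),
  };
}

const req = buildChatRequest("llama3", [{ role: "user", content: "Hello" }]);
console.log(req.url);    // http://localhost:3456/chat
console.log(req.method); // POST
```

The request could then be sent with fetch(req.url, { method: req.method, headers: req.headers, body: req.body }).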
Contributions are welcome! Please feel free to submit a Pull Request.
However, this does not grant permission to incorporate this project into third-party services or commercial platforms without prior discussion and agreement. While I previously accepted contributions (such as a Dockerfile and related README updates) to support integration with services like Smithery, recent actions by a similar service — Glama — have required a reassessment of this policy.
Glama has chosen to include open-source MCP projects in their commercial offering without notice or consent, and subsequently created issue requests asking maintainers to perform unpaid work to ensure compatibility with their platform. This behaviour — leveraging community labour for profit without dialogue or compensation — is not only inconsiderate, but ethically problematic.
As a result, and to protect the integrity of this project and its contributors, the licence has been updated to the GNU Affero General Public License v3.0 (AGPL-3.0). This change ensures that any use of the software — particularly in commercial or service-based platforms — must either remain fully compliant with the AGPL's terms or obtain a separate commercial licence. Merely linking to the original source is not sufficient where the project is being actively monetised. If you wish to include this project in a commercial offering, please get in touch first to discuss licensing terms.
Licence: AGPL-3.0
This project was previously MIT-licensed. As of 20th April 2025, it is now licensed under AGPL-3.0 to prevent unauthorised commercial exploitation. If your use of this project predates this change, please refer to the relevant Git tag or commit for the applicable licence.