An MCP (Model Context Protocol) server for Ollama that enables seamless integration between Ollama's local LLM models and MCP-compatible applications such as Claude Desktop. A local Ollama installation is required.
Install globally via npm:
npm install -g @rawveg/ollama-mcp
To install the Ollama MCP Server in other MCP-compatible applications (like Cline or Claude Desktop), add the following configuration to your application's MCP settings file:
{
  "mcpServers": {
    "@rawveg/ollama-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "@rawveg/ollama-mcp"
      ]
    }
  }
}
The settings file location varies by application:

- Claude Desktop: claude_desktop_config.json in the Claude app data directory
- Cline: cline_mcp_settings.json in the VS Code global storage

To run the server standalone, simply run:

ollama-mcp
The server will start on port 3456 by default. You can specify a different port using the PORT environment variable:
PORT=3457 ollama-mcp
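When the server is launched by an MCP client rather than from a shell, the same variable can usually be passed through the client's settings file instead. The sketch below assumes your client's MCP settings format supports a per-server "env" field; check your client's documentation before relying on it:

```json
{
  "mcpServers": {
    "@rawveg/ollama-mcp": {
      "command": "npx",
      "args": ["-y", "@rawveg/ollama-mcp"],
      "env": {
        "PORT": "3457"
      }
    }
  }
}
```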
Supported environment variables:

- PORT: Server port (default: 3456). Can be used when running directly:

  # When running directly
  PORT=3457 ollama-mcp

- OLLAMA_API: Ollama API endpoint (default: http://localhost:11434)

The server exposes the following API endpoints:

- GET /models - List available models
- POST /models/pull - Pull a new model
- POST /chat - Chat with a model
- GET /models/:name - Get model details
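For example, with the server running on its default port, these endpoints can be called over plain HTTP. The following is a minimal sketch using Node 18+'s built-in fetch; the /chat request body shape (model and messages fields) and the model name are illustrative assumptions, not the documented schema:

```typescript
// Minimal sketch of calling the Ollama MCP server's HTTP endpoints.
// Assumes the default port (3456); the /chat payload shape and the
// model name below are illustrative assumptions, not the documented schema.
const BASE_URL = "http://localhost:3456";

async function listModels(): Promise<void> {
  const res = await fetch(`${BASE_URL}/models`);
  console.log("Available models:", await res.json());
}

async function chat(): Promise<void> {
  const res = await fetch(`${BASE_URL}/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3.2", // hypothetical model name already pulled into local Ollama
      messages: [{ role: "user", content: "Hello!" }],
    }),
  });
  console.log("Chat response:", await res.json());
}

listModels().then(chat).catch(console.error);
```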
To build and run from source:

git clone https://github.com/rawveg/ollama-mcp.git
cd ollama-mcp
npm install
npm run build
npm start
Contributions are welcome! Please feel free to submit a Pull Request.
However, this does not grant permission to incorporate this project into third-party services or commercial platforms without prior discussion and agreement. While I previously accepted contributions (such as a Dockerfile and related README updates) to support integration with services like Smithery, recent actions by a similar service — Glama — have required a reassessment of this policy.
Glama has chosen to include open-source MCP projects in their commercial offering without notice or consent, and subsequently created issue requests asking maintainers to perform unpaid work to ensure compatibility with their platform. This behaviour — leveraging community labour for profit without dialogue or compensation — is not only inconsiderate, but ethically problematic.
As a result, and to protect the integrity of this project and its contributors, the licence has been updated to the GNU Affero General Public License v3.0 (AGPL-3.0). This change ensures that any use of the software — particularly in commercial or service-based platforms — must remain fully compliant with the AGPL's terms, or otherwise obtain a separate commercial licence. Merely linking to the original source is not sufficient where the project is being actively monetised. If you wish to include this project in a commercial offering, please get in touch first to discuss licensing terms.
AGPL v3.0
This project was previously MIT-licensed. As of 20th April 2025, it is now licensed under AGPL-3.0 to prevent unauthorised commercial exploitation. If your use of this project predates this change, please refer to the relevant Git tag or commit for the applicable licence.