# MCP Ollama Agent

A TypeScript agent that integrates Model Context Protocol (MCP) servers with Ollama, allowing AI models to interact with various tools through a unified interface.
## ✨ Features
- Supports multiple MCP servers (both uvx and npx tested)
- Built-in support for file system operations and web research
- Easy configuration through `mcp-config.json`, similar to `claude_desktop_config.json`
- Interactive chat interface with Ollama integration that should support any tools
- Standalone demo mode for testing web and filesystem tools without an LLM
## 🚀 Getting Started
1. **Prerequisites:**

   - Node.js (version 18 or higher)
   - Ollama installed and running

2. **Install the MCP tools you want to use globally:**

   ```bash
   # For filesystem operations
   npm install -g @modelcontextprotocol/server-filesystem

   # For web research
   npm install -g @mzxrai/mcp-webresearch
   ```

3. **Clone and install:**

   ```bash
   git clone https://github.com/ausboss/mcp-ollama-agent.git
   cd mcp-ollama-agent
   npm install
   ```

4. **Configure your tools and a tool-capable Ollama model in `mcp-config.json`:**

   ```json
   {
     "mcpServers": {
       "filesystem": {
         "command": "npx",
         "args": ["@modelcontextprotocol/server-filesystem", "./"]
       },
       "webresearch": {
         "command": "npx",
         "args": ["-y", "@mzxrai/mcp-webresearch"]
       }
     },
     "ollama": {
       "host": "http://localhost:11434",
       "model": "qwen2.5:latest"
     }
   }
   ```

5. **Run the demo to test the filesystem and webresearch tools without an LLM:**

   ```bash
   npx tsx ./src/demo.ts
   ```

   Or start the chat interface with Ollama:

   ```bash
   npm start
   ```
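If you want to reuse the same configuration in your own scripts, it can be loaded and sanity-checked with Node's standard library alone. A minimal sketch (the `loadConfig` helper and its types are hypothetical, not part of this repo; the field names match the sample config above):

```typescript
import { readFileSync } from "node:fs";

interface ServerConfig {
  command: string; // executable to launch the server, e.g. "npx" or "uvx"
  args: string[];  // arguments passed to the server process
}

interface McpConfig {
  mcpServers: Record<string, ServerConfig>;
  ollama: { host: string; model: string };
}

// Hypothetical helper: parse mcp-config.json and fail fast on an empty server list
function loadConfig(path: string): McpConfig {
  const config = JSON.parse(readFileSync(path, "utf-8")) as McpConfig;
  if (!config.mcpServers || Object.keys(config.mcpServers).length === 0) {
    throw new Error("mcp-config.json must define at least one MCP server");
  }
  return config;
}
```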
## ⚙️ Configuration
- **MCP Servers**: Add any MCP-compatible server to the `mcpServers` section
- **Ollama**: Configure the host and model (the model must support function calling)
- Supports both Python (uvx) and Node.js (npx) MCP servers
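A Python-based server entry looks the same, with `uvx` as the command. For example, a sketch assuming the `mcp-server-fetch` package (swap in whichever uvx-runnable server you actually use):

```json
{
  "mcpServers": {
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    }
  }
}
```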
## 💡 Example Usage
This example uses the model `qwen2.5:latest`:
```
Chat started. Type "exit" to end the conversation.
You: can you use your list directory tool to see whats in test-directory then use your read file tool to read it to me?
Model is using tools to help answer...
Using tool: list_directory
With arguments: { path: 'test-directory' }
Tool result: [ { type: 'text', text: '[FILE] test.txt' } ]
Assistant:
Model is using tools to help answer...
Using tool: read_file
With arguments: { path: 'test-directory/test.txt' }
Tool result: [ { type: 'text', text: 'rosebud' } ]
Assistant: The content of the file `test.txt` in the `test-directory` is:
rosebud
You: thanks
Assistant: You're welcome! If you have any other requests or need further assistance, feel free to ask.
```
### System Prompts
Some local models may need help with tool selection. Customize the system prompt in `ChatManager.ts` to improve tool usage.
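For instance, a prompt that nudges a smaller model toward the right tool might read as follows. The wiring into `ChatManager.ts` is project-specific; this string is only an illustration, with tool names matching the filesystem and webresearch servers from the sample config:

```typescript
// Illustrative system prompt; tool names match the filesystem/webresearch servers
const SYSTEM_PROMPT = `You are a helpful assistant with access to tools.
For questions about local files, use the filesystem tools (list_directory, read_file).
For questions that need current information, use the webresearch tools.
Call a tool only when it is needed; otherwise answer directly.`;
```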
## 🤝 Contributing
Contributions welcome! Feel free to submit issues or pull requests.