Dive AI Agent 🤿 🤖
Dive is an open-source MCP Host Desktop Application that seamlessly integrates with any LLMs supporting function calling capabilities. ✨
Features 🎯
- 🌐 Universal LLM Support: Compatible with ChatGPT, Anthropic, Ollama, and OpenAI-compatible models
- 💻 Cross-Platform: Available for Windows, macOS, and Linux
- 🔄 Model Context Protocol: Enabling seamless MCP AI agent integration on both stdio and SSE mode
- ☁️ OAP Cloud Integration: One-click access to managed MCP servers via OAPHub.ai - eliminates complex local deployments
- 🏗️ Dual Architecture: Modern Tauri version alongside traditional Electron version for optimal performance
- 🌍 Multi-Language Support: Traditional Chinese, Simplified Chinese, English, Spanish, Japanese, Korean with more coming soon
- ⚙️ Advanced API Management: Multiple API keys and model switching support via `model_settings.json`
- 🛠️ Granular Tool Control: Enable/disable individual MCP tools for precise customization
- 💡 Custom Instructions: Personalized system prompts for tailored AI behavior
- 🔄 Auto-Update Mechanism: Automatically checks for and installs the latest application updates
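The Advanced API Management feature above refers to `model_settings.json`. As a purely hypothetical illustration of what a multi-provider setup might look like (the field names below are invented for this sketch — consult Dive's documentation for the actual schema):

```json
{
  "activeProvider": "openai",
  "configs": {
    "openai": {
      "apiKey": "YOUR_API_KEY",
      "model": "gpt-4o"
    },
    "ollama": {
      "baseURL": "http://localhost:11434",
      "model": "llama3"
    }
  }
}
```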
Recent Updates (2025/9/24) - v0.9.8 🎉
Latest Improvements
- 📦 Updated MCP Host: Bumped MCP host version for enhanced functionality
- 🔧 Model Configuration Fixes: Fixed Anthropic model list in Tauri and corrected provider settings in advanced configuration
- 📊 Improved Model Management: Added priority sorting for OAP models and sorted LLM models by creation time
- 📈 Enhanced Download Progress: Improved download progress calculation with better accuracy
- 💾 Better Model Persistence: Enhanced model selection persistence across sessions
- 🎨 UI/UX Refinements: Various interface adjustments for smoother user experience
Previous Major Changes (v0.9.5)
- 🏗️ Dual Architecture Support: Both Electron and Tauri frameworks supported simultaneously
- ⚡ Tauri Version: Modern architecture with optimized installer size (Windows < 30MB)
- 🌐 OAP Platform Integration: Native support for OAPHub.ai cloud services
- 🔐 OAP Authentication: Comprehensive login and authentication support
- 🛠️ Granular MCP Control: Individual tool enable/disable functionality
- 🐧 Linux Tauri Support: Full Tauri framework support on Linux platforms
Platform Availability
- Windows: Available in both Electron and Tauri versions ✅
- macOS: Currently Electron only 🔜
- Linux: Available in both Electron and Tauri versions ✅
Migration Note: Existing local MCP/LLM configurations remain fully supported. OAP integration is additive and does not affect current workflows.
Download and Install ⬇️
Get the latest version of Dive:
Windows users: 🪟
Choose between two architectures:
- Tauri Version (Recommended): Smaller installer (<30MB), modern architecture
- Electron Version: Traditional architecture, fully stable
- Python and Node.js environments will be downloaded automatically after launching
macOS users: 🍎
- Electron Version: Download the .dmg version
- You need to install the Python and Node.js environments (including the `npx` and `uvx` commands) yourself
- Follow the installation prompts to complete setup
Linux users: 🐧
Choose between two architectures:
- Tauri Version (Recommended): Modern architecture with smaller installer size
- Electron Version: Traditional architecture with .AppImage format
- You need to install the Python and Node.js environments (including the `npx` and `uvx` commands) yourself
- For Ubuntu/Debian users:
  - You may need to add the `--no-sandbox` parameter, or modify your system settings to allow sandboxing
  - Run `chmod +x` on the downloaded file to make the AppImage executable
- For Arch users:
  - You can install Dive using an AUR helper, for example: `paru -S dive-ai`
MCP Setup Options
Dive offers two ways to access MCP tools: Local MCP Servers (for advanced users who want full control) and OAP Cloud Services (the recommended starting point for beginners).
Option 1: Local MCP Servers 🛠️
For advanced users who prefer local control. The system comes with a default echo MCP Server, and you can add more powerful tools like Fetch and Youtube-dl.
Option 2: OAP Cloud Services ☁️
The easiest way to get started! Access enterprise-grade MCP tools instantly:
- Sign up at OAPHub.ai
- Connect to Dive using one-click deep links or configuration files
- Enjoy managed MCP servers with zero setup - no Python, Docker, or complex dependencies required
Benefits:
- ✅ Zero configuration needed
- ✅ Cross-platform compatibility
- ✅ Enterprise-grade reliability
- ✅ Automatic updates and maintenance
Quick Local Setup
Add this JSON configuration to your Dive MCP settings to enable local tools:
{
  "mcpServers": {
    "fetch": {
      "command": "uvx",
      "args": [
        "mcp-server-fetch",
        "--ignore-robots-txt"
      ],
      "enabled": true
    },
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/path/to/allowed/files"
      ],
      "enabled": true
    },
    "youtubedl": {
      "command": "npx",
      "args": [
        "@kevinwatt/yt-dlp-mcp"
      ],
      "enabled": true
    }
  }
}
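Before pasting a configuration like this into Dive, it can help to confirm it parses as valid JSON and to see which servers it enables. A minimal sketch (not part of Dive itself), using just the fetch entry:

```python
import json

# A trimmed copy of the configuration above (the other entries have the same shape).
config_text = '''
{
  "mcpServers": {
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch", "--ignore-robots-txt"],
      "enabled": true
    }
  }
}
'''

# json.loads raises ValueError on malformed JSON, e.g. a missing outer brace.
config = json.loads(config_text)

# Collect the server names that are switched on.
enabled = [name for name, server in config["mcpServers"].items() if server.get("enabled")]
print(enabled)
```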
Using Streamable HTTP for Cloud MCP Services
You can connect to external cloud MCP servers via Streamable HTTP transport. Here's the Dive configuration example for SearXNG service from OAPHub:
{
"mcpServers": {
"SearXNG_MCP_Server": {
"transport": "streamable",
"url": "https://proxy.oaphub.ai/v1/mcp/181672830075666436",
"headers": {
"Authorization": "GLOBAL_CLIENT_TOKEN"
}
}
}
}
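Under the hood, a Streamable HTTP client POSTs JSON-RPC messages to the configured `url` with the configured `headers`. The sketch below shows roughly what the opening `initialize` request would look like; the JSON-RPC framing follows the MCP specification, but this is an illustration, not Dive's actual client code:

```python
import json

def build_mcp_request(url: str, token: str):
    """Build the pieces of an HTTP POST for an MCP 'initialize' call."""
    headers = {
        "Content-Type": "application/json",
        # A Streamable HTTP server may answer with plain JSON or an event stream.
        "Accept": "application/json, text/event-stream",
        "Authorization": token,  # the value from the "headers" block in the config above
    }
    payload = {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "initialize",
        "params": {
            "protocolVersion": "2025-03-26",  # one MCP spec revision; newer ones exist
            "capabilities": {},
            "clientInfo": {"name": "example-client", "version": "0.1"},
        },
    }
    return url, headers, json.dumps(payload)

url, headers, body = build_mcp_request(
    "https://proxy.oaphub.ai/v1/mcp/181672830075666436", "GLOBAL_CLIENT_TOKEN"
)
```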
Reference: https://oaphub.ai/mcp/181672830075666436
Using SSE Server (Non-Local MCP)
You can also connect to external MCP servers (not local ones) via SSE (Server-Sent Events). Add this configuration to your Dive MCP settings:
{
"mcpServers": {
"MCP_SERVER_NAME": {
"enabled": true,
"transport": "sse",
"url": "YOUR_SSE_SERVER_URL"
}
}
}
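An SSE server delivers messages as a `text/event-stream` response, where each event's `data:` lines carry one JSON-RPC payload. A minimal parser sketch of that wire format (illustrative only, not Dive's implementation):

```python
import json

def parse_sse_events(stream_text: str) -> list[dict]:
    """Parse a text/event-stream body into the JSON payloads of its events.

    Per the SSE format, events are separated by blank lines and each
    'data:' line contributes one line of the event's payload.
    """
    events = []
    data_lines = []
    for line in stream_text.splitlines():
        if line.startswith("data:"):
            data_lines.append(line[len("data:"):].strip())
        elif line == "" and data_lines:
            events.append(json.loads("\n".join(data_lines)))
            data_lines = []
    if data_lines:  # stream ended without a trailing blank line
        events.append(json.loads("\n".join(data_lines)))
    return events

sample = 'data: {"jsonrpc": "2.0", "id": 1, "result": {}}\n\n'
events = parse_sse_events(sample)
```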
Additional Setup for yt-dlp-mcp
yt-dlp-mcp requires the yt-dlp package. Install it based on your operating system:
Windows
winget install yt-dlp
MacOS
brew install yt-dlp
Linux
pip install yt-dlp
Build 🛠️
See BUILD.md for more details.
Connect With Us 🌐
- 💬 Join our Discord
- 🐦 Follow us on Twitter/X and join the discussion on our Reddit thread
- ⭐ Star us on GitHub
- 🐛 Report issues on our Issue Tracker