Dive AI Agent
An open-source MCP Host desktop application that seamlessly integrates with any LLM supporting function-calling capabilities. ✨

Features 🎯
- 🌐 Universal LLM Support: Compatible with ChatGPT, Anthropic, Ollama and OpenAI-compatible models
- 💻 Cross-Platform: Available for Windows, MacOS, and Linux
- 🔄 Model Context Protocol: Enabling seamless MCP AI agent integration on both stdio and SSE mode
- ☁️ OAP Cloud Integration: One-click access to managed MCP servers via OAPHub.ai - eliminates complex local deployments
- 🏗️ Dual Architecture: Modern Tauri version alongside traditional Electron version for optimal performance
- 🌍 Multi-Language Support: Supports 24+ languages including English, Traditional Chinese, Simplified Chinese, Spanish, Japanese, Korean, German, French, Italian, Portuguese, Russian, Thai, Vietnamese, Filipino, Indonesian, Polish, Turkish, Ukrainian, Swedish, Norwegian, Finnish, and Lao
- ⚙️ Advanced API Management: Multiple API keys and model switching, configured via `model_settings.json`
- 🛠️ Granular Tool Control: Enable/disable individual MCP tools for precise customization
- 💡 Custom Instructions: Personalized system prompts for tailored AI behavior
- ⌨️ Keyboard Shortcuts: Comprehensive hotkey support for efficient navigation and operations (rename, settings, reload, new chat, etc.)
- 📝 Chat Draft Saving: Automatically saves chat input drafts to prevent data loss
- 🔄 Auto-Update Mechanism: Automatically checks for and installs the latest application updates
- 🔐 MCP Server Authentication: Authenticate against MCP servers that require it
  ⚠️ Note: This feature is currently unstable and may require frequent re-authorization
- 🛠️ Built-in Local Tools: Pre-configured tools available out of the box - Fetch (web requests), File Manager (read/write files), and Bash (command execution)
- 🤖 MCP Server Installer Agent: Intelligent agent that helps you install and configure MCP servers automatically
- 🔔 Multiple Elicitation Support: Handle multiple MCP elicitation requests simultaneously in the UI
- 📁 @ File Path Search: Extended @ keyword in chat input to search file paths
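The stdio and SSE transports mentioned above are typically wired up through an MCP server configuration file. The snippet below is only an illustrative sketch using the `mcpServers` convention common to MCP hosts; the server names, command, and URL are placeholders, and Dive's exact schema may differ, so consult the MCP Servers Setup guide for the authoritative format:

```json
{
  "mcpServers": {
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    },
    "remote-example": {
      "transport": "sse",
      "url": "http://localhost:8000/sse"
    }
  }
}
```

A stdio server is launched locally as a subprocess, while an SSE server is reached over HTTP at the given URL.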
Recent Updates (2026/02/26) - v0.14.0+ 🎉
- 🛠️ Skills & Slash Commands: Support for skills and additional slash commands
- 🔍 Chat History Search: Search across previous conversations
Platform Availability
| Platform | Electron | Tauri |
|---|---|---|
| Windows | ✅ | ✅ |
| macOS | ✅ | 🔜 |
| Linux | ✅ | ✅ |
Migration Note: Existing local MCP/LLM configurations remain fully supported. OAP integration is additive and does not affect current workflows.
Download and Install ⬇️
Get the latest version of Dive:
Windows users: 🪟
Choose between two architectures:
- Tauri Version (Recommended): Smaller installer (<30MB), modern architecture
- Electron Version: Traditional architecture, fully stable
- Python and Node.js runtimes are downloaded automatically on first launch
macOS users: 🍎
- Electron Version: Download the .dmg version
- You need to install the Python and Node.js environments (providing `uvx` and `npx`) yourself
- Follow the installation prompts to complete setup
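Where the runtimes are not bundled, a quick sanity check like the following can confirm the prerequisites are on your `PATH` (a minimal sketch; note that `uvx` is provided by the `uv` Python tool, while `npx` ships with Node.js):

```shell
# Check for the runtimes the Electron build expects the user to install:
# Python, Node.js (which provides npx), and uvx (from the uv tool).
missing=""
for cmd in python3 node npx uvx; do
  command -v "$cmd" >/dev/null 2>&1 || missing="$missing $cmd"
done
if [ -z "$missing" ]; then
  echo "All prerequisites found"
else
  echo "Missing:$missing"
fi
```

If anything is reported missing, install it before launching Dive.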
Linux users: 🐧
Choose between two architectures:
- Tauri Version: Modern architecture with smaller installer size
- Electron Version (Recommended): Traditional architecture with .AppImage format
- You need to install the Python and Node.js environments (providing `uvx` and `npx`) yourself
- For Ubuntu/Debian users:
  - You may need to add the `--no-sandbox` parameter, or modify system settings to allow sandboxing
  - Run `chmod +x` to make the AppImage executable
- For Arch users:
  - If you are using Arch Linux, you can install Dive using an AUR helper, for example: `paru -S dive-ai`
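For the Electron `.AppImage` on Ubuntu/Debian, the steps above look roughly like this (the filename is illustrative; substitute the file you actually downloaded):

```shell
# Make the downloaded AppImage executable, then launch it.
# The filename below is a placeholder - adjust it to your download.
APPIMAGE="./Dive-linux-x86_64.AppImage"
if [ -f "$APPIMAGE" ]; then
  chmod +x "$APPIMAGE"
  # Ubuntu/Debian may block the Chromium sandbox; --no-sandbox works around it.
  "$APPIMAGE" --no-sandbox
else
  echo "AppImage not found at $APPIMAGE"
fi
```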
MCP Setup Options
For more detailed instructions, please see MCP Servers Setup.
The easiest way to get started is the OAP cloud integration, which gives you instant access to enterprise-grade MCP tools:
- Sign up at OAPHub.ai
- Connect to Dive using one-click deep links or configuration files
- Enjoy managed MCP servers with zero setup - no Python, Docker, or complex dependencies required
Benefits:
- ✅ Zero configuration needed
- ✅ Cross-platform compatibility
- ✅ Enterprise-grade reliability
- ✅ Automatic updates and maintenance
Build 🛠️
See BUILD.md for more details.
Contributing 🤝
We welcome contributions from the community! Here's how you can help:
Development Setup
- Fork the repository
- Clone your fork: `git clone https://github.com/YOUR_USERNAME/Dive.git`
- Install dependencies: `npm install`
- Start development: `npm run dev` (Electron) or `cargo tauri dev` (Tauri)
- Make your changes and test thoroughly
- Submit a pull request
License 📄
Dive is open-source software licensed under the MIT License.
Connect With Us 🌐
- 💬 Join our Discord
- 🐦 Follow us on Twitter/X
- 📣 Join the discussion in our Reddit Thread
- ⭐ Star us on GitHub
- 🐛 Report issues on our Issue Tracker