# mem0-mcp-server

MCP server exposing the Mem0 v2 API for AI agents to store, retrieve, update, and delete long-term memories, with semantic search, through the standardized MCP protocol.
## Overview

Mem0-MCP Server is a self-hosted MCP (Model Context Protocol) server that bridges AI agents with persistent memory storage. It enables intelligent context retention across conversations and sessions using Mem0's AsyncMemory API.

**Key Features:**

- **MCP Protocol Integration**: Exposes Mem0 functionality via MCP tools
- **Semantic Memory Search**: Similarity-based memory retrieval with vector search
- **Multi-Tenant Isolation**: User-, agent-, and session-scoped memory isolation
- **Flexible Transport**: stdio for local agents, SSE for remote connections
- **Configuration Management**: Pydantic-based validation with environment variable support
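The multi-tenant isolation feature scopes each memory to a user, agent, and session. As a rough illustration of that scoping idea (the real isolation is handled inside Mem0's AsyncMemory; `ScopedMemoryStore` is a hypothetical name invented here), a minimal sketch:

```python
# Illustrative sketch only: the real isolation lives in Mem0's AsyncMemory.
# This toy store shows the scoping idea behind user/agent/session IDs.
class ScopedMemoryStore:
    def __init__(self):
        self._store = {}

    def add(self, text, user_id, agent_id=None, run_id=None):
        # Memories are bucketed by the full (user, agent, run) scope key.
        self._store.setdefault((user_id, agent_id, run_id), []).append(text)

    def list(self, user_id, agent_id=None, run_id=None):
        # Reads only ever see the requesting scope's own bucket.
        return list(self._store.get((user_id, agent_id, run_id), []))
```

Because the scope tuple is part of the key, one tenant's queries can never return another tenant's memories.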
## Documentation

| Section | Description |
|---|---|
| API Reference | Complete API documentation for all modules and tools |
| Pattern Guides | Design pattern documentation (Singleton, Repository, etc.) |
| Usage Examples | Getting started and advanced usage guides |
| Deployment | Docker Compose configuration and service details |
| Architecture | System architecture and component interactions |
## Quick Start

### Installation

```bash
# Clone and install
git clone https://github.com/your-org/mem0-mcp-server.git
cd mem0-mcp-server
uv sync

# Set environment variables
export OPENAI_API_KEY="your-api-key"
```
### Configuration

Create `~/.config/mem0-mcp-server/settings.json`:

```json
{
  "vector_store": {
    "provider": "redis",
    "config": {
      "redis_url": "redis://localhost:6379"
    }
  },
  "llm": {
    "provider": "openai",
    "config": {
      "model": "gpt-4o"
    }
  },
  "embedder": {
    "provider": "openai",
    "config": {
      "model": "text-embedding-3-small"
    }
  }
}
```
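The server validates this file with Pydantic on startup. As a rough sketch of what that validation does (using only the standard library here; `load_settings` and `REQUIRED_SECTIONS` are hypothetical names, not the project's actual API):

```python
import json
from pathlib import Path

# Hypothetical stdlib stand-in for the project's Pydantic validation:
# parse settings.json and require the three provider sections shown above.
REQUIRED_SECTIONS = ("vector_store", "llm", "embedder")

def load_settings(path):
    data = json.loads(Path(path).read_text())
    missing = [s for s in REQUIRED_SECTIONS if s not in data]
    if missing:
        raise ValueError(f"settings.json missing sections: {missing}")
    return data
```

The real implementation additionally validates the nested `provider`/`config` fields against typed Pydantic models.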
### Running the Server

```bash
# SSE transport (remote connections)
uv run python -m mcp_server.main

# stdio transport (local AI agents)
export MCP_TRANSPORT=stdio
uv run python -m mcp_server.main
```
### OpenCode Configuration

In `~/.config/opencode/opencode.json`, add:

```json
"mcp": {
  "mem0": {
    "type": "remote",
    "enabled": true,
    "url": "http://localhost:8050/sse"
  }
}
```
## MCP Tools

| Tool | Description |
|---|---|
| `add_memory` | Store information in long-term memory with semantic indexing |
| `search_memories` | Search memories using semantic similarity |
| `get_memory` | Retrieve a specific memory by ID |
| `update_memory` | Update existing memory content |
| `delete_memory` | Remove a memory from storage |
| `list_memories` | List memories with filtering and pagination |
## Usage Example

```python
# Add a memory
result = await client.call_tool("add_memory", {
    "messages": [{"role": "user", "content": "I prefer dark mode"}],
    "user_id": "alice"
})

# Search memories
result = await client.call_tool("search_memories", {
    "query": "theme preferences",
    "filters": {"user_id": "alice"},
    "limit": 5
})
```
## Architecture

```
AI Agent → FastMCP Server → MemoryManager → Mem0 AsyncMemory → Redis
               │
               ├── SafeLogger (stdout/stderr separation)
               ├── Transport (stdio/SSE)
               └── Config (Pydantic validation)
```
**Components:**

- COMP-1: `ConfigLoader` - Configuration loading and validation
- COMP-2: FastMCP Server - MCP protocol server
- COMP-3: `MemoryManager` - Memory operations with multi-tenant isolation
- COMP-4: MCP Tools - Tool definitions
- COMP-5: `SafeLogger` - Output stream separation
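The output-stream separation in `SafeLogger` matters because, with the stdio transport, stdout carries the MCP JSON-RPC frames, so any stray print or log line on stdout corrupts the protocol stream. A minimal sketch of the idea using the standard `logging` module (`make_safe_logger` is a hypothetical name, not the project's actual API):

```python
import logging
import sys

def make_safe_logger(name="mem0-mcp"):
    # With the stdio transport, stdout is reserved for MCP JSON-RPC frames,
    # so every log record must be routed to stderr instead.
    logger = logging.getLogger(name)
    logger.handlers.clear()
    handler = logging.StreamHandler(sys.stderr)
    handler.setFormatter(
        logging.Formatter("%(asctime)s %(levelname)s %(name)s: %(message)s")
    )
    logger.addHandler(handler)
    logger.setLevel(logging.INFO)
    logger.propagate = False  # keep records away from root handlers on stdout
    return logger
```

Any third-party library that logs to stdout would need the same redirection when running under stdio.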
## Configuration

### Parameter Precedence

Configuration values are resolved in order (highest priority first):

1. Tool parameters (passed directly)
2. Environment variables (with `MCP_` prefix)
3. Config file values
4. Hardcoded defaults
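The precedence chain above can be sketched as a small resolver (a hypothetical helper written for illustration, not the project's actual API):

```python
import os

# Hardcoded fallbacks, mirroring the defaults documented below.
DEFAULTS = {"host": "0.0.0.0", "port": "8080", "transport": "sse"}

def resolve(key, tool_params=None, file_config=None):
    # Precedence: tool parameter > MCP_-prefixed env var > config file > default.
    if tool_params and key in tool_params:
        return tool_params[key]
    env_value = os.environ.get(f"MCP_{key.upper()}")
    if env_value is not None:
        return env_value
    if file_config and key in file_config:
        return file_config[key]
    return DEFAULTS.get(key)
```

For example, with `MCP_PORT=8050` exported, `resolve("port")` returns `"8050"` even if the config file sets a different port, but an explicit tool parameter still wins.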
### Environment Variables

| Variable | Default | Description |
|---|---|---|
| `OPENAI_API_KEY` | (required) | OpenAI API key for the LLM |
| `MCP_TRANSPORT` | `sse` | Transport type (`stdio`, `sse`) |
| `MCP_HOST` | `0.0.0.0` | Server bind address |
| `MCP_PORT` | `8080` | Server bind port |
## Deployment

### Docker

```bash
# Using docker-compose
docker-compose up -d

# Using the Makefile
make docker-up    # Start services with docker compose
make docker-down  # Stop services
make docker-logs  # Show logs
```
**Services:**

| Service | Description |
|---|---|
| `mem0-mcp` | MCP server exposing the Mem0 API on port 8050 |
| `ollama-qwen3-embedding` | Ollama with `qwen3-embedding:8b` for vector embeddings (port 11434) |
| `ollama-qwen` | Ollama with `qwen2.5:7b` for chat completions (port 11435) |
See Deployment → Docker for detailed configuration.
### Kubernetes

```bash
# Using the Helm chart
helm install mem0-mcp ./charts/mem0-mcp-server
```
## Development

| Command | Description |
|---|---|
| `make install` | Install dependencies with uv |
| `make lint` | Lint code with ruff |
| `make lint-fix` | Auto-fix linting issues |
| `make typecheck` | Type check with pyright |
| `make test` | Run all tests |
| `make test-unit` | Run unit tests only |
| `make test-coverage` | Run tests with a coverage report |
| `make build` | Build the Docker image |
| `make run` | Run the development server |

Run multiple commands: `make install && make lint && make typecheck && make test`

See the Makefile for all available commands, including Docker management (`docker-up`, `docker-down`, `docker-logs`, etc.).
## Project Structure

```
mem0-mcp/
├── src/mcp_server/
│   ├── __init__.py          # FastMCP singleton
│   ├── lifespan.py          # Resource lifecycle
│   ├── transport.py         # Transport selection
│   ├── memory/
│   │   ├── manager.py       # MemoryManager
│   │   └── lifespan.py      # AsyncMemory lifecycle
│   ├── config/
│   │   ├── settings.py      # Pydantic models
│   │   └── loader.py        # Config file loading
│   ├── tools/
│   │   ├── add_memory.py
│   │   ├── search_memories.py
│   │   └── ...
│   └── utils/
│       └── safe_logger.py   # Output separation
├── doc/
│   ├── api/                 # API reference
│   ├── patterns/            # Pattern guides
│   ├── examples/            # Usage examples
│   └── architecture/        # Architecture docs
├── tests/
├── Makefile
├── Dockerfile
└── docker-compose.yml
```
## License

MIT License