MemGPT MCP Server

A TypeScript-based MCP server that implements a memory system for LLMs, enabling persistent conversations with providers such as OpenAI, Anthropic, OpenRouter, and Ollama. It provides tools for chatting with different LLM providers while maintaining conversation history.
Features

Tools

- `chat` - Send a message to the current LLM provider
  - Takes a message parameter
  - Supports multiple providers (OpenAI, Anthropic, OpenRouter, Ollama)
- `get_memory` - Retrieve conversation history
  - Optional `limit` parameter to specify the number of memories to retrieve
  - Pass `limit: null` for unlimited memory retrieval
  - Returns memories in chronological order with timestamps
- `clear_memory` - Clear conversation history
  - Removes all stored memories
- `use_provider` - Switch between different LLM providers
  - Supports OpenAI, Anthropic, OpenRouter, and Ollama
  - Persists provider selection
- `use_model` - Switch to a different model for the current provider
  - Persists model selection
  - Supports provider-specific models:
    - Anthropic Claude models:
      - Claude 3 Series:
        - `claude-3-haiku`: Fastest response times, ideal for tasks like customer support and content moderation
        - `claude-3-sonnet`: Balanced performance for general-purpose use
        - `claude-3-opus`: Advanced model for complex reasoning and high-performance tasks
      - Claude 3.5 Series:
        - `claude-3.5-haiku`: Enhanced speed and cost-effectiveness
        - `claude-3.5-sonnet`: Superior performance with computer interaction capabilities
    - OpenAI: `gpt-4o`, `gpt-4o-mini`, `gpt-4-turbo`
    - OpenRouter: Any model in `provider/model` format (e.g., `openai/gpt-4`, `anthropic/claude-2`)
    - Ollama: Any locally available model (e.g., `llama2`, `codellama`)
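These are standard MCP tools, so they can be invoked from any MCP client. The following is a minimal sketch using the official `@modelcontextprotocol/sdk` TypeScript client; the `message` parameter comes from the list above, while the `provider` and `model` argument names and the response shape are assumptions that may differ from the actual server implementation.

```typescript
// Minimal sketch: connect to the built server over stdio and call its tools.
// Assumes @modelcontextprotocol/sdk is installed and the server was built to
// build/index.js. The `provider` and `model` argument names are assumptions.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  const transport = new StdioClientTransport({
    command: "node",
    args: ["/path/to/memgpt-server/build/index.js"],
    // Forward the API keys the server expects (see Environment Variables below).
    env: {
      OPENAI_API_KEY: process.env.OPENAI_API_KEY ?? "",
      ANTHROPIC_API_KEY: process.env.ANTHROPIC_API_KEY ?? "",
      OPENROUTER_API_KEY: process.env.OPENROUTER_API_KEY ?? "",
    },
  });

  const client = new Client({ name: "memgpt-example", version: "0.1.0" }, { capabilities: {} });
  await client.connect(transport);

  // Switch provider and model; both selections are persisted by the server.
  await client.callTool({ name: "use_provider", arguments: { provider: "anthropic" } });
  await client.callTool({ name: "use_model", arguments: { model: "claude-3.5-sonnet" } });

  // Send a message to the current provider; the exchange is stored in memory.
  const reply = await client.callTool({ name: "chat", arguments: { message: "Hello!" } });
  console.log(JSON.stringify(reply, null, 2));

  await client.close();
}

main().catch(console.error);
```

No client code is needed when running the server from Claude Desktop (see Installation below); a sketch like this is only useful for scripting or testing the server outside of Claude.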
Development
Install dependencies:
npm install
Build the server:
npm run build
For development with auto-rebuild:
npm run watch
Installation
To use with Claude Desktop, add the server config:
On macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json
```json
{
  "mcpServers": {
    "letta-memgpt": {
      "command": "/path/to/memgpt-server/build/index.js",
      "env": {
        "OPENAI_API_KEY": "your-openai-key",
        "ANTHROPIC_API_KEY": "your-anthropic-key",
        "OPENROUTER_API_KEY": "your-openrouter-key"
      }
    }
  }
}
```
Environment Variables
- `OPENAI_API_KEY` - Your OpenAI API key
- `ANTHROPIC_API_KEY` - Your Anthropic API key
- `OPENROUTER_API_KEY` - Your OpenRouter API key
Debugging
Since MCP servers communicate over stdio, debugging can be challenging. We recommend using the MCP Inspector:
npm run inspector
The Inspector will provide a URL to access debugging tools in your browser.
Recent Updates
Claude 3 and 3.5 Series Support (March 2024)
- Added support for the latest Claude models:
- Claude 3 Series (Haiku, Sonnet, Opus)
- Claude 3.5 Series (Haiku, Sonnet)
Unlimited Memory Retrieval
- Added support for retrieving unlimited conversation history
- Use `{ "limit": null }` with the `get_memory` tool to retrieve all stored memories
- Use `{ "limit": n }` to retrieve the n most recent memories
- Default limit is 10 if not specified
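As a hedged illustration, reusing the connected `client` from the earlier sketch, the two retrieval modes look like this:

```typescript
// Assumes an already-connected `client` as in the earlier example.
// When `limit` is omitted, the server defaults to the 10 most recent memories.
const recent = await client.callTool({
  name: "get_memory",
  arguments: { limit: 10 }, // the 10 most recent memories
});

const everything = await client.callTool({
  name: "get_memory",
  arguments: { limit: null }, // unlimited retrieval: all stored memories
});
```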