# MemGPT MCP Server

A TypeScript-based MCP server that implements a memory system for LLMs, enabling persistent conversations across providers such as OpenAI, Anthropic, OpenRouter, and Ollama. It provides tools for chatting with different LLM providers while maintaining conversation history.
## Features

### Tools
- `chat` - Send a message to the current LLM provider
  - Takes a message parameter
  - Supports multiple providers (OpenAI, Anthropic, OpenRouter, Ollama)
- `get_memory` - Retrieve conversation history
  - Optional `limit` parameter to specify the number of memories to retrieve
  - Pass `limit: null` for unlimited memory retrieval
  - Returns memories in chronological order with timestamps
- `clear_memory` - Clear conversation history
  - Removes all stored memories
- `use_provider` - Switch between different LLM providers
  - Supports OpenAI, Anthropic, OpenRouter, and Ollama
  - Persists provider selection
- `use_model` - Switch to a different model for the current provider
  - Supports provider-specific models:
    - Anthropic Claude models:
      - Claude 3 Series:
        - `claude-3-haiku`: Fastest response times, ideal for tasks like customer support and content moderation
        - `claude-3-sonnet`: Balanced performance for general-purpose use
        - `claude-3-opus`: Advanced model for complex reasoning and high-performance tasks
      - Claude 3.5 Series:
        - `claude-3.5-haiku`: Enhanced speed and cost-effectiveness
        - `claude-3.5-sonnet`: Superior performance with computer interaction capabilities
    - OpenAI: `gpt-4o`, `gpt-4o-mini`, `gpt-4-turbo`
    - OpenRouter: Any model in `provider/model` format (e.g., `openai/gpt-4`, `anthropic/claude-2`)
    - Ollama: Any locally available model (e.g., `llama2`, `codellama`)
  - Persists model selection
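As an illustration, a client might send tool-call arguments shaped like the following (the payload shapes are illustrative; only the parameter names listed above are taken from this README):

```json
[
  { "name": "use_provider", "arguments": { "provider": "anthropic" } },
  { "name": "use_model", "arguments": { "model": "claude-3.5-sonnet" } },
  { "name": "chat", "arguments": { "message": "Summarize our last discussion." } },
  { "name": "get_memory", "arguments": { "limit": null } }
]
```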
## Development

Install dependencies:

```bash
npm install
```

Build the server:

```bash
npm run build
```

For development with auto-rebuild:

```bash
npm run watch
```
## Installation

To use with Claude Desktop, add the server config:

- On MacOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- On Windows: `%APPDATA%/Claude/claude_desktop_config.json`
```json
{
  "mcpServers": {
    "letta-memgpt": {
      "command": "/path/to/memgpt-server/build/index.js",
      "env": {
        "OPENAI_API_KEY": "your-openai-key",
        "ANTHROPIC_API_KEY": "your-anthropic-key",
        "OPENROUTER_API_KEY": "your-openrouter-key"
      }
    }
  }
}
```
## Environment Variables

- `OPENAI_API_KEY` - Your OpenAI API key
- `ANTHROPIC_API_KEY` - Your Anthropic API key
- `OPENROUTER_API_KEY` - Your OpenRouter API key
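A minimal sketch of how a server like this might resolve these keys at startup (the function and type names here are hypothetical, not this server's actual code):

```typescript
// Hypothetical sketch: map each supported provider to the environment
// variable that holds its API key. Ollama runs locally and needs none.
type Provider = "openai" | "anthropic" | "openrouter" | "ollama";

const KEY_VARS: Record<Provider, string | null> = {
  openai: "OPENAI_API_KEY",
  anthropic: "ANTHROPIC_API_KEY",
  openrouter: "OPENROUTER_API_KEY",
  ollama: null, // local provider, no API key required
};

// Returns the key for a provider, null for keyless providers,
// and throws if the required variable is unset.
function resolveApiKey(provider: Provider): string | null {
  const varName = KEY_VARS[provider];
  if (varName === null) return null;
  const key = process.env[varName];
  if (!key) {
    throw new Error(`Missing ${varName} for provider "${provider}"`);
  }
  return key;
}
```

Failing fast on a missing key at startup gives a clearer error than a mid-conversation provider failure.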
## Debugging

Since MCP servers communicate over stdio, debugging can be challenging. We recommend using the MCP Inspector:

```bash
npm run inspector
```

The Inspector will provide a URL to access debugging tools in your browser.
## Recent Updates

### Claude 3 and 3.5 Series Support (March 2024)

- Added support for the latest Claude models:
  - Claude 3 Series (Haiku, Sonnet, Opus)
  - Claude 3.5 Series (Haiku, Sonnet)

### Unlimited Memory Retrieval

- Added support for retrieving unlimited conversation history
- Use `{ "limit": null }` with the `get_memory` tool to retrieve all stored memories
- Use `{ "limit": n }` to retrieve the `n` most recent memories
- Default limit is 10 if not specified
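The retrieval semantics above can be sketched as a small helper (the `Memory` shape and function name are assumptions for illustration, not the server's actual implementation):

```typescript
interface Memory {
  timestamp: string; // ISO-8601 time the message was stored
  role: "user" | "assistant";
  content: string;
}

// Hypothetical helper mirroring the documented get_memory semantics:
// limit: null -> all memories; limit omitted -> last 10; limit: n -> last n.
// Results stay in chronological order because slicing preserves order.
function getMemories(store: Memory[], limit: number | null = 10): Memory[] {
  if (limit === null) return [...store];
  return store.slice(-limit);
}
```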