Knowledge graph-based persistent memory system
A basic implementation of persistent memory using a local knowledge graph. This lets Claude remember information about the user across chats.
Entities are the primary nodes in the knowledge graph. Each entity has a unique name (its identifier), an entity type (for example, "person" or "organization"), and a list of observations.
Example:
{
"name": "John_Smith",
"entityType": "person",
"observations": ["Speaks fluent Spanish"]
}
Relations define directed connections between entities. They are always stored in active voice and describe how entities interact or relate to each other.
Example:
{
"from": "John_Smith",
"to": "Anthropic",
"relationType": "works_at"
}
Observations are discrete pieces of information about an entity. They are stored as strings, attached to a specific entity, can be added or removed independently, and should be atomic (one fact per observation).
Example:
{
"entityName": "John_Smith",
"observations": [
"Speaks fluent Spanish",
"Graduated in 2019",
"Prefers morning meetings"
]
}
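Taken together, entities, relations, and observations form a simple graph. The following is an illustrative TypeScript sketch of the shapes implied by the examples above; the server's actual internal type names may differ.

// Illustrative TypeScript shapes for the graph, inferred from the JSON
// examples above; the server's actual internal type names may differ.
interface Entity {
  name: string;           // unique identifier, e.g. "John_Smith"
  entityType: string;     // classification, e.g. "person"
  observations: string[]; // discrete facts about the entity
}

interface Relation {
  from: string;         // source entity name
  to: string;           // target entity name
  relationType: string; // active-voice relationship, e.g. "works_at"
}

interface KnowledgeGraph {
  entities: Entity[];
  relations: Relation[];
}

These shapes line up one-to-one with the parameters of the tools listed below.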
create_entities
- entities (array of objects)
  - name (string): Entity identifier
  - entityType (string): Type classification
  - observations (string[]): Associated observations

create_relations
- relations (array of objects)
  - from (string): Source entity name
  - to (string): Target entity name
  - relationType (string): Relationship type in active voice

add_observations
- observations (array of objects)
  - entityName (string): Target entity
  - contents (string[]): New observations to add

delete_entities
- entityNames (string[])

delete_observations
- deletions (array of objects)
  - entityName (string): Target entity
  - observations (string[]): Observations to remove

delete_relations
- relations (array of objects)
  - from (string): Source entity name
  - to (string): Target entity name
  - relationType (string): Relationship type

read_graph

search_nodes
- query (string)

open_nodes
- names (string[])
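As a rough illustration of how a client drives these tools, here is a minimal TypeScript sketch that launches the server over stdio and calls create_entities and create_relations with the John_Smith example from above. It assumes the MCP TypeScript SDK's Client, StdioClientTransport, and callTool APIs; treat the exact import paths and signatures as assumptions to verify against the SDK documentation.

// Minimal sketch: launch the memory server over stdio and call two of its
// tools. The SDK import paths and the callTool signature are assumptions
// based on the MCP TypeScript SDK; verify them against the SDK documentation.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["-y", "@modelcontextprotocol/server-memory"],
  });
  const client = new Client({ name: "memory-demo", version: "0.1.0" });
  await client.connect(transport);

  // Create an entity matching the John_Smith example above.
  await client.callTool({
    name: "create_entities",
    arguments: {
      entities: [
        {
          name: "John_Smith",
          entityType: "person",
          observations: ["Speaks fluent Spanish"],
        },
      ],
    },
  });

  // Link it to another entity with an active-voice relation.
  await client.callTool({
    name: "create_relations",
    arguments: {
      relations: [{ from: "John_Smith", to: "Anthropic", relationType: "works_at" }],
    },
  });

  await client.close();
}

main().catch(console.error);

In normal use an MCP host such as Claude Desktop or VS Code issues these calls on the model's behalf; the sketch only spells out the tool contract.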
Add one of the following to your claude_desktop_config.json:

Docker:
{
"mcpServers": {
"memory": {
"command": "docker",
"args": ["run", "-i", "-v", "claude-memory:/app/dist", "--rm", "mcp/memory"]
}
}
}
NPX:
{
"mcpServers": {
"memory": {
"command": "npx",
"args": [
"-y",
"@modelcontextprotocol/server-memory"
]
}
}
}
The server can be configured using the following environment variables:
{
"mcpServers": {
"memory": {
"command": "npx",
"args": [
"-y",
"@modelcontextprotocol/server-memory"
],
"env": {
"MEMORY_FILE_PATH": "/path/to/custom/memory.json"
}
}
}
}
MEMORY_FILE_PATH: Path to the memory storage JSON file (default: memory.json in the server directory)

To use this server with VS Code, the quickest option is one of the one-click installation buttons in the project README.
For manual installation, you can configure the MCP server using one of these methods:
Method 1: User Configuration (Recommended)
Add the configuration to your user-level MCP configuration file. Open the Command Palette (Ctrl + Shift + P) and run MCP: Open User Configuration. This will open your user mcp.json file, where you can add the server configuration.
Method 2: Workspace Configuration
Alternatively, you can add the configuration to a file called .vscode/mcp.json
in your workspace. This will allow you to share the configuration with others.
For more details about MCP configuration in VS Code, see the official VS Code MCP documentation.
NPX:
{
"servers": {
"memory": {
"command": "npx",
"args": [
"-y",
"@modelcontextprotocol/server-memory"
]
}
}
}
Docker:
{
"servers": {
"memory": {
"command": "docker",
"args": [
"run",
"-i",
"-v",
"claude-memory:/app/dist",
"--rm",
"mcp/memory"
]
}
}
}
The prompt for utilizing memory depends on the use case. Changing the prompt helps the model determine the frequency and types of memories it creates.
Here is an example prompt for chat personalization. You could use this prompt in the "Custom Instructions" field of a Claude.ai Project.
Follow these steps for each interaction:
1. User Identification:
- You should assume that you are interacting with default_user
- If you have not identified default_user, proactively try to do so.
2. Memory Retrieval:
- Always begin your chat by saying only "Remembering..." and retrieve all relevant information from your knowledge graph
- Always refer to your knowledge graph as your "memory"
3. Memory
- While conversing with the user, be attentive to any new information that falls into these categories:
a) Basic Identity (age, gender, location, job title, education level, etc.)
b) Behaviors (interests, habits, etc.)
c) Preferences (communication style, preferred language, etc.)
d) Goals (goals, targets, aspirations, etc.)
e) Relationships (personal and professional relationships up to 3 degrees of separation)
4. Memory Update:
- If any new information was gathered during the interaction, update your memory as follows:
a) Create entities for recurring organizations, people, and significant events
b) Connect them to the current entities using relations
c) Store facts about them as observations
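To make the mapping concrete, the payloads below show roughly what the model would send for the retrieval step (2) and the update step (4) above. The entity names default_user and Acme_Corp and the example facts are hypothetical, chosen only to illustrate which tool each step uses.

// Hypothetical tool-call payloads showing how the prompt's steps map onto the
// server's tools. Entity names (default_user, Acme_Corp) and facts are made up.

// Step 2 (Memory Retrieval): look up what is already known about the user.
const retrieval = {
  name: "search_nodes",
  arguments: { query: "default_user" },
};

// Step 4 (Memory Update): create entities, connect them, and store facts.
const update = [
  {
    name: "create_entities",
    arguments: {
      entities: [
        {
          name: "Acme_Corp",
          entityType: "organization",
          observations: ["Mentioned as the user's employer"],
        },
      ],
    },
  },
  {
    name: "create_relations",
    arguments: {
      relations: [{ from: "default_user", to: "Acme_Corp", relationType: "works_at" }],
    },
  },
  {
    name: "add_observations",
    arguments: {
      observations: [{ entityName: "default_user", contents: ["Prefers morning meetings"] }],
    },
  },
];

console.log(retrieval, update);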
Building with Docker:
docker build -t mcp/memory -f src/memory/Dockerfile .
Note: a prior mcp/memory volume may contain an index.js file that could be overwritten by the new container. If you are using a Docker volume for storage, delete the old volume's index.js file before starting the new container.
This MCP server is licensed under the MIT License. This means you are free to use, modify, and distribute the software, subject to the terms and conditions of the MIT License. For more details, please see the LICENSE file in the project repository.