CogmemAi — Cognitive Memory for Ai Coding Assistants
Your Ai coding assistant forgets everything between sessions. CogmemAi fixes that.
One command. Your assistant remembers your architecture, patterns, decisions, bugs, and preferences — permanently. Works with Claude Code, Cursor, Windsurf, Cline, Continue, and any MCP-compatible tool. Switch editors, switch models, switch machines — your knowledge stays.
What's New in v3
Quantum-Safe Encryption (v3.7)
CogmemAi is the first quantum-safe Ai memory system. All memories are encrypted at rest with quantum-resistant encryption — both in cloud mode and local mode. Your data is protected against today's threats and tomorrow's quantum computers. Encryption is automatic, zero-config, and enabled by default. No setup required.
Choose Your Storage Mode (v3.6)
CogmemAi now runs three ways — pick the one that fits your workflow:
| | Cloud (default) | Local | Hybrid |
|---|---|---|---|
| Best for | Full intelligence, team collaboration, cross-device portability | Zero-config start, offline-only environments | Local speed + cloud brains, travel/unreliable networks |
| Setup | npx cogmemai-mcp setup (choose Cloud) | npx cogmemai-mcp setup (choose Local) | npx cogmemai-mcp setup (choose Hybrid) |
| API key needed | Yes (free) | No | Yes (free) |
| Search | Semantic (by meaning) | Keyword (by text match) | Semantic with local fallback |
| Intelligence Engine | Full — auto-linking, contradiction detection, memory decay, auto-skills, query synthesis | Basic CRUD + keyword search | Full — with offline resilience |
| Team collaboration | Yes | No | Yes |
| Cross-device sync | Automatic | No — data stays on your machine | Automatic with local cache |
| Offline support | Requires internet | Full offline | Falls back to local when offline |
| Encryption | Quantum-safe (server) | Quantum-safe (local) | Quantum-safe (both) |
Cloud mode is the recommended experience. It gives you the full Intelligence Engine — semantic search that finds memories by meaning, auto-linking knowledge graph, contradiction detection, self-improving recall, auto-skills, query synthesis, and team collaboration. Everything that makes CogmemAi more than just a database.
Local mode is the zero-friction entry point. No API key, no account, no internet required. Great for trying CogmemAi in 10 seconds or working in air-gapped environments. You get basic memory storage with keyword search — and when you're ready for more, upgrading to cloud takes one command.
Hybrid mode is for developers who travel or work on unreliable networks. Saves to both local and cloud simultaneously. Reads from cloud when available, falls back to local when offline. Unsynced memories automatically push to cloud when connectivity returns.
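The hybrid read/write flow described above can be sketched roughly like this. This is illustrative pseudologic, not CogmemAi's actual implementation; `HybridStore`, `cloudStore`, and `localStore` are hypothetical names:

```javascript
// Illustrative sketch of hybrid-mode behavior (not the actual implementation).
// cloudStore and localStore are hypothetical stores with the same interface.
class HybridStore {
  constructor(cloudStore, localStore) {
    this.cloud = cloudStore;
    this.local = localStore;
    this.pending = []; // memories saved while offline, awaiting cloud sync
  }

  // Writes go to both stores; a failed cloud write is queued for later.
  async save(memory) {
    await this.local.save(memory);
    try {
      await this.cloud.save(memory);
    } catch {
      this.pending.push(memory);
    }
  }

  // Reads prefer cloud (semantic search); fall back to local keyword search.
  async recall(query) {
    try {
      return await this.cloud.recall(query);
    } catch {
      return this.local.recall(query);
    }
  }

  // When connectivity returns, push queued memories to the cloud.
  async flush() {
    while (this.pending.length) {
      await this.cloud.save(this.pending[0]);
      this.pending.shift();
    }
  }
}
```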
Intelligence Engine + Auto-Skills (v3.5)
CogmemAi now gets smarter every time you use it. The Intelligence Engine is a self-improving memory system that learns what matters, connects related knowledge automatically, and synthesizes answers from your entire memory. Auto-Skills takes it further — CogmemAi doesn't just remember, it learns how to behave.
Auto-Skills (Closed-Loop Learning)
- Behavioral skills — CogmemAi automatically synthesizes your corrections, preferences, and patterns into behavioral directives that tell your Ai assistant HOW to work, not just what to know
- Closed learning loop — correct your assistant once, and CogmemAi detects the pattern. After enough evidence accumulates, it generates a skill that prevents the mistake from ever happening again
- Confidence tracking — each skill has a confidence score that rises when it works and drops when it doesn't. Low-confidence skills are automatically retired
- Self-evaluation — skills periodically review themselves against new evidence and adapt, strengthen, or retire as your practices evolve
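As a rough illustration of the confidence mechanic, a skill's score could be updated with a moving average and retired below a threshold. The update rule, constants, and skill shape below are assumptions for illustration, not CogmemAi's actual internals:

```javascript
// Illustrative confidence tracker for an auto-generated skill.
// ALPHA and RETIRE_BELOW are assumed values, not CogmemAi's real constants.
const ALPHA = 0.3;        // weight given to the newest outcome
const RETIRE_BELOW = 0.2; // skills under this confidence are retired

function updateConfidence(skill, worked) {
  const outcome = worked ? 1 : 0;
  // Exponential moving average: recent outcomes matter most.
  const confidence = ALPHA * outcome + (1 - ALPHA) * skill.confidence;
  return { ...skill, confidence, retired: confidence < RETIRE_BELOW };
}

let skill = { name: 'prefer-named-exports', confidence: 0.5, retired: false };
skill = updateConfidence(skill, true);  // confidence rises toward 1
skill = updateConfidence(skill, false); // confidence drops toward 0
```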
Intelligence Engine
- Self-improving recall — memories that consistently help you rank higher over time; memories you never use fade naturally. Your recall quality improves automatically with every session
- Auto-linking knowledge graph — related memories are automatically connected when you save them. Your knowledge builds into a web of relationships, not a flat list
- Contradiction detection — when recalled memories conflict with each other, CogmemAi flags the contradiction so you catch stale or outdated information before it causes problems
- Context-aware ranking — tell CogmemAi what you're doing (debugging, planning, reviewing) and it boosts the right types of memories. Debugging? Bug reports and patterns surface first. Planning? Architecture decisions lead
- Query synthesis — ask a question and get one coherent answer synthesized from all your relevant memories, not just a list of matches. Like asking a teammate who's read everything
- Cross-project intelligence — patterns that appear across 3+ projects are automatically promoted to global scope. Your best practices follow you everywhere without manual effort
- Proactive insights — at session start, CogmemAi tells you what you should know before you ask. Stale critical memories, duplicate subjects that need merging, patterns ready for promotion
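Self-improving recall can be pictured as usage-weighted ranking with decay: memories that help get boosted, unused ones fade. The mechanics below (constants, per-session update) are assumed for illustration and are not CogmemAi's actual scoring:

```javascript
// Illustrative usage-weighted ranking with decay (assumed mechanics).
const DECAY = 0.95; // applied to every memory's weight each session
const BOOST = 0.5;  // added when a recalled memory proved useful

function endSession(memories, usefulIds) {
  return memories.map((m) => {
    // Useful memories get boosted before the uniform decay is applied,
    // so helpful knowledge rises while untouched memories fade.
    const boosted = usefulIds.has(m.id) ? m.weight + BOOST : m.weight;
    return { ...m, weight: boosted * DECAY };
  });
}
```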
Also in v3
- Memory health score — 0-100 score with actionable factors
- Session replay — pick up exactly where you left off with automatic session summaries
- Self-tuning memory — importance adjusts based on real usage; stale memories auto-archive
- Auto-ingest README — learn from your README on new projects instantly
- Smart recall — relevant memories surface automatically as you switch topics
- Auto-learning — CogmemAi learns from your sessions automatically
- Task tracking — persistent tasks with status and priority
- Correction learning — teach your assistant to avoid repeated mistakes
- Session reminders — nudges that surface at the start of your next session
- 30 tools — the most complete memory toolkit for Ai coding assistants
Quick Start
```bash
npx cogmemai-mcp setup
```
The setup wizard walks you through three choices: Cloud (recommended — full Ai intelligence), Local (no account needed), or Hybrid (both). Pick your mode, enter your API key if needed, and you're ready in under 60 seconds.
Don't have an API key yet? Get one free at hifriendbot.com/developer. Or choose Local mode to start immediately with no account.
The Problem
Every time you start a new session, you lose context. You re-explain your tech stack, your architecture decisions, your coding preferences. Built-in memory in tools like Claude Code is a flat file with no search, no structure, and no intelligence.
CogmemAi gives your Ai assistant a real memory system:
- Semantic search — finds relevant memories by meaning, not keywords
- Ai-powered extraction — automatically identifies facts worth remembering from your conversations
- Smart deduplication — detects duplicate and conflicting memories automatically
- Privacy controls — auto-detects API keys, tokens, and secrets before storing
- Document ingestion — feed in READMEs and docs to instantly build project context
- Project scoping — memories tied to specific repos, plus global preferences that follow you everywhere
- Smart context — intelligently ranked for maximum relevance to your current work
- Compaction recovery — survives Claude Code context compaction automatically
- Token-efficient — compact context loading that won't bloat your conversation
- Zero setup — no databases, no Docker, no Python, no vector stores
Why Cloud Is the Recommended Mode
CogmemAi offers three storage modes, but cloud is where the magic happens. The Intelligence Engine — semantic search, auto-linking knowledge graph, contradiction detection, self-improving recall, auto-skills, and query synthesis — runs server-side. In cloud mode, your MCP server is a thin HTTP client with zero local databases, zero RAM issues, zero maintenance. All memories are encrypted at rest, so your data is just as secure as local storage — with cross-device portability and team features on top.
Your memory follows you everywhere. Memories created in Claude Code are instantly available in Cursor, Windsurf, Cline, and any MCP-compatible tool. Switch between Opus, Sonnet, Haiku, or any model your editor supports — your memories persist regardless. New laptop? New OS? Log in and your full project knowledge is waiting. A local SQLite file dies with your machine. Cloud memory is permanent.
The privacy argument is a myth. Some memory tools market "local-first" as a privacy advantage. But think about what happens next: every memory your Ai reads gets sent to the model provider (Anthropic, OpenAI, Google) as part of the prompt. Your data leaves your machine at inference time no matter where it's stored. A local SQLite file doesn't protect your memories — it just makes them harder to search, slower to access, and impossible to share. CogmemAi encrypts at rest, transmits over HTTPS, and adds intelligence that local storage simply can't match.
Teams and collaboration. Cloud memory is the only way to share project knowledge across teammates. When one developer saves an architecture decision or documents a bug fix, every team member's Ai assistant knows about it instantly. No syncing, no merge conflicts, no stale local databases. Whether it's two developers or twenty, everyone's assistant has the same up-to-date context. This is impossible with local-only memory solutions.
Compaction Recovery
When your Ai assistant compacts the conversation to stay within its context window, history is compressed and details are lost. CogmemAi handles this automatically — your context is preserved before compaction and seamlessly restored afterward. No re-explaining, no manual prompting.
The npx cogmemai-mcp setup command configures everything automatically.
Skill
CogmemAi includes a Claude Skill that teaches Claude best practices for memory management — when to save, importance scoring, memory types, and session workflows.
Claude Code:
```
/skill install https://github.com/hifriendbot/cogmemai-mcp/tree/main/skill/cogmemai-memory
```
Claude.ai: Upload the skill/cogmemai-memory folder in Settings > Skills.
CLI Commands
```bash
npx cogmemai-mcp setup        # Interactive setup wizard
npx cogmemai-mcp setup <key>  # Setup with API key
npx cogmemai-mcp verify       # Test connection and show usage
npx cogmemai-mcp --version    # Show installed version
npx cogmemai-mcp help         # Show all commands
```
Manual Setup
If you prefer to configure manually instead of using npx cogmemai-mcp setup:
Option A — Per project (add .mcp.json to your project root):
```json
{
  "mcpServers": {
    "cogmemai": {
      "command": "cogmemai-mcp",
      "env": {
        "COGMEMAI_API_KEY": "cm_your_api_key_here"
      }
    }
  }
}
```
For local mode (no API key needed):
```json
{
  "mcpServers": {
    "cogmemai": {
      "command": "cogmemai-mcp",
      "env": {
        "COGMEMAI_MODE": "local"
      }
    }
  }
}
```
Option B — Global (available in every project):
```bash
# Cloud (default)
claude mcp add cogmemai cogmemai-mcp -e COGMEMAI_API_KEY=cm_your_api_key_here --scope user

# Local (no API key needed)
claude mcp add cogmemai cogmemai-mcp -e COGMEMAI_MODE=local --scope user

# Hybrid (both)
claude mcp add cogmemai cogmemai-mcp -e COGMEMAI_API_KEY=cm_your_api_key_here -e COGMEMAI_MODE=hybrid --scope user
```
Works With
Claude Code (Recommended)
Automatic setup:
```bash
npx cogmemai-mcp setup
```
Cursor
Add to ~/.cursor/mcp.json:
```json
{
  "mcpServers": {
    "cogmemai": {
      "command": "npx",
      "args": ["-y", "cogmemai-mcp"],
      "env": { "COGMEMAI_API_KEY": "cm_your_api_key_here" }
    }
  }
}
```
Windsurf
Add to ~/.codeium/windsurf/mcp_config.json:
```json
{
  "mcpServers": {
    "cogmemai": {
      "command": "npx",
      "args": ["-y", "cogmemai-mcp"],
      "env": { "COGMEMAI_API_KEY": "cm_your_api_key_here" }
    }
  }
}
```
Cline (VS Code)
Open VS Code Settings > Cline > MCP Servers, add:
```json
{
  "cogmemai": {
    "command": "npx",
    "args": ["-y", "cogmemai-mcp"],
    "env": { "COGMEMAI_API_KEY": "cm_your_api_key_here" }
  }
}
```
Continue
Add to ~/.continue/config.yaml:
```yaml
mcpServers:
  - name: cogmemai
    command: npx
    args: ["-y", "cogmemai-mcp"]
    env:
      COGMEMAI_API_KEY: cm_your_api_key_here
```
CogmemUI
CogmemUI is a free multi-model Ai workspace with built-in CogmemAi memory. Add your CogmemAi API key in Settings > API Keys and your memory is instantly available. CogmemUI also supports connecting any MCP-compatible tool server via Settings > MCP Servers — add endpoints, auto-discover tools, and use them in chat.
Get your free API key at hifriendbot.com/developer.
Tools
CogmemAi provides 30 tools that your Ai assistant uses automatically:
| Tool | Description |
|---|---|
save_memory | Store a fact explicitly (architecture decision, preference, etc.) |
recall_memories | Search memories using natural language (semantic search) |
extract_memories | Ai extracts facts from a conversation exchange automatically |
get_project_context | Load top memories at session start (with smart ranking, health score, and session replay) |
list_memories | Browse memories with filters (paginated, with untyped filter) |
update_memory | Update content, importance, scope, type, category, subject, and tags |
delete_memory | Permanently delete a memory |
bulk_delete | Delete up to 100 memories at once |
bulk_update | Update up to 50 memories at once (content, type, category, tags, etc.) |
get_usage | Check your usage stats and tier info |
export_memories | Export all memories as JSON for backup or transfer |
import_memories | Bulk import memories from a JSON array |
ingest_document | Feed in a document (README, API docs) to auto-extract memories |
save_session_summary | Save a summary of what was accomplished in this session |
list_tags | View all tags in use across your memories |
link_memories | Connect related memories with named relationships |
get_memory_links | Explore the knowledge graph around a memory |
get_memory_versions | View edit history of a memory |
get_analytics | Memory health dashboard with self-tuning insights (filterable by project) |
promote_memory | Promote a project memory to global scope |
consolidate_memories | Merge related memories into comprehensive summaries using Ai |
save_task | Create a persistent task with status and priority tracking |
get_tasks | Retrieve tasks for the current project — pick up where you left off |
update_task | Change task status, priority, or description as you work |
save_correction | Store a "wrong approach → right approach" pattern to avoid repeated mistakes |
set_reminder | Set a reminder that surfaces at the start of your next session |
get_stale_memories | Find memories that may be outdated for review or cleanup |
get_file_changes | See what files changed since your last session |
feedback_memory | Signal whether a recalled memory was useful or irrelevant to improve future recall |
generate_skills | Trigger skill generation from your corrections and preferences — or preview candidates with dry run |
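Under the hood, these tools are invoked over the MCP protocol's `tools/call` method. A `save_memory` call might look roughly like the JSON-RPC request below; the argument names (`content`, `type`, `importance`, `scope`) are inferred from the tool descriptions above and may not match the actual schema:

```javascript
// Hypothetical shape of an MCP tools/call request for save_memory.
// Argument names are inferred from the tool table, not confirmed.
const request = {
  jsonrpc: '2.0',
  id: 1,
  method: 'tools/call',
  params: {
    name: 'save_memory',
    arguments: {
      content: 'We chose PostgreSQL over MySQL for JSONB support',
      type: 'decision',
      importance: 8,    // assumed 1-10 scale
      scope: 'project',
    },
  },
};
console.log(JSON.stringify(request, null, 2));
```

In practice you never write this by hand — your Ai assistant issues these calls automatically as you work.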
SDKs
Build your own integrations with the CogmemAi API:
- JavaScript/TypeScript: npm install cogmemai-sdk — npm · GitHub
- Python: pip install cogmemai — PyPI · GitHub
Memory Types
Memories are categorized for better organization and retrieval:
- identity — Who you are, your role, team
- preference — Coding style, tool choices, conventions
- architecture — System design, tech stack, file structure
- decision — Why you chose X over Y
- bug — Known issues, fixes, workarounds
- dependency — Version constraints, package notes
- pattern — Reusable patterns, conventions
- context — General project context
- task — Persistent tasks with status and priority tracking
- correction — Wrong approach → right approach patterns
- reminder — Next-session nudges that auto-expire
Scoping
- Project memories — Architecture, decisions, bugs specific to one repo. Auto-detected from your repository.
- Global memories — Your coding preferences, identity, tool choices. Available in every project.
Pricing
| | Free | Pro | Team | Enterprise |
|---|---|---|---|---|
| Price | $0 | $14.99/mo | $39.99/mo | $99.99/mo |
| Memories | 500 | 2,000 | 10,000 | 50,000 |
| Extractions/mo | 500 | 2,000 | 5,000 | 20,000 |
| Projects | 5 | 20 | 50 | 200 |
Start free. Upgrade when you need more. Or pay per operation with USDC on-chain — no credit card required.
Privacy & Security
- 🛡️ Quantum-safe encryption at rest. All memories are encrypted with quantum-resistant cryptography — in cloud mode and local mode. Protected against both current threats and future quantum computers.
- No source code leaves your machine. We store extracted facts (short sentences), never raw code.
- API keys cryptographically hashed (irreversible) server-side.
- All traffic over HTTPS.
- No model training on your data. Ever.
- Delete everything instantly via dashboard or MCP tool.
- No cross-user data sharing.
Read our full privacy policy.
Environment Variables
| Variable | Required | Description |
|---|---|---|
COGMEMAI_API_KEY | Cloud/Hybrid | Your API key (starts with cm_). Not needed for local mode. |
COGMEMAI_MODE | No | Storage mode: cloud (default with key), local (default without key), or hybrid |
COGMEMAI_LOCAL_DB | No | Path to local database (default: ~/.cogmemai/local.db). Used in local and hybrid modes. |
COGMEMAI_API_URL | No | Custom API URL (default: hifriendbot.com) |
COGMEMAI_ENCRYPTION_KEY | No | Custom encryption passphrase for local mode. If not set, a key is auto-generated. |
COGMEMAI_LOCAL_ENCRYPTION | No | Set to off to disable local encryption (not recommended). |
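The mode-selection defaults in the table can be summarized as: an explicit COGMEMAI_MODE always wins; otherwise the presence of an API key selects cloud, and its absence selects local. A minimal sketch of that resolution logic (illustrative only, not the actual source):

```javascript
// Sketch of storage-mode resolution per the documented defaults:
// explicit COGMEMAI_MODE wins; otherwise an API key implies cloud.
function resolveMode(env) {
  if (env.COGMEMAI_MODE) return env.COGMEMAI_MODE;
  return env.COGMEMAI_API_KEY ? 'cloud' : 'local';
}
```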
Support
- Issues: GitHub Issues
- Docs: hifriendbot.com/developer
License
MIT — see LICENSE
Built by HiFriendbot — Better Friends, Better Memories, Better Ai. 🛡️ Quantum Safe.