# obsidian-brain

Standalone Obsidian MCP server with semantic search, knowledge graph analytics (PageRank, Louvain, shortest path), and vault editing — no plugin, no REST API, works when Obsidian is closed.
A standalone Node MCP server that gives Claude (and any other MCP client) semantic search + knowledge graph + vault editing over an Obsidian vault. Runs as one local stdio process — no plugin, no HTTP bridge, no API key, nothing hosted. Your vault content never leaves your machine.
📖 Full docs → sweir1.github.io/obsidian-brain · Companion plugin → sweir1/obsidian-brain-plugin (optional — unlocks `active_note`, `dataview_query`, `base_query`)
Contents — Why · Quick start · What you get · How it works · Companion plugin · Troubleshooting · Recent releases
## Why obsidian-brain?

- Works without Obsidian running — unlike Local REST API-based servers, obsidian-brain reads `.md` files directly from disk. Obsidian can be closed; your vault is just a folder.
- No Local REST API plugin required — nothing to install inside Obsidian for the core experience.
- Chunk-level semantic search with RRF hybrid retrieval — embeddings at markdown-heading granularity, fused with FTS5 BM25 via Reciprocal Rank Fusion. Finds the exact chunk, ranks on meaning.
- The only Obsidian MCP server with PageRank + Louvain + graph analytics — ask for your vault's most influential notes, bridging notes, theme clusters. Nobody else ships this.
- Ollama provider for high-quality local embeddings — switch to `bge-m3`, `nomic-embed-text`, `qwen3-embedding-8b`, etc. with one env var.
- All in one `npx` install — no clone, no build, no API key, no hosted endpoint. Vault content never leaves your machine.
## Quick start
Requires Node 20+ and an Obsidian vault (or any folder of .md files — Obsidian itself is optional).
Wire obsidian-brain into your MCP client. Example for Claude Desktop (~/Library/Application Support/Claude/claude_desktop_config.json):
```json
{
  "mcpServers": {
    "obsidian-brain": {
      "command": "npx",
      "args": ["-y", "obsidian-brain@latest", "server"],
      "env": { "VAULT_PATH": "/absolute/path/to/your/vault" }
    }
  }
}
```
Quit Claude Desktop (⌘Q on macOS) and relaunch. That's it.
> [!NOTE]
> On first boot the server auto-indexes your vault and downloads a ~34 MB embedding model. Tools may take 30–60 s to appear in the client. Subsequent boots are instant.
For every other MCP client (Claude Code, Cursor, VS Code, Jan, Windsurf, Cline, Zed, LM Studio, JetBrains AI, Opencode, Codex CLI, Gemini CLI, Warp): see Install in your MCP client.
→ Full env-var reference: Configuration
→ Model / preset / Ollama details: Embedding model
→ Migrating from aaronsb's plugin: Migration guide
## What you get

17 MCP tools grouped by intent:

- Find & read — `search`, `list_notes`, `read_note`
- Understand the graph — `find_connections`, `find_path_between`, `detect_themes`, `rank_notes`
- Write — `create_note`, `edit_note`, `apply_edit_preview`, `link_notes`, `move_note`, `delete_note`
- Live editor (requires companion plugin) — `active_note`, `dataview_query`, `base_query`
- Maintenance — `reindex`
→ Arguments, examples, and response shapes: Tool reference
## How it works
```mermaid
flowchart LR
    Client["<b>MCP Client</b><br/>Claude Desktop · Claude Code<br/>Cursor · Jan · Windsurf · ..."]
    subgraph OB ["obsidian-brain (Node process)"]
        direction TB
        SQL["<b>SQLite index</b><br/>nodes · edges<br/>FTS5 · vec0 embeddings"]
        Vault["<b>Vault on disk</b><br/>your .md files"]
        Vault -->|"parse + embed"| SQL
        SQL -.->|"writes"| Vault
    end
    Client <-->|"stdio JSON-RPC"| OB
```
Retrieval and writes both go through a SQLite index: reads are microsecond-cheap, writes land on disk immediately and incrementally re-index the affected file. Embeddings are chunk-level (heading-aware recursive chunker preserving code + LaTeX blocks), and search's default hybrid mode fuses chunk-level semantic rank with FTS5 BM25 via Reciprocal Rank Fusion.
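As a rough illustration of the fusion step, Reciprocal Rank Fusion scores each document by the sum of reciprocal ranks it earns across the input rankings. This is a minimal sketch, not obsidian-brain's actual implementation — the constant `k = 60` comes from the original RRF formulation, and the function and field names here are illustrative:

```typescript
// Reciprocal Rank Fusion: score(d) = Σ over rankings of 1 / (k + rank_d),
// with ranks starting at 1. Documents near the top of several lists win.
function rrfFuse(rankings: string[][], k = 60): { id: string; score: number }[] {
  const scores = new Map<string, number>();
  for (const list of rankings) {
    list.forEach((id, i) => {
      scores.set(id, (scores.get(id) ?? 0) + 1 / (k + i + 1));
    });
  }
  return [...scores.entries()]
    .map(([id, score]) => ({ id, score }))
    .sort((a, b) => b.score - a.score);
}

// Fuse a semantic ranking with a BM25 ranking of (hypothetical) chunk ids:
const fused = rrfFuse([
  ["intro#setup", "notes#rrf", "daily#todo"], // semantic order
  ["notes#rrf", "daily#todo", "intro#setup"], // FTS5 BM25 order
]);
// "notes#rrf" ranks first: it sits near the top of both lists.
```

Because RRF works on ranks rather than raw scores, the semantic and BM25 lists need no score normalization before fusing — which is the usual reason to pick it for hybrid retrieval.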
→ Deeper write-up — why stdio, why SQLite, why local embeddings: Architecture
→ Live watcher behaviour + debounces: Live updates
→ Scheduled reindex (macOS launchd / Linux systemd): Scheduled indexing (macOS) · (Linux)
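The heading-aware chunking mentioned above can be sketched as follows. This is an illustrative assumption, not the project's real chunker — the real one also recurses on size limits and preserves LaTeX blocks; this only shows the heading split with a fenced-code guard:

```typescript
// Split markdown into chunks at headings, but never split inside a
// fenced code block (a "# comment" inside a fence is not a heading).
function chunkByHeading(markdown: string): { heading: string; body: string }[] {
  const chunks: { heading: string; body: string }[] = [];
  let heading = "(preamble)";
  let body: string[] = [];
  let inFence = false;
  for (const line of markdown.split("\n")) {
    if (line.trimStart().startsWith("```")) inFence = !inFence;
    if (!inFence && /^#{1,6}\s/.test(line)) {
      // New heading outside any fence: close the current chunk.
      if (body.length) chunks.push({ heading, body: body.join("\n") });
      heading = line;
      body = [];
    } else {
      body.push(line);
    }
  }
  if (body.length) chunks.push({ heading, body: body.join("\n") });
  return chunks;
}
```

Each chunk then gets its own embedding row in the SQLite index, which is what lets search return the exact section of a note rather than the whole file.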
## Companion plugin (optional)
An optional Obsidian plugin at sweir1/obsidian-brain-plugin exposes live Obsidian runtime state — active editor, Dataview results, Bases rows — over a localhost HTTP endpoint. When installed and Obsidian is running, active_note, dataview_query, and base_query light up. Install via BRAT with repo ID sweir1/obsidian-brain-plugin.
Ship plugin and server at the same major.minor (server 1.6.x ↔ plugin 1.6.x). Patch-version drift is fine.
→ Security model, capability handshake, Dataview / Bases feature coverage: Companion plugin
## Troubleshooting

Three most common:

- "Connector has no tools available" in Claude Desktop — usually the server crashed at startup. Check `~/Library/Logs/Claude/mcp-server-obsidian-brain.log`. Fix: `npm install -g obsidian-brain@latest`, quit Claude (⌘Q), relaunch.
- `ERR_DLOPEN_FAILED` / `NODE_MODULE_VERSION` mismatch — `better-sqlite3` built against a different Node ABI. Fix: `PATH=/opt/homebrew/bin:$PATH npm rebuild -g better-sqlite3`.
- `Vault path not configured` — `VAULT_PATH` is unset. Set it in the `env` block of your client config or shell.
→ Full troubleshooting guide (watcher not firing, stale index, running multiple clients, timeouts, embedding-dim mismatch, log locations): docs/troubleshooting.md
## Recent releases

- v1.6.0 — agentic-writes safety: `dryRun` previews on write tools, new `apply_edit_preview(previewId)` tool, bulk `edits[]` arrays, `fuzzyThreshold` tuning, `from_buffer` recovery.
- v1.5.8 — stub-lifecycle fixes (`move_note` / `delete_note` no longer orphan stubs; forward-refs upgrade when the real note is created), FTS5 hyphen-query crash fix, `search({mode: 'hybrid', unique: 'chunks'})` now returns chunk metadata.
- v1.5.0 — Ollama embedding provider, `next_actions` response envelope, `move_note` inbound-link rewriting, graph-analytics credibility guards.
- v1.4.0 — chunk-level embeddings, hybrid RRF as default, pluggable `Embedder` interface, Louvain community detection, PageRank, Obsidian Bases via companion plugin.
→ Full changelog: docs/CHANGELOG.md · Forward plan: docs/roadmap.md · Build from source: docs/development.md
## Credits
Thanks to obra/knowledge-graph and aaronsb/obsidian-mcp-plugin for the ideas and code this project draws on. Also Xenova/transformers.js (local embeddings), graphology (graph analytics), and sqlite-vec (vector search in SQLite).
## License
MIT — see LICENSE.