notebooklm-mcp-secure
Security-hardened NotebookLM MCP with post-quantum encryption, GDPR/SOC2 compliance, and 14 security layers. Query Google's Gemini-grounded research from any MCP-compatible AI assistant.
NotebookLM MCP Server (Security Hardened)
The World's Most Advanced NotebookLM MCP Server
Zero-hallucination answers • Gemini Deep Research • 14 Security Layers • Enterprise Compliance
What's New 2026 • Deep Research • Document API • Create Notebooks • Security • Install
The only NotebookLM MCP with enterprise-grade security, post-quantum encryption, and full Gemini API integration.
Security-hardened fork of PleasePrompto/notebooklm-mcp • Maintained by Pantheon Security
TL;DR – What You Get
- Query your NotebookLM notebooks – source-grounded, zero-hallucination answers
- Create & manage notebooks programmatically – no manual clicking
- Generate audio overviews – podcast-style summaries of your docs
- Gemini Deep Research – comprehensive multi-source research (optional API)
- Document API – upload & query PDFs without browser (optional API)
- 14 security layers – post-quantum encryption, audit logs, secrets scanning
- Enterprise compliance – GDPR, SOC2, CSSF ready
- No API key required – core features work with just browser auth
What's New in 2026
Latest: v2026.1.8 – Updated dependencies, post-quantum crypto improvements
| Version | Highlights |
|---|---|
| v2026.1.8 | Major dependency updates (zod 4.x, dotenv 17.x, post-quantum 0.5.4) |
| v2026.1.7 | MCP Protocol UX: tool icons, human-friendly titles, behavior annotations |
| v2026.1.4 | Defense-in-depth path validation, security hardening |
| v2026.1.1 | Deep health checks, chat history extraction, context management |
# Quick install
claude mcp add notebooklm -- npx @pan-sec/notebooklm-mcp@latest
Why Choose This MCP?
| Capability | Other MCPs | This MCP |
|---|---|---|
| Query NotebookLM | ✅ Basic | ✅ + session management, quotas |
| Create notebooks programmatically | ❌ | ✅ UNIQUE |
| Gemini Deep Research | ❌ | ✅ EXCLUSIVE |
| Document API (no browser) | ❌ | ✅ EXCLUSIVE |
| Post-quantum encryption | ❌ | ✅ Future-proof |
| Enterprise compliance | ❌ | ✅ GDPR/SOC2/CSSF |
| Chat history extraction | ❌ | ✅ NEW |
| Deep health verification | ❌ | ✅ NEW |
Core NotebookLM (No API Key Required)
| Tool | Description |
|---|---|
ask_question | Query notebooks with source-grounded answers |
add_notebook | Add a notebook to your library |
list_notebooks | List all notebooks in library |
select_notebook | Set active notebook |
update_notebook | Update notebook metadata |
remove_notebook | Remove from library |
create_notebook | Programmatically create new notebooks |
batch_create_notebooks | Create multiple notebooks at once |
sync_library | Sync library with NotebookLM |
list_sources | List sources in a notebook |
add_source | Add source to notebook |
remove_source | Remove source from notebook |
generate_audio_overview | Create podcast-style audio |
get_audio_status | Check audio generation status |
download_audio | Download generated audio |
list_sessions | List active sessions |
close_session | Close a session |
reset_session | Reset session history |
get_health | Check server & auth status |
setup_auth | Initial authentication |
re_auth | Re-authenticate |
cleanup_data | Clean up local data |
get_quota | Check usage quotas |
set_quota_tier | Set quota tier |
get_query_history | View past queries |
get_notebook_chat_history | Extract browser chat history |
get_project_info | Get project context |
export_library | Export library backup |
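These tools are normally invoked by your AI assistant, but they can also be driven from your own scripts. Below is a minimal sketch using the official MCP TypeScript SDK (@modelcontextprotocol/sdk); the ask_question argument name shown (question) is an assumption for illustration, since this README does not list parameter schemas. Inspect listTools() for the real ones.

```typescript
// Minimal sketch (not from this repo): connect to the server over stdio with the
// official MCP TypeScript SDK and call ask_question. The argument name "question"
// is an assumption for illustration - check the tool's schema via listTools() first.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "@pan-sec/notebooklm-mcp@latest"],
});
const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);

// Inspect the real input schemas before calling anything.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

// Hypothetical call shape - adjust argument names to the published schema.
const answer = await client.callTool({
  name: "ask_question",
  arguments: { question: "What does source 3 say about key rotation?" },
});
console.log(answer.content);
```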
Gemini API (Optional - Requires GEMINI_API_KEY)
| Tool | Description |
|---|---|
deep_research | Comprehensive research agent |
gemini_query | Fast grounded queries |
get_research_status | Check research progress |
upload_document | Upload docs to Gemini |
query_document | Query uploaded documents |
query_chunked_document | Query large documents |
list_documents | List uploaded documents |
delete_document | Delete uploaded document |
Webhooks & Integrations
| Tool | Description |
|---|---|
configure_webhook | Set up webhook notifications |
list_webhooks | List configured webhooks |
test_webhook | Test webhook delivery |
remove_webhook | Remove a webhook |
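The README does not document the webhook tools' parameters, so the sketch below (reusing the client from the earlier core-tools example) uses purely hypothetical argument names such as url and events; treat it as a shape to adapt, not a reference.

```typescript
// Hypothetical sketch only: "url" and "events" are illustrative guesses,
// not documented parameters - verify the real schema via listTools().
const webhook = await client.callTool({
  name: "configure_webhook",
  arguments: {
    url: "https://hooks.example.com/notebooklm",   // assumed parameter name
    events: ["audio.ready", "research.completed"], // assumed event identifiers
  },
});
console.log(webhook.content);

// Then confirm delivery with the companion tool.
await client.callTool({ name: "test_webhook", arguments: {} });
```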
Enterprise Compliance (16 additional tools)
See Compliance Documentation for full list.
Gemini Deep Research (v1.8.0)
The most powerful research capability for AI agents – now in your MCP toolkit.
v1.8.0 introduces the Gemini Interactions API as a stable, API-based research backend alongside browser automation. This gives your agents access to Google's state-of-the-art Deep Research agent.
Why This Matters
| Challenge | Solution |
|---|---|
| Browser UI changes break automation | Gemini API is stable and versioned |
| Need comprehensive research but no research agent | Deep Research agent does it for you |
| Want current information with citations | Google Search grounding built-in |
| Need reliable, fast queries | API-based = no UI dependencies |
New Tools
deep_research – Comprehensive Research Agent
"Research the security implications of post-quantum cryptography adoption in financial services"
- Runs Google's Deep Research agent (same as Gemini Advanced)
- Takes 1-5 minutes for comprehensive, web-grounded analysis
- Returns structured answers with citations and sources
- Perfect for complex topics requiring multi-source synthesis
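As a rough sketch of a programmatic call (reusing the client from the core-tools example above): Deep Research can run for several minutes, so a longer per-request timeout is advisable. The query argument name and the SDK's per-request timeout option are assumptions to verify against the published schemas.

```typescript
// Sketch: kick off a Deep Research run and wait for the full answer.
// "query" is an assumed argument name; the timeout option is the SDK's
// standard RequestOptions.timeout (default is much shorter than 1-5 minutes).
const research = await client.callTool(
  {
    name: "deep_research",
    arguments: {
      query:
        "Security implications of post-quantum cryptography adoption in financial services",
    },
  },
  undefined,
  { timeout: 6 * 60 * 1000 } // allow up to 6 minutes
);
console.log(research.content);
```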
gemini_query – Fast Grounded Queries
"What are the latest CVEs for Log4j in 2025?" (with Google Search)
"Calculate the compound interest on $10,000 at 5% over 10 years" (with code execution)
"Summarize this security advisory: [URL]" (with URL context)
- Google Search grounding – Current information, not just training data
- Code execution – Run calculations, data analysis
- URL context – Analyze web pages on demand
- Models: gemini-2.5-flash (fast), gemini-2.5-pro (powerful), gemini-3-flash-preview (latest)
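A hedged sketch of a grounded query, again reusing the earlier client; the prompt, model, and tools argument names are assumptions, chosen only to mirror the capabilities listed above.

```typescript
// Sketch with assumed argument names (prompt/model/tools) - the exact schema may differ.
const grounded = await client.callTool({
  name: "gemini_query",
  arguments: {
    prompt: "What are the latest CVEs for Log4j in 2025?",
    model: "gemini-2.5-flash",
    tools: ["google_search"], // or "code_execution" / "url_context" per the list above
  },
});
console.log(grounded.content);
```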
get_research_status – Background Task Monitoring
Run deep research in the background and check progress:
"Start researching [topic] in the background"
... continue other work ...
"Check research status for interaction_abc123"
Hybrid Architecture
The NotebookLM MCP Server v2026.1.x combines two backends:

| BROWSER AUTOMATION (no API key needed) | GEMINI API (optional, needs API key) |
|---|---|
| QUERY: ask_question, get_notebook_chat_history | RESEARCH: deep_research, gemini_query, get_research_status |
| CREATE & MANAGE: create_notebook, batch_create_notebooks, manage_sources, generate_audio, sync_notebook | DOCUMENTS: upload_document, query_document, query_chunked_document, list/delete_document |
| HEALTH & SESSIONS (v2026): get_health (deep_check), get_query_history | Fast API, 48h retention, auto-chunking for large PDFs |

Both backends sit behind the same 14 security layers: post-quantum encryption, audit logs, certificate pinning, memory wiping, and GDPR/SOC2/CSSF readiness.
Gemini API is completely optional! All core NotebookLM features (ask_question, notebooks, sessions, audio) work via browser automation with no API key required. The Gemini tools below are bonus features for users who want direct API access.
Gemini Configuration (Optional)
# Only required if you want Gemini API features (deep_research, gemini_query, upload_document)
GEMINI_API_KEY=your-api-key # Get from https://aistudio.google.com/apikey
# Optional settings
GEMINI_DEFAULT_MODEL=gemini-2.5-flash # Default model
GEMINI_DEEP_RESEARCH_ENABLED=true # Enable Deep Research
GEMINI_TIMEOUT_MS=30000 # API timeout
When to Use Which
| Task | Best Tool | Why |
|---|---|---|
| Questions about YOUR documents | ask_question | Grounded on your uploaded sources |
| Comprehensive topic research | deep_research | Multi-source synthesis with citations |
| Current events / recent info | gemini_query + google_search | Live web data |
| Code calculations | gemini_query + code_execution | Reliable computation |
| Analyze a webpage | gemini_query + url_context | Direct page analysis |
| Quick PDF/document analysis | upload_document + query_document | Fast API, no browser (NEW!) |
Document API (v1.9.0)
Upload and query documents directly via the Gemini API – no browser automation needed.
v1.9.0 introduces the Gemini Files API for fast, reliable document analysis. Upload PDFs, analyze them instantly, and delete when done.
Why This Matters
| Feature | Browser Mode | Document API |
|---|---|---|
| Speed | Seconds | Milliseconds |
| Reliability | UI-dependent | API-stable |
| File Support | Via NotebookLM | 50MB PDFs, 1000 pages |
| Retention | Permanent | 48 hours |
| Setup | Auth + cookies | Just API key |
New Tools
upload_document – Fast Document Upload
Upload any document to Gemini for instant querying:
Upload /path/to/research-paper.pdf
- Supported: PDF (50MB, 1000 pages), TXT, MD, HTML, CSV, JSON, DOCX, images, audio, video
- 48-hour retention – files auto-expire, or delete manually
- Returns a file ID for querying
query_document – Ask Questions About Documents
"What are the main findings in this research paper?"
"Summarize section 3 of the document"
"Extract all statistics mentioned in the PDF"
- Full document understanding (text, tables, charts, diagrams)
- Multi-document queries (compare multiple files)
- Fast API response (no browser wait)
list_documents – See All Uploaded Files
List all my uploaded documents
Shows file names, sizes, MIME types, and expiration times.
delete_document – Clean Up Sensitive Files
Delete file xyz123
Immediately remove files (don't wait for 48h expiration).
Workflow Example
1. upload_document("/research/paper.pdf")
   → Returns: files/abc123
2. query_document("files/abc123", "What methodology was used?")
   → Returns: "The paper uses a mixed-methods approach combining..."
3. query_document("files/abc123", "List all cited authors")
   → Returns: "Smith et al. (2024), Johnson (2023)..."
4. delete_document("files/abc123")
   → File removed
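Expressed as raw MCP tool calls with the client from the earlier sketch, the same workflow looks roughly like this; the argument names (file_path, file_name, query) are assumptions, and the file id is whatever upload_document actually returns.

```typescript
// The same four-step workflow, as hypothetical MCP tool calls.
const uploaded = await client.callTool({
  name: "upload_document",
  arguments: { file_path: "/research/paper.pdf" }, // assumed argument name
});
// Suppose the response includes a file id such as "files/abc123".
const fileId = "files/abc123";

const methodology = await client.callTool({
  name: "query_document",
  arguments: { file_name: fileId, query: "What methodology was used?" },
});
console.log(methodology.content);

// Remove the file as soon as you're done instead of waiting for the 48h expiry.
await client.callTool({ name: "delete_document", arguments: { file_name: fileId } });
```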
Auto-Chunking for Large PDFs (v1.10.0)
No file size limits – PDFs of any size are automatically handled.
When you upload a PDF that exceeds Gemini's limits (50MB or 1000 pages), the system automatically:
- Detects the oversized PDF
- Splits it into optimal chunks (500 pages each)
- Uploads all chunks in parallel
- Returns chunk metadata for querying
upload_document("/research/massive-2000-page-report.pdf")
→ Returns:
{
"wasChunked": true,
"totalPages": 2000,
"chunks": [
{ "fileName": "files/abc1", "pageStart": 1, "pageEnd": 500 },
{ "fileName": "files/abc2", "pageStart": 501, "pageEnd": 1000 },
{ "fileName": "files/abc3", "pageStart": 1001, "pageEnd": 1500 },
{ "fileName": "files/abc4", "pageStart": 1501, "pageEnd": 2000 }
],
"allFileNames": ["files/abc1", "files/abc2", "files/abc3", "files/abc4"]
}
query_chunked_document – Query All Chunks at Once
For chunked documents, use this tool to query all parts and get an aggregated answer:
query_chunked_document(
file_names: ["files/abc1", "files/abc2", "files/abc3", "files/abc4"],
query: "What are the key recommendations in this report?"
)
→ Queries each chunk, then synthesizes a unified answer
When to Use Document API vs NotebookLM
| Scenario | Use |
|---|---|
| Quick one-off document analysis | Document API – fast, no setup |
| Building a permanent knowledge base | NotebookLM – permanent storage |
| Analyzing sensitive documents | Document API – 48h auto-delete |
| Multi-source research over time | NotebookLM – organized notebooks |
| CI/CD pipeline document processing | Document API – API-native |
| Large PDFs (1000+ pages) | Document API – auto-chunking |
Programmatic Notebook Creation (v1.7.0+)
Create NotebookLM notebooks entirely from code – no manual clicks required.
Most MCP servers can only read from NotebookLM. This one can create notebooks, add sources, and generate audio – all programmatically.
create_notebook – Build Notebooks Instantly
Create a complete notebook with multiple sources in one command:
{
"name": "Security Research 2025",
"sources": [
{ "type": "url", "value": "https://owasp.org/Top10" },
{ "type": "file", "value": "/path/to/security-report.pdf" },
{ "type": "text", "value": "Custom analysis notes...", "title": "My Notes" }
],
"description": "OWASP security best practices",
"topics": ["security", "owasp", "vulnerabilities"]
}
Supported source types:
- URL – Web pages, documentation, articles
- File – PDF, DOCX, TXT, and more
- Text – Raw text, code snippets, notes
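Wired through an MCP client, the same payload could be sent roughly as follows. This sketch reuses the earlier client; the payload shape is copied from the example above, while the surrounding callTool invocation is an assumption.

```typescript
// Sketch: the payload fields come straight from the README example above;
// only the wrapping callTool invocation is assumed.
await client.callTool({
  name: "create_notebook",
  arguments: {
    name: "Security Research 2025",
    sources: [
      { type: "url", value: "https://owasp.org/Top10" },
      { type: "file", value: "/path/to/security-report.pdf" },
      { type: "text", value: "Custom analysis notes...", title: "My Notes" },
    ],
    description: "OWASP security best practices",
    topics: ["security", "owasp", "vulnerabilities"],
  },
});
```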
batch_create_notebooks – Scale Up
Create up to 10 notebooks in a single operation:
{
"notebooks": [
{ "name": "React Docs", "sources": [{ "type": "url", "value": "https://react.dev/reference" }] },
{ "name": "Node.js API", "sources": [{ "type": "url", "value": "https://nodejs.org/api/" }] },
{ "name": "TypeScript Handbook", "sources": [{ "type": "url", "value": "https://www.typescriptlang.org/docs/" }] }
]
}
Perfect for:
- Setting up project documentation libraries
- Onboarding new team members with curated knowledge bases
- Creating topic-specific research notebooks in bulk
manage_sources – Dynamic Source Management
Add or remove sources from existing notebooks:
{
"notebook_id": "abc123",
"action": "add",
"sources": [{ "type": "url", "value": "https://new-documentation.com" }]
}
generate_audio – Audio Overview Creation
Generate NotebookLM's famous "Audio Overview" podcasts programmatically:
"Generate an audio overview for my Security Research notebook"
sync_notebook – Keep Sources Updated
Sync notebook sources from a local directory:
{
"notebook_id": "abc123",
"directory": "/path/to/docs",
"patterns": ["*.md", "*.pdf"]
}
Why This Matters
| Traditional Workflow | With This MCP |
|---|---|
| Manually create notebook in browser | create_notebook → done |
| Click "Add source" for each document | Batch add in single command |
| Navigate UI to generate audio | generate_audio → podcast ready |
| Update sources by hand | sync_notebook from local files |
Your agent can now build entire knowledge bases autonomously.
Query History & Chat Extraction (v2026.1.0)
Track your research and recover conversations from NotebookLM notebooks.
get_query_history – Review Past Research (v1.10.8)
All queries made through the MCP are automatically logged for review:
"Show me my recent NotebookLM queries"
"Find queries about security from last week"
"What did I ask the fine-tuning notebook?"
- Automatic logging – every Q&A pair saved with metadata
- Search – find specific topics across all queries
- Filter – by notebook, session, or date
- Quota tracking – see query counts and timing
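A possible programmatic call, with the caveat that the filter argument names (search, limit) are guesses rather than documented parameters.

```typescript
// Sketch with assumed filter names; verify against the tool's published schema.
const history = await client.callTool({
  name: "get_query_history",
  arguments: {
    search: "security", // find past questions mentioning "security"
    limit: 20,          // cap the number of returned entries
  },
});
console.log(history.content);
```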
get_notebook_chat_history – Extract Browser Conversations (v2026.1.0)
Extract conversation history directly from a NotebookLM notebook's chat UI with context management to avoid overwhelming your AI context window:
Quick audit (preview mode):
{ "notebook_id": "my-research", "preview_only": true }
Returns message counts without content – test the waters before extracting.
Export to file (avoids context overflow):
{ "notebook_id": "my-research", "output_file": "/tmp/chat-history.json" }
Dumps full history to disk instead of returning to context.
Paginate through history:
{ "notebook_id": "my-research", "limit": 20, "offset": 0 }
{ "notebook_id": "my-research", "limit": 20, "offset": 20 }
Page through large histories without loading everything at once.
Returns:
{
"notebook_url": "https://notebooklm.google.com/notebook/xxx",
"notebook_name": "My Research",
"total_messages": 150,
"returned_messages": 40,
"user_messages": 75,
"assistant_messages": 75,
"offset": 0,
"has_more": true,
"messages": [...]
}
Use cases:
- Recover conversations made directly in the NotebookLM browser (not tracked by MCP)
- Audit research – see what queries were made in a notebook
- Resume context – pick up where a previous session left off
- Quota reconciliation – understand why quota usage seems off
Why This Fork?
The original NotebookLM MCP is excellent for productivity – but MCP servers handle sensitive data:
- Browser sessions with Google authentication
- Cookies and tokens stored on disk
- Query history that may contain proprietary information
This fork adds 14 security hardening layers to protect that data.
Security Features
| Feature | Protection |
|---|---|
| Post-Quantum Encryption | ML-KEM-768 + ChaCha20-Poly1305 hybrid |
| Secrets Scanning | Detects 30+ credential patterns (AWS, GitHub, Slack...) |
| Certificate Pinning | Blocks MITM attacks on Google connections |
| Memory Scrubbing | Zeros sensitive data after use |
| Audit Logging | Tamper-evident logs with hash chains |
| Session Timeout | 8h hard limit + 30m inactivity auto-logout |
| MCP Authentication | Token-based auth with brute-force lockout |
| Response Validation | Detects prompt injection attempts |
| Input Validation | URL whitelisting, sanitization |
| Rate Limiting | Per-session request throttling |
| Log Sanitization | Credentials masked in all output |
| MEDUSA Integration | Automated security scanning |
| Cross-Platform | Native support for Linux, macOS, Windows |
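To make the "hash chains" item concrete, here is a conceptual TypeScript illustration of a tamper-evident log (not this fork's actual implementation): each entry commits to the previous entry's hash, so any retroactive edit invalidates the rest of the chain.

```typescript
// Conceptual illustration of a hash-chained audit log, for explanation only.
import { createHash } from "node:crypto";

interface AuditEntry {
  timestamp: string;
  event: string;
  prevHash: string; // hash of the previous entry ("GENESIS" for the first)
  hash: string;     // hash over this entry's fields + prevHash
}

function appendEntry(log: AuditEntry[], event: string): AuditEntry[] {
  const prevHash = log.length ? log[log.length - 1].hash : "GENESIS";
  const timestamp = new Date().toISOString();
  const hash = createHash("sha256")
    .update(`${timestamp}|${event}|${prevHash}`)
    .digest("hex");
  return [...log, { timestamp, event, prevHash, hash }];
}

function verifyChain(log: AuditEntry[]): boolean {
  return log.every((entry, i) => {
    const expectedPrev = i === 0 ? "GENESIS" : log[i - 1].hash;
    const recomputed = createHash("sha256")
      .update(`${entry.timestamp}|${entry.event}|${expectedPrev}`)
      .digest("hex");
    return entry.prevHash === expectedPrev && entry.hash === recomputed;
  });
}
```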
Post-Quantum Ready
Traditional public-key encryption (RSA, ECDH) can be broken by a sufficiently powerful quantum computer. This fork uses hybrid encryption:
ML-KEM-768 (Kyber) + ChaCha20-Poly1305
- ML-KEM-768: NIST-standardized post-quantum key encapsulation
- ChaCha20-Poly1305: Modern stream cipher (immune to timing attacks)
Even if one algorithm is broken, the other remains secure.
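For readers curious what the hybrid construction looks like in practice, here is a minimal sketch built on the same @noble libraries credited below; it is illustrative only, not this fork's code, and import paths may differ slightly between package versions.

```typescript
// Minimal hybrid-encryption sketch (illustrative, not this fork's implementation).
// Import paths may need (or omit) the ".js" suffix depending on package versions.
import { ml_kem768 } from "@noble/post-quantum/ml-kem.js";
import { chacha20poly1305 } from "@noble/ciphers/chacha.js";
import { randomBytes } from "node:crypto";

// Recipient generates a long-term ML-KEM-768 key pair.
const { publicKey, secretKey } = ml_kem768.keygen();

// Sender: encapsulate against the public key to get a 32-byte shared secret,
// then use it as the ChaCha20-Poly1305 key for the actual payload.
const { cipherText, sharedSecret } = ml_kem768.encapsulate(publicKey);
const nonce = randomBytes(12); // 96-bit nonce, unique per message
const sealed = chacha20poly1305(sharedSecret, nonce).encrypt(
  new TextEncoder().encode("session cookie material")
);

// Recipient: decapsulate to recover the same shared secret, then decrypt.
const recovered = ml_kem768.decapsulate(cipherText, secretKey);
const plaintext = chacha20poly1305(recovered, nonce).decrypt(sealed);
console.log(new TextDecoder().decode(plaintext));
```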
Cross-Platform Support
Full native support for all major operating systems:
| Platform | File Permissions | Data Directory |
|---|---|---|
| Linux | Unix chmod (0o600/0o700) | ~/.local/share/notebooklm-mcp/ |
| macOS | Unix chmod (0o600/0o700) | ~/Library/Application Support/notebooklm-mcp/ |
| Windows | ACLs via icacls (current user only) | %LOCALAPPDATA%\notebooklm-mcp\ |
All sensitive files (encryption keys, auth tokens, audit logs) are automatically protected with owner-only permissions on every platform.
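A small illustration of that owner-only pattern (not the fork's actual code): chmod on Unix-like systems, and an icacls ACL reset on Windows.

```typescript
// Illustration of owner-only file permissions across platforms.
import { chmodSync } from "node:fs";
import { execFileSync } from "node:child_process";

export function restrictToOwner(path: string): void {
  if (process.platform === "win32") {
    // Remove inherited ACEs, then grant full control to the current user only.
    const user = process.env.USERNAME ?? "";
    execFileSync("icacls", [path, "/inheritance:r", "/grant:r", `${user}:F`]);
  } else {
    chmodSync(path, 0o600); // owner read/write only
  }
}
```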
Enterprise Compliance (v1.6.0+)
Full compliance support for regulated industries:
| Regulation | Features |
|---|---|
| GDPR | Consent management, DSAR handling, right to erasure, data portability |
| SOC2 Type II | Hash-chained audit logs, incident response, availability monitoring |
| CSSF | 7-year retention, SIEM integration, policy documentation |
Compliance Tools (16 MCP tools)
compliance_dashboard - Real-time compliance status
compliance_report - Generate audit reports (JSON/CSV/HTML)
compliance_evidence - Collect evidence packages
grant_consent - Record user consent
submit_dsar - Handle data subject requests
request_erasure - Right to be forgotten
export_user_data - Data portability export
create_incident - Security incident management
...and 8 more
See COMPLIANCE-SPEC.md for full documentation.
Installation
What Works Out of the Box (No API Key)
All core NotebookLM features work immediately with just browser authentication:
| Feature | Tool | Description |
|---|---|---|
| Query notebooks | ask_question | Get source-grounded answers from your documents |
| Manage library | add_notebook, list_notebooks, etc. | Organize your notebook collection |
| Audio overviews | generate_audio_overview | Create podcast-style summaries |
| Create notebooks | create_notebook | Programmatically create new notebooks |
| Session management | list_sessions, reset_session | Manage conversation context |
| Chat history | get_notebook_chat_history | Extract past conversations |
| Health checks | get_health | Verify authentication status |
Optional: Add GEMINI_API_KEY for bonus features like deep_research, gemini_query, and upload_document.
Claude Code
claude mcp add notebooklm -- npx @pan-sec/notebooklm-mcp@latest
With Authentication + Gemini (Recommended)
claude mcp add notebooklm \
--env NLMCP_AUTH_ENABLED=true \
--env NLMCP_AUTH_TOKEN=$(openssl rand -base64 32) \
--env GEMINI_API_KEY=your-gemini-api-key \
-- npx @pan-sec/notebooklm-mcp@latest
Codex
codex mcp add notebooklm -- npx @pan-sec/notebooklm-mcp@latest
Cursor
Add to ~/.cursor/mcp.json:
{
"mcpServers": {
"notebooklm": {
"command": "npx",
"args": ["-y", "@pan-sec/notebooklm-mcp@latest"],
"env": {
"NLMCP_AUTH_ENABLED": "true",
"NLMCP_AUTH_TOKEN": "your-secure-token",
"GEMINI_API_KEY": "your-gemini-api-key"
}
}
}
}
Antigravity
Add to ~/.gemini/antigravity/mcp_config.json (macOS/Linux) or %USERPROFILE%\.gemini\antigravity\mcp_config.json (Windows):
{
"mcpServers": {
"notebooklm": {
"command": "npx",
"args": ["-y", "@pan-sec/notebooklm-mcp@latest"]
}
}
}
With optional env vars:
{
"mcpServers": {
"notebooklm": {
"command": "npx",
"args": ["-y", "@pan-sec/notebooklm-mcp@latest"],
"env": {
"GEMINI_API_KEY": "your-gemini-api-key"
}
}
}
}
Note: Antigravity does NOT support ${workspaceFolder} variables. Use absolute paths.
OpenCode
Add to ~/.config/opencode/opencode.json (global) or opencode.json in project root:
{
"$schema": "https://opencode.ai/config.json",
"mcp": {
"notebooklm": {
"type": "local",
"command": ["npx", "-y", "@pan-sec/notebooklm-mcp@latest"],
"enabled": true,
"environment": {
"GEMINI_API_KEY": "your-gemini-api-key"
}
}
}
}
Note: OpenCode uses "mcp" (not "mcpServers") and "command" is an array.
Windsurf
Add to ~/.codeium/windsurf/mcp_config.json:
{
"mcpServers": {
"notebooklm": {
"command": "npx",
"args": ["-y", "@pan-sec/notebooklm-mcp@latest"],
"env": {
"GEMINI_API_KEY": "your-gemini-api-key"
}
}
}
}
VS Code
Add to your VS Code settings.json:
{
"mcp": {
"servers": {
"notebooklm": {
"command": "npx",
"args": ["-y", "@pan-sec/notebooklm-mcp@latest"],
"env": {
"GEMINI_API_KEY": "your-gemini-api-key"
}
}
}
}
}
Other MCP Clients
Most MCP clients use this standard format:
{
"mcpServers": {
"notebooklm": {
"command": "npx",
"args": ["-y", "@pan-sec/notebooklm-mcp@latest"],
"env": {
"GEMINI_API_KEY": "your-gemini-api-key"
}
}
}
}
Common config locations:
| Client | Config File |
|---|---|
| Claude Desktop | ~/.config/claude/claude_desktop_config.json |
| Cursor | ~/.cursor/mcp.json |
| Antigravity | ~/.gemini/antigravity/mcp_config.json |
| OpenCode | ~/.config/opencode/opencode.json |
| Windsurf | ~/.codeium/windsurf/mcp_config.json |
Quick Start
1. Install (see above)
2. Authenticate
"Log me in to NotebookLM"
Chrome opens → sign in with Google
3. Add your notebook
Go to notebooklm.google.com → Create notebook → Upload docs → Share link
4. Use it
"Research [topic] using this NotebookLM: [link]"
5. Try Deep Research (NEW!)
"Use deep research to investigate [complex topic]"
Complete Tool Reference
Research Tools
| Tool | Description | Backend |
|---|---|---|
ask_question | Query your NotebookLM notebooks | Browser |
deep_research | Comprehensive research with citations | Gemini API |
gemini_query | Fast queries with grounding tools | Gemini API |
get_research_status | Check background research progress | Gemini API |
Notebook Management
| Tool | Description |
|---|---|
add_notebook | Add notebook to library |
list_notebooks | List all notebooks |
get_notebook | Get notebook details |
update_notebook | Update notebook metadata |
remove_notebook | Remove from library |
select_notebook | Set active notebook |
search_notebooks | Search by query |
Source Management (v1.7.0+)
| Tool | Description |
|---|---|
manage_sources | Add/remove/list sources |
generate_audio | Create Audio Overview |
sync_notebook | Sync sources from local files |
Session & System
| Tool | Description |
|---|---|
list_sessions | View active sessions |
close_session | Close a session |
reset_session | Reset session chat |
get_health | Server health check (with deep_check for UI verification) |
get_query_history | Review past queries with search/filter |
get_notebook_chat_history | Extract browser conversations (pagination, file export) |
setup_auth | Initial authentication |
re_auth | Re-authenticate |
cleanup_data | Deep cleanup utility |
get_library_stats | Library statistics |
get_quota | Check usage limits and remaining quota |
Compliance (v1.6.0+)
16 compliance tools for GDPR, SOC2, and CSSF requirements.
What Gets Protected
| Data | Protection |
|---|---|
| Browser cookies | Post-quantum encrypted at rest |
| Session tokens | Auto-expire + memory scrubbing |
| Query history | Audit logged with tamper detection |
| Google connection | Certificate pinned (MITM blocked) |
| Log output | Credentials auto-redacted |
| API responses | Scanned for leaked secrets |
| Gemini API key | Secure memory handling |
Configuration
All security features are enabled by default. Override via environment variables:
# Authentication
NLMCP_AUTH_ENABLED=true
NLMCP_AUTH_TOKEN=your-secret-token
# Gemini API (v1.8.0+)
GEMINI_API_KEY=your-api-key
GEMINI_DEFAULT_MODEL=gemini-2.5-flash
GEMINI_DEEP_RESEARCH_ENABLED=true
GEMINI_TIMEOUT_MS=30000
# Encryption
NLMCP_USE_POST_QUANTUM=true
NLMCP_ENCRYPTION_KEY=base64-32-bytes # Optional custom key
# Session Limits
NLMCP_SESSION_MAX_LIFETIME=28800 # 8 hours
NLMCP_SESSION_INACTIVITY=1800 # 30 minutes
# Secrets Scanning
NLMCP_SECRETS_SCANNING=true
NLMCP_SECRETS_BLOCK=false # Block on detection
NLMCP_SECRETS_REDACT=true # Auto-redact
# Certificate Pinning
NLMCP_CERT_PINNING=true
# Audit Logging
NLMCP_AUDIT_ENABLED=true
# Multi-Session Support (v2026.1.2+)
NOTEBOOK_PROFILE_STRATEGY=isolated # isolated|single|auto
NOTEBOOK_CLONE_PROFILE=true # Clone auth from base profile
Multi-Session Mode
Run multiple Claude Code sessions simultaneously with isolated browser profiles:
# Add to ~/.bashrc or ~/.zshrc
export NOTEBOOK_PROFILE_STRATEGY=isolated
export NOTEBOOK_CLONE_PROFILE=true
| Variable | Values | Description |
|---|---|---|
NOTEBOOK_PROFILE_STRATEGY | single, auto, isolated | isolated = separate profile per session |
NOTEBOOK_CLONE_PROFILE | true, false | Clone authenticated base profile into isolated instances |
How it works:
- Each session gets its own Chrome profile (no lock conflicts)
- Isolated profiles clone from the authenticated base profile
- Auth coordination ensures cloning waits for any in-progress authentication
See SECURITY.md for complete configuration reference.
Security Scanning
Run MEDUSA security scanner:
npm run security-scan
Or integrate in CI/CD:
- name: Security Scan
  run: npx @pan-sec/notebooklm-mcp && npm run security-scan
Comparison
vs Other NotebookLM MCPs
| Feature | Others | @pan-sec/notebooklm-mcp |
|---|---|---|
| Zero-hallucination Q&A | ✅ | ✅ |
| Library management | ✅ | ✅ |
| Create Notebooks Programmatically | ❌ | ✅ EXCLUSIVE |
| Batch Create (10 notebooks) | ❌ | ✅ EXCLUSIVE |
| Gemini Deep Research | ❌ | ✅ EXCLUSIVE |
| Document API (no browser) | ❌ | ✅ EXCLUSIVE |
| Auto-chunking (1000+ page PDFs) | ❌ | ✅ EXCLUSIVE |
| Chat History Extraction | ❌ | ✅ NEW |
| Deep Health Verification | ❌ | ✅ NEW |
| Query History & Search | ❌ | ✅ |
| Quota Management | ❌ | ✅ |
| Source Management (add/remove) | ❌ | ✅ |
| Audio Overview Generation | ❌ | ✅ |
| Sync from Local Directories | ❌ | ✅ |
Security & Compliance (Unique to This Fork)
| Feature | Others | @pan-sec/notebooklm-mcp |
|---|---|---|
| Cross-platform (Linux/macOS/Windows) | ⚠️ Partial | ✅ Full |
| Post-quantum encryption | ❌ | ✅ ML-KEM-768 + ChaCha20 |
| Secrets scanning | ❌ | ✅ 30+ patterns |
| Certificate pinning | ❌ | ✅ Google MITM protection |
| Memory scrubbing | ❌ | ✅ Zero-on-free |
| Audit logging | ❌ | ✅ Hash-chained |
| MCP authentication | ❌ | ✅ Token + lockout |
| Prompt injection detection | ❌ | ✅ Response validation |
| GDPR Compliance | ❌ | ✅ Full |
| SOC2 Type II | ❌ | ✅ Full |
| CSSF (Luxembourg) | ❌ | ✅ Full |
Bottom line: If you need more than basic queries, or care about security, there's only one choice.
Version History
| Version | Highlights |
|---|---|
| v2026.1.1 | Deep health check – verifies NotebookLM chat UI actually loads |
| v2026.1.0 | Chat history extraction with context management, CalVer versioning |
| v1.10.8 | Query history logging, quota tracking |
| v1.10.0 | Auto-chunking for large PDFs (1000+ pages) |
| v1.9.0 | Document API: upload, query, delete via Gemini Files API |
| v1.8.0 | Gemini Deep Research, Query with Grounding, Background Tasks |
| v1.7.0 | Programmatic notebook creation, batch operations, audio generation |
| v1.6.0 | Enterprise compliance: GDPR, SOC2 Type II, CSSF |
| v1.5.0 | Cross-platform support (Windows ACLs, macOS, Linux) |
| v1.4.0 | Post-quantum encryption, secrets scanning |
Reporting Vulnerabilities
Found a security issue? Do not open a public GitHub issue.
Email: support@pantheonsecurity.io
Credits
- Original MCP Server: Gérôme Dexheimer – notebooklm-mcp
- Security Hardening: Pantheon Security
- Post-Quantum Crypto: @noble/post-quantum
- Gemini API: Google AI
License
MIT β Same as original.
Security hardened with ❤️ by Pantheon Security
Powered by Google Gemini
Full Security Documentation • Compliance Guide • Report Vulnerability