notebooklm-mcp-cli
Programmatic access to Google NotebookLM — via command-line interface (CLI) or Model Context Protocol (MCP) server.
MCP Guide
Complete reference for the NotebookLM MCP server — 35 tools for AI assistants.
Installation
```shell
# Install the package
uv tool install notebooklm-mcp-cli

# Add to Claude Code
claude mcp add --scope user notebooklm-mcp notebooklm-mcp

# Add to Gemini CLI
gemini mcp add --scope user notebooklm-mcp notebooklm-mcp
```
Authentication
Before using MCP tools, authenticate:
```shell
nlm login
```
Tool Reference
Notebooks (6 tools)
| Tool | Description |
|------|-------------|
| notebook_list | List all notebooks |
| notebook_create | Create a new notebook |
| notebook_get | Get notebook details with sources |
| notebook_describe | Get AI summary and suggested topics |
| notebook_rename | Rename a notebook |
| notebook_delete | Delete notebook (requires confirm=True) |
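A minimal usage sketch of the notebook tools. Parameter names other than notebook_id and confirm are assumptions based on the descriptions above, not confirmed signatures:

```
notebook_list()                                        # list all notebooks
notebook_create(title="AI Research")                   # 'title' is an assumed parameter
notebook_get(notebook_id="...")                        # details plus sources
notebook_rename(notebook_id="...", title="New Name")   # 'title' is an assumed parameter
notebook_delete(notebook_id="...", confirm=True)       # confirm=True is required
```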
Sources (6 tools)
| Tool | Description |
|------|-------------|
| source_add | Unified - Add URL, text, file, or Drive source |
| source_list_drive | List sources with Drive freshness status |
| source_sync_drive | Sync stale Drive sources |
| source_delete | Delete source (requires confirm=True) |
| source_describe | Get AI summary with keywords |
| source_get_content | Get raw text content |
source_add parameters:
```
source_add(
    notebook_id="...",
    source_type="url",        # url | text | file | drive
    url="https://...",        # for source_type=url
    text="...",               # for source_type=text
    title="...",              # optional title
    file_path="/path/to.pdf", # for source_type=file
    document_id="...",        # for source_type=drive
    doc_type="doc",           # doc | slides | sheets | pdf
    wait=True,                # wait for processing to complete
    wait_timeout=120.0        # seconds to wait
)
```
Querying (2 tools)
| Tool | Description |
|------|-------------|
| notebook_query | Ask AI about sources in notebook |
| chat_configure | Set chat goal and response length |
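A sketch of the querying tools. The chat_configure parameter names (goal, response_length) are inferred from its description and should be treated as assumptions:

```
notebook_query(notebook_id="...", query="What are the key findings?")
chat_configure(notebook_id="...", goal="...", response_length="...")  # parameter names assumed
```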
Studio Content (4 tools)
| Tool | Description |
|------|-------------|
| studio_create | Unified - Create any artifact type |
| studio_status | Check generation progress |
| studio_delete | Delete artifact (requires confirm=True) |
| studio_revise | Revise slides in existing deck (requires confirm=True) |
studio_create artifact types:
- audio - Podcast (formats: deep_dive, brief, critique, debate)
- video - Video overview (formats: explainer, brief)
- report - Text report (Briefing Doc, Study Guide, Blog Post)
- quiz - Multiple choice quiz
- flashcards - Study flashcards
- mind_map - Visual mind map
- slide_deck - Presentation slides
- infographic - Visual infographic
- data_table - Structured data table
Downloads (1 tool)
| Tool | Description |
|------|-------------|
| download_artifact | Unified - Download any artifact type |
download_artifact types: audio, video, report, mind_map, slide_deck, infographic, data_table, quiz, flashcards
Exports (1 tool)
| Tool | Description |
|------|-------------|
| export_artifact | Export to Google Docs/Sheets |
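A sketch of export_artifact, assuming it mirrors download_artifact's artifact_type parameter (an assumption; the guide does not document its signature):

```
export_artifact(notebook_id="...", artifact_type="report")      # text artifacts -> Google Docs (assumed)
export_artifact(notebook_id="...", artifact_type="data_table")  # tabular artifacts -> Google Sheets (assumed)
```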
Research (3 tools)
| Tool | Description |
|------|-------------|
| research_start | Start web/Drive research |
| research_status | Poll research progress |
| research_import | Import discovered sources (timeout param for large notebooks) |
Notes (1 unified tool)
| Tool | Description |
|------|-------------|
| note | Unified - Manage notes (action: list, create, update, delete) |
note actions:
```
note(notebook_id, action="list")  # List all notes
note(notebook_id, action="create", content="...", title="...")
note(notebook_id, action="update", note_id="...", content="...")
note(notebook_id, action="delete", note_id="...", confirm=True)
```
Sharing (3 tools)
| Tool | Description |
|------|-------------|
| notebook_share_status | Get sharing settings |
| notebook_share_public | Enable/disable public link |
| notebook_share_invite | Invite collaborator by email |
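A sketch of the sharing tools. The enabled and email parameter names are inferred from the descriptions above and are assumptions:

```
notebook_share_status(notebook_id="...")
notebook_share_public(notebook_id="...", enabled=True)                 # 'enabled' is an assumed parameter
notebook_share_invite(notebook_id="...", email="colleague@example.com")  # 'email' is an assumed parameter
```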
Auth (2 tools)
| Tool | Description |
|------|-------------|
| refresh_auth | Reload auth tokens |
| save_auth_tokens | Save cookies (fallback method) |
Server (1 tool)
| Tool | Description |
|------|-------------|
| server_info | Get version and check for updates |
Batch & Cross-Notebook (2 tools)
| Tool | Description |
|------|-------------|
| batch | Unified - Batch operations across multiple notebooks (action: query, add_source, create, delete, studio) |
| cross_notebook_query | Query multiple notebooks and get aggregated answers with per-notebook citations |
batch actions:
```
batch(action="query", query="What are the key findings?", notebook_names="AI Research, Dev Tools")
batch(action="add_source", source_url="https://...", tags="ai,research")
batch(action="create", titles="Project A, Project B, Project C")
batch(action="delete", notebook_names="Old Project", confirm=True)
batch(action="studio", artifact_type="audio", tags="research", confirm=True)
```
cross_notebook_query:
```
cross_notebook_query(query="Compare approaches", notebook_names="Notebook A, Notebook B")
cross_notebook_query(query="Summarize", tags="ai,research")
cross_notebook_query(query="Everything", all=True)
```
Pipelines (1 tool)
| Tool | Description |
|------|-------------|
| pipeline | Unified - List or run multi-step workflows (action: list, run) |
pipeline actions:
```
pipeline(action="list")  # List available pipelines
pipeline(action="run", notebook_id="...", pipeline_name="ingest-and-podcast", input_url="https://...")
```
Built-in pipelines: ingest-and-podcast, research-and-report, multi-format
Tags & Smart Select (1 tool)
| Tool | Description |
|------|-------------|
| tag | Unified - Tag notebooks and find relevant ones (action: add, remove, list, select) |
tag actions:
```
tag(action="add", notebook_id="...", tags="ai,research,llm")
tag(action="remove", notebook_id="...", tags="ai")
tag(action="list")                      # List all tagged notebooks
tag(action="select", query="ai research")  # Find notebooks by tag match
```
Example Workflows
Research → Podcast
```
1. research_start(query="AI trends 2026", mode="deep")
2. research_status(notebook_id, max_wait=300)      # wait for completion
3. research_import(notebook_id, task_id, timeout=600)  # optional: increase for large notebooks
4. studio_create(notebook_id, artifact_type="audio", confirm=True)
5. studio_status(notebook_id)                      # poll until complete
6. download_artifact(notebook_id, artifact_type="audio", output_path="podcast.mp3")
```
Add Sources with Wait
```
source_add(notebook_id, source_type="url", url="https://...", wait=True)
# Returns when source is fully processed and ready for queries
```
Generate Study Materials
```
studio_create(notebook_id, artifact_type="quiz", question_count=10, confirm=True)
studio_create(notebook_id, artifact_type="flashcards", difficulty="hard", confirm=True)
studio_create(notebook_id, artifact_type="report", report_format="Study Guide", confirm=True)
```
Tag, Batch & Cross-Notebook
```
# Tag notebooks for organization
tag(action="add", notebook_id="abc", tags="ai,research")
tag(action="add", notebook_id="def", tags="ai,product")

# Find relevant notebooks
tag(action="select", query="ai research")

# Query across tagged notebooks
cross_notebook_query(query="What are the main conclusions?", tags="ai")

# Batch generate podcasts for all tagged notebooks
batch(action="studio", artifact_type="audio", tags="ai", confirm=True)
```
Pipeline Automation
```
# List available pipelines
pipeline(action="list")

# Run a full ingest-and-podcast workflow
pipeline(action="run", notebook_id="abc", pipeline_name="ingest-and-podcast", input_url="https://example.com")
```
Configuration
MCP Server Options
| Flag | Description | Default |
|------|-------------|---------|
| --transport | Protocol (stdio, http, sse) | stdio |
| --port | Port for HTTP/SSE | 8000 |
| --debug | Enable verbose logging | false |
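Putting the flags above together, a launch sketch (the binary name and flags come from this guide; the port value is arbitrary):

```shell
# Default: stdio transport, suitable for local IDE integration
notebooklm-mcp

# HTTP transport on a custom port, with verbose logging
notebooklm-mcp --transport http --port 8080 --debug
```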
Environment Variables
| Variable | Description |
|----------|-------------|
| NOTEBOOKLM_MCP_TRANSPORT | Transport type |
| NOTEBOOKLM_MCP_PORT | HTTP/SSE port |
| NOTEBOOKLM_MCP_DEBUG | Enable debug logging |
| NOTEBOOKLM_HL | Interface language and default artifact language (default: en) |
| NOTEBOOKLM_QUERY_TIMEOUT | Query timeout (seconds) |
| NOTEBOOKLM_BASE_URL | Override base URL for Enterprise/Workspace (default: https://notebooklm.google.com) |
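An example environment setup using the variables above. The specific values (and the use of 1 as the truthy value for NOTEBOOKLM_MCP_DEBUG) are assumptions for illustration:

```shell
# Configure the MCP server via environment instead of flags
export NOTEBOOKLM_MCP_TRANSPORT=http
export NOTEBOOKLM_MCP_PORT=8000
export NOTEBOOKLM_MCP_DEBUG=1            # truthy value assumed
export NOTEBOOKLM_HL=en
export NOTEBOOKLM_QUERY_TIMEOUT=120      # seconds; value is illustrative
```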
Context Window Tips
This MCP server exposes 35 tools, which consume context. Best practices:
- Disable when not using: in Claude Code, use @notebooklm-mcp to toggle
- Use unified tools: source_add, studio_create, and download_artifact each handle multiple operations
- Poll wisely: use studio_status sparingly; artifacts take 1-5 minutes to generate
IDE Configuration
The easiest way to configure any tool is with nlm setup:
```shell
nlm setup add claude-code  # Claude Code
nlm setup add gemini       # Gemini CLI
nlm setup add cursor       # Cursor
nlm setup add windsurf     # Windsurf
nlm setup add json         # Any other tool (interactive JSON generator)
```
Manual configuration
Claude Code
```shell
claude mcp add --scope user notebooklm-mcp notebooklm-mcp
```
Cursor / VS Code
Add to ~/.cursor/mcp.json or ~/.vscode/mcp.json:
```json
{
  "mcpServers": {
    "notebooklm-mcp": {
      "command": "/path/to/notebooklm-mcp"
    }
  }
}
```
Gemini CLI
```shell
gemini mcp add --scope user notebooklm-mcp notebooklm-mcp
```
相關伺服器
Follow Plan
Track and manage AI implementation plans.
agent-reader
Glama AAA-certified MCP server for document beautification. It bridges the "last mile" of AI content delivery by instantly converting Markdown into professional Word, PDF, HTML, and Slideshows.
mcp-todo
A simple to-do list manager to record, track, and complete daily tasks.
Linear
Query and search for issues in Linear, a project management tool.
Clanki - Claude's Anki Integration
Enables AI assistants to interact with Anki flashcard decks via the AnkiConnect plugin.
Asana
Interact with the Asana API to manage tasks, projects, and workspaces.
Planka
Interact with Planka, a Trello-like kanban board, to manage projects, boards, and cards. Requires Planka server URL and credentials.
CV Forge MCP
Forge powerful, ATS-friendly CVs tailored to any job - an MCP server for intelligent CV generation
UnifAI
Dynamically search and call tools using UnifAI Network
Multi-Carrier Shipping API — powered by Secureship
Secureship MCP gives AI assistants access to a multi-carrier shipping API covering rate comparison, label generation, package tracking, pickup scheduling, address book management, shipment history, customs documents, and more — across carriers like UPS, FedEx, Purolator, Canpar, and others. Browse 150+ live endpoint schemas, parameters, and auth details — always current, never stale.