notebooklm
MCP server for NotebookLM - Let your AI agents (Claude Code, Codex) research documentation directly with grounded, citation-backed answers from Gemini. Persistent auth, library management, cross-client sharing. Zero hallucinations, just your knowledge base.
claude mcp add --transport stdio roomi-fields-notebooklm-mcp node /path/to/notebooklm-mcp/dist/index.js \
  --env GOOGLE_ACCOUNT_FOR_AUTOMATION="your-google-account@example.com" \
  --env GOOGLE_APPLICATION_CREDENTIALS="path/to/credentials.json"
How to use
NotebookLM MCP Server exposes a programmable interface to NotebookLM over the MCP protocol. It supports MCP clients (Claude Code, Cursor, Codex, and others) as well as an HTTP REST API for automation workflows through n8n, Zapier, Make, or custom integrations. Typical usage is to run the Node.js MCP server, then register it with an MCP client (e.g., via Claude Code) under the server name notebooklm, pointing at the local server's entrypoint. The server enables Q&A with citations from NotebookLM, content-generation tasks (audio podcasts, video, infographics, reports, presentations, and data tables), and source management for notebooks. You can also call the HTTP endpoint directly to drive workflows programmatically.
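For MCP clients that read a JSON configuration file (such as Claude Desktop), the registration described above can be sketched as an mcpServers entry. This is an illustrative fragment, not taken from the repository's docs; the paths and environment-variable values are placeholders you would replace with your own:

```json
{
  "mcpServers": {
    "notebooklm": {
      "command": "node",
      "args": ["/path/to/notebooklm-mcp/dist/index.js"],
      "env": {
        "GOOGLE_ACCOUNT_FOR_AUTOMATION": "your-google-account@example.com",
        "GOOGLE_APPLICATION_CREDENTIALS": "path/to/credentials.json"
      }
    }
  }
}
```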
How to install
Prerequisites:
- Node.js v18+ and npm installed on your machine or server.
- Access to a Google account for NotebookLM automation.
- Optional: Docker for containerized deployment.

Install locally (Node.js MCP mode):
1) Clone the repository and install dependencies:
   git clone https://github.com/roomi-fields/notebooklm-mcp.git
   cd notebooklm-mcp
   npm install

2) Build the project (if required by the project setup):
   npm run build

3) Set up authentication (if using the HTTP API):
   npm run setup-auth
   npm run start:http  # Start HTTP REST API on port 3000 (optional)

4) Run the MCP server (Node.js mode):
   node dist/index.js  # Or your built entrypoint path, if different

Docker (optional):
1) Build the image: docker build -t notebooklm-mcp .
2) Run: docker run -d --name notebooklm-mcp -p 3000:3000 -p 6080:6080 -v notebooklm-data:/data notebooklm-mcp
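The docker run command above can equivalently be expressed as a Compose file. This is a sketch based only on the ports and volume shown in the run command, so verify it against the repository's deployment docs before relying on it:

```yaml
services:
  notebooklm-mcp:
    build: .
    container_name: notebooklm-mcp
    ports:
      - "3000:3000"   # HTTP REST API
      - "6080:6080"   # noVNC for browser-based Google login
    volumes:
      - notebooklm-data:/data

volumes:
  notebooklm-data:
```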
Additional notes
Tips and notes:
- The MCP server supports two interaction modes: MCP (via Claude Code, Cursor, Codex, or other MCP clients) and an HTTP REST API for automation pipelines.
- In environments with a GUI, prefer MCP mode (Claude Code/Cursor), since it authenticates via a Google login in a browser window. For headless setups, use the HTTP API and ensure OAuth is configured properly.
- Environment variables may include GOOGLE_ACCOUNT_FOR_AUTOMATION and GOOGLE_APPLICATION_CREDENTIALS; make sure the credentials are valid and have NotebookLM access.
- If you deploy with Docker, expose port 3000 for the API and port 6080 for noVNC-based Google login if you use the VNC workflow.
- The official notebooklm-mcp repository includes documentation for installation, configuration, API usage, and integration guides.
- For best results, keep dependencies up to date and refer to the deployment docs for NAS/Docker guidance.
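As a minimal sketch of the environment setup mentioned in the notes above (the variable names come from this document; the values are placeholders, not real credentials):

```shell
# Placeholder values -- substitute your own account and credentials path.
export GOOGLE_ACCOUNT_FOR_AUTOMATION="automation-account@example.com"
export GOOGLE_APPLICATION_CREDENTIALS="$HOME/notebooklm/credentials.json"

# Sanity-check that both variables are set before starting the server.
for var in GOOGLE_ACCOUNT_FOR_AUTOMATION GOOGLE_APPLICATION_CREDENTIALS; do
  [ -n "$(eval echo "\$$var")" ] || { echo "Missing $var" >&2; exit 1; }
done
echo "Environment OK"
```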
Related MCP Servers
n8n
An MCP server for Claude Desktop / Claude Code / Windsurf / Cursor that builds n8n workflows for you
browser-use-client
An MCP client for browser-use
autonomo
Tired of 'it works' lies? Autonomo MCP makes your AI prove it—on real hardware, right in your editor.
prometheus
A Model Context Protocol (MCP) server implementation that provides AI agents with programmatic access to Prometheus metrics via a unified interface.
architect
A powerful, self-extending MCP server for dynamic AI tool orchestration. Features sandboxed JS execution, capability-based security, automated rate limiting, marketplace integration, and a built-in monitoring dashboard. Built for the Model Context Protocol (MCP).
ai-suite
AI-Suite - n8n, Open WebUI, OpenCode, Llama.cpp/Ollama, Flowise, Langfuse, MCP Gateway and more!