
notebooklm

MCP server for Google NotebookLM — 32 tools for notebooks, sources, research, and studio content generation

Installation
Run this command in your terminal to add the MCP server to Claude Code:

claude mcp add --transport stdio m4yk3ldev-notebooklm-mcp npx -y @m4ykeldev/notebooklm-mcp serve

How to use

NotebookLM MCP Server exposes a local MCP-compliant interface that bridges Google NotebookLM with your AI tools, IDEs, and terminals. It lets you manage notebooks, ingest sources (URLs, transcripts, Drive documents, or raw text), and run autonomous research tasks or content-generation workflows directly from your preferred assistant or developer environment.

The server exposes a stable set of tools you can invoke from an AI assistant (such as Claude or Cursor) or from your terminal, including notebook_list, notebook_create, notebook_add_url, and research_start, plus studio generation tools like audio_overview_create and slide_deck_create. To get started, run the MCP server via npx and authenticate using the built-in one-click login, which stores session data locally for continued access.

Once running, you can configure your AI agents to call the provided tools and reference specific notebooks and sources. For example, you can ask your assistant to create a new NotebookLM project, ingest a set of URLs, then launch a research task and import findings back into the notebook. The Studio tools enable generating podcast-style audio overviews, briefing docs, infographics, slide decks, and quizzes directly from your notebook’s sources, all within the MCP framework.
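Under the hood, each step of that workflow is an MCP tools/call request. The sketch below shows what the JSON-RPC messages for the create → ingest → research sequence might look like; the tool names come from this package, but the argument names (title, url, query) and the need for a notebook identifier are assumptions, so check the tool schemas the server reports at startup:

```json
{"jsonrpc": "2.0", "id": 1, "method": "tools/call",
 "params": {"name": "notebook_create", "arguments": {"title": "Competitive research"}}}

{"jsonrpc": "2.0", "id": 2, "method": "tools/call",
 "params": {"name": "notebook_add_url", "arguments": {"url": "https://example.com/report"}}}

{"jsonrpc": "2.0", "id": 3, "method": "tools/call",
 "params": {"name": "research_start", "arguments": {"query": "Summarize key findings"}}}
```

In practice your assistant emits these calls for you; you only describe the task in natural language.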

To integrate with Claude Desktop, Claude Code, Cursor, or VS Code, add a notebooklm server entry to your mcpServers configuration and point it to the serve command provided by the MCP package. This keeps your workflow consistent across tools and platforms while leveraging NotebookLM’s deep research and content-generation capabilities.
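A minimal sketch of that mcpServers entry, assuming the standard JSON configuration schema used by Claude Desktop, Cursor, and VS Code; the command and arguments mirror the npx invocation from the Installation section:

```json
{
  "mcpServers": {
    "notebooklm": {
      "command": "npx",
      "args": ["-y", "@m4ykeldev/notebooklm-mcp", "serve"]
    }
  }
}
```

The exact file location varies by tool (for example, claude_desktop_config.json for Claude Desktop), so consult your client's MCP documentation for where to place this snippet.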

How to install

Prerequisites:

  • Node.js (recommended LTS) and npm installed on your machine
  • Internet access to fetch the MCP package from npm

  1. Install Node.js and npm if they are not already installed. Download them from https://nodejs.org/ or use your system package manager.

  2. Optionally install the NotebookLM MCP package globally (recommended for faster startup):

npm install -g @m4ykeldev/notebooklm-mcp

  3. Run the MCP server with npx, as shown in the Installation section:

npx -y @m4ykeldev/notebooklm-mcp serve

  4. Optionally configure your AI tools or editors to point to the MCP server using the mcpServers configuration snippet (see "How to use"). The server handles authentication via a one-click flow and stores session data locally.

  5. If you prefer a local project install instead of npx or a global install, run the package via npm scripts or your project's setup, ensuring that node_modules/.bin is on your PATH.

Additional notes

Tips and considerations:

  • Authentication flow uses a secure login window; your session data is stored locally (e.g., ~/.notebooklm-mcp/auth.json).
  • If the session expires, the server will transparently attempt to reconnect without interrupting your workflow.
  • You can customize timeouts for large sources or long-running queries using options like --query-timeout when starting the server.
  • Ensure your environment has network access to Google NotebookLM endpoints for ingestion and authentication.
  • The MCP setup is designed to work with major AI assistants and IDEs; for integration, use the provided mcpServers snippet and the npm package name shown above.
  • This project is an independent community effort and may rely on internal NotebookLM endpoints that could change; keep the MCP package updated to stay aligned with platform changes.
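Putting the timeout tip into practice, a launch command might look like the following; the --query-timeout flag name comes from the notes above, but the value and its units are assumptions, so check the package's --help output for the exact syntax:

```shell
# Start the server with a longer query timeout for large sources or
# long-running research tasks. The value/units shown here are an
# assumption -- verify with: npx -y @m4ykeldev/notebooklm-mcp serve --help
npx -y @m4ykeldev/notebooklm-mcp serve --query-timeout 120000
```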
