
docmole

Dig through any documentation with AI - MCP server for Claude, Cursor, and other AI assistants

Installation
Run this command in your terminal to add the MCP server to Claude Code.
claude mcp add --transport stdio vigtu-docmole bunx docmole serve --project my-docs

How to use

Docmole is an MCP server that lets AI assistants query any documentation site, either by indexing its content locally or by proxying Mintlify-powered sites. In Local RAG mode, Docmole crawls a documentation site, generates embeddings via OpenAI, stores them in LanceDB, and serves answers through the MCP interface. In Mintlify mode, Docmole acts as a proxy to Mintlify's AI Assistant API for instant, zero-setup results. Clients talk to Docmole through MCP, making it easy to integrate with Claude, Cursor, and other MCP-enabled tools. For the Local RAG workflow, configure OPENAI_API_KEY, then start the server (for example with bunx docmole serve --project my-docs). In Mintlify mode, you provide a Mintlify project ID and Docmole forwards requests to Mintlify automatically.

Once running, you can query Docmole from your MCP client using a standard mcpServers entry: specify a server named my-docs whose command launches the Docmole server via bunx for your project. The server handles embedding retrieval, semantic search, and keyword fallback, returning AI-assisted documentation results to your client.
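
As a sketch, such an mcpServers entry might look like the following (the server name my-docs and the bunx command mirror the examples above; the env block is only needed for Local RAG mode, and your-openai-key is a placeholder):

```json
{
  "mcpServers": {
    "my-docs": {
      "command": "bunx",
      "args": ["docmole", "serve", "--project", "my-docs"],
      "env": {
        "OPENAI_API_KEY": "your-openai-key"
      }
    }
  }
}
```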

How to install

Prerequisites:

  • Node.js or Bun runtime installed (Docmole is invoked via bunx in the examples).
  • Access to OpenAI API if using Local RAG mode (OPENAI_API_KEY).
  • DNS/network access to your documentation site (for crawling) or Mintlify project access for Mintlify mode.

Installation steps:

  1. Install Bun if you haven't already (required for Docmole's Local RAG mode):

    • curl -fsSL https://bun.sh/install | bash
  2. Install Docmole globally (via Bun), or keep using the bunx invocation shown in the Installation section:

    • bun install -g docmole
  3. Prepare environment variables (example for Local RAG mode):

    • OPENAI_API_KEY=your-openai-key
    • DOCMOLE_DATA_DIR=~/.docmole
  4. Start Docmole in Local RAG mode for a project (example):

    • bunx docmole serve --project my-docs

  5. Connect your MCP client by adding an mcpServers entry for Docmole (as described under "How to use") and deploy it as part of your MCP gateway setup.
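
The steps above can be condensed into a terminal session (a sketch for Local RAG mode; the Bun install command comes from Bun's official installer, and the serve command mirrors the Installation example):

```shell
# 1. Install Bun (skip if already installed)
curl -fsSL https://bun.sh/install | bash

# 3. Environment for Local RAG mode
export OPENAI_API_KEY=your-openai-key
export DOCMOLE_DATA_DIR=~/.docmole

# 4. Start Docmole for a project; embeddings are generated via OpenAI
#    and stored in LanceDB under DOCMOLE_DATA_DIR
bunx docmole serve --project my-docs
```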

Additional notes

Tips and considerations:

  • The Local RAG mode requires an OPENAI_API_KEY and may incur API usage costs; ensure you monitor usage.
  • The LanceDB vector store is used for local embedding storage; ensure the DOCMOLE_DATA_DIR has sufficient space for your index.
  • Mintlify-mode is zero-setup but requires access to Mintlify project IDs; use the -p <project-id> option.
  • If your docs site uses dynamic content, ensure the crawler can access it and that the embeddings cover the necessary pages.
  • For MCP integration, you can expose multiple Docmole configurations by adding more entries under mcpServers with unique names and corresponding command-args pairs.
  • Environment variables like OPENAI_API_KEY are sensitive; consider using a secrets manager in production deployments.
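
Building on the multi-configuration tip above, a client could expose several documentation indexes side by side; a hypothetical two-project configuration (the second project name, partner-docs, is illustrative) might look like:

```json
{
  "mcpServers": {
    "my-docs": {
      "command": "bunx",
      "args": ["docmole", "serve", "--project", "my-docs"]
    },
    "partner-docs": {
      "command": "bunx",
      "args": ["docmole", "serve", "--project", "partner-docs"]
    }
  }
}
```

Each entry gets a unique name, so your MCP client can route documentation queries to the right index.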
