
Grounded Docs MCP Server: Open-Source Alternative to Context7, Nia, and Ref.Tools

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio arabold-docs-mcp-server npx -y @arabold/docs-mcp-server@latest \
  --env OPENAI_API_KEY="optional" \
  --env DOCS_MCP_DATA_DIR="path to data directory (optional)"

How to use

Grounded Docs MCP Server provides a private, up-to-date documentation index for your AI coding assistant. It sources official docs from websites, GitHub repositories, npm, PyPI, and local files, enabling your AI to query the exact library versions you’re using. The server exposes an API and web UI to index, search, and retrieve relevant documentation content, and it supports multiple embedding strategies to improve semantic search quality. Typical use cases include giving your AI precise API references, code examples, and library behavior notes in the exact context of your project, reducing hallucinations and ensuring consistency with your installed dependencies.

To get started, install and run the server (via npx) and then connect your MCP client to the provided SSE endpoint. You can add docs-mcp-server as an MCP source in your client configuration, pointing to http://localhost:6280/sse. The server can index local folders or zip archives, and it supports embedding models to boost search quality. If you prefer Docker, you can run the official image to quickly stand up a containerized instance and expose port 6280 for client connections.
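For Claude Code specifically, the SSE connection described above can be registered from the command line. This is a sketch assuming the server is already running on its default port; the server name `docs-mcp-server` is just a label you choose:

```shell
# Register the locally running server as an SSE MCP source in Claude Code
# (assumes the server is listening on the default port 6280)
claude mcp add --transport sse docs-mcp-server http://localhost:6280/sse

# Confirm the entry was registered
claude mcp list
```

Other MCP clients accept the same `http://localhost:6280/sse` URL in their own configuration format.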

How to install

Prerequisites:

  • Node.js v22+ installed locally (for npx setup)
  • npm (comes with Node.js) or corepack enabled
  • Optional: Docker if you prefer running via container

Install and run with npx:

  1. Ensure Node.js and npm are available: node -v && npm -v

  2. Run the Docs MCP Server via npx (auto-installs the latest package): npx -y @arabold/docs-mcp-server@latest

  3. By default, the server will start and listen on port 6280. Open the web UI at http://localhost:6280 to add documentation and configure sources.
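To confirm the server came up on the default port, a quick reachability check can look like this (a sketch; the messages are illustrative, not output of the server itself):

```shell
# Check that the web UI responds on the default port 6280
curl -sf http://localhost:6280/ >/dev/null \
  && echo "docs-mcp-server is reachable" \
  || echo "server not responding on port 6280"
```

If the second message appears, make sure the npx process is still running and that nothing else is bound to port 6280.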

Optional: Run with Docker (alternative):

  1. Pull and run the official image:
     docker run --rm \
       -v docs-mcp-data:/data \
       -v docs-mcp-config:/config \
       -p 6280:6280 \
       ghcr.io/arabold/docs-mcp-server:latest \
       --protocol http --host 0.0.0.0 --port 6280

  2. Access the UI at http://localhost:6280 and configure indexing sources.
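To verify the containerized instance is running, you can filter `docker ps` by the image name. This is a sketch that skips gracefully on machines without Docker:

```shell
# List containers started from the official image
# (skips gracefully if Docker is not installed on this machine)
if command -v docker >/dev/null 2>&1; then
  docker ps --filter ancestor=ghcr.io/arabold/docs-mcp-server:latest
else
  echo "Docker not found on this machine"
fi
```

If the container is listed, the `0.0.0.0:6280->6280/tcp` port mapping should appear in the PORTS column.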

Additional notes

Tips and common issues:

  • If you’re indexing private or internal docs, note that the index is stored locally; configuring an embedding model can make semantic search over those sources more accurate.
  • Ensure your environment has access to the sources (firewalls or VPNs may block remote docs).
  • For best results, provide a mix of sources (web docs, GitHub repos, and local files) to cover API references and usage examples.
  • When using embeddings, monitor API usage and latency, especially if you’re routing through a provider like OpenAI.
  • If you upgrade the server version, reindexing may be helpful to recompute embeddings with newer models or parsers.
  • Environment variables: OPENAI_API_KEY can be provided to enable OpenAI embeddings; DOCS_MCP_DATA_DIR can be set to customize where indexed data is stored.
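Putting the environment variables together, a local setup might look like this (the key value is a placeholder and the data path is an assumption, not a default documented here):

```shell
# Illustrative environment setup before launching the server
export OPENAI_API_KEY="sk-your-key-here"         # placeholder; enables OpenAI embeddings
export DOCS_MCP_DATA_DIR="$HOME/.docs-mcp/data"  # assumed custom index storage location
mkdir -p "$DOCS_MCP_DATA_DIR"
```

With these set in the shell, `npx -y @arabold/docs-mcp-server@latest` will pick them up from the environment.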
