
repocks

Turn your repository into a local RAG / MCP server.

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio boke0-repocks repocks start \
  --env OLLAMA_LLM="qwen3:4b" \
  --env OLLAMA_EMBEDDING_MODEL="mxbai-embed-large"

How to use

Repocks turns your collection of Markdown documents into a searchable knowledge base with AI-powered Q&A. The server indexes your Markdown files locally and exposes an MCP endpoint that can be queried by MCP clients such as Claude Desktop, Cline, or other compatible tools.

Use repocks index to build or refresh the index, and repocks start to run the MCP server so that connected clients can ask questions about your documentation. The built-in Ollama integration lets you choose the language model and embedding model via environment variables, so you can tune performance and accuracy for your data.

To integrate with Claude Desktop, configure an MCP server entry that points to the repocks start command so Claude can forward questions to your local knowledge base and receive answers.

How to install

Prerequisites:

  • Node.js 20.9.0 or higher
  • Ollama installed and running locally
  • npm, yarn, or pnpm available

Installation steps:

  1. Install Repocks globally:
npm install -g repocks
  2. Pull the Ollama models you plan to use (example):
ollama pull qwen3:4b
ollama pull mxbai-embed-large
  3. (Optional) Start Ollama locally to verify it's running:
ollama serve
  4. Build the index and run the MCP server:
repocks index
repocks start
  5. (Optional) To use Claude Desktop or another MCP client, add an MCP entry that points to the start command. Example MCP entry:
{
  "mcpServers": {
    "repocks": {
      "command": "repocks",
      "args": ["start"]
    }
  }
}
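If you want Claude Desktop to launch Repocks with specific Ollama models, the same entry can carry them via an env block (a sketch — the model names are the examples used elsewhere in this guide; substitute the models you actually pulled):

```json
{
  "mcpServers": {
    "repocks": {
      "command": "repocks",
      "args": ["start"],
      "env": {
        "OLLAMA_LLM": "qwen3:4b",
        "OLLAMA_EMBEDDING_MODEL": "mxbai-embed-large"
      }
    }
  }
}
```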

Additional notes

Tips and common considerations:

  • By default Repocks indexes ~/.repocks/**/*.md and ./docs/**/*.md. To customize, create repocks.config.json with a targets array.
  • You can switch AI models by setting environment variables: OLLAMA_LLM for the language model and OLLAMA_EMBEDDING_MODEL for embeddings. You can also point to a remote Ollama instance with OLLAMA_BASE_URL.
  • If you see 'No documents found', verify your repocks.config.json paths and ensure MD files exist in the specified locations, then re-run repocks index.
  • When running locally, ensure Ollama is accessible from the same host and port configured by your environment.
  • To integrate with Claude Desktop, add the MCP configuration under Developer > Model Context Protocol with the repocks start command, as shown in the README example.
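A minimal repocks.config.json using the targets array mentioned above might look like the following (a sketch — the first two glob patterns are the documented defaults, the third is an illustrative addition; check the project README for the exact schema):

```json
{
  "targets": [
    "~/.repocks/**/*.md",
    "./docs/**/*.md",
    "./notes/**/*.md"
  ]
}
```

After editing the config, re-run repocks index so the new paths are picked up.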
