repocks
Turn your repository into a local RAG / MCP server.
claude mcp add --transport stdio boke0-repocks repocks start \
  --env OLLAMA_LLM=qwen3:4b \
  --env OLLAMA_EMBEDDING_MODEL=mxbai-embed-large
How to use
Repocks turns your collection of Markdown documents into a searchable knowledge base with AI-powered Q&A. The server indexes your Markdown files locally and exposes an MCP endpoint that can be queried by MCP clients like Claude Desktop, Cline, or other compatible tools. Use repocks index to build or refresh the index, and repocks start to run the MCP server so that connected clients can ask questions about your documentation. The included Ollama integration lets you pick different language models and embedding models via environment variables, enabling you to tailor performance and accuracy to your data. To integrate with Claude Desktop, configure an MCP server entry that points to the repocks start command so Claude can forward questions and receive answers from your local knowledge base.
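The workflow above can be sketched as a short shell session. The model names here are examples taken from the install steps below; any Ollama models you have pulled can be substituted via the documented environment variables:

```shell
# Pick the Ollama models Repocks should use (examples; override to taste).
export OLLAMA_LLM=qwen3:4b
export OLLAMA_EMBEDDING_MODEL=mxbai-embed-large

# Build (or refresh) the local index from your Markdown files,
# then start the MCP server so clients like Claude Desktop can query it.
repocks index
repocks start
```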
How to install
Prerequisites:
- Node.js 20.9.0 or higher
- Ollama installed and running locally
- npm, yarn, or pnpm available
Installation steps:
- Install Repocks globally:
npm install -g repocks
- Ensure Ollama models are downloaded (example):
ollama pull qwen3:4b
ollama pull mxbai-embed-large
- (Optional) Start Ollama locally to verify it's running:
ollama serve
- Install any additional dependencies or tooling as needed by your environment.
- Build the index and start the MCP server:
repocks index
repocks start
- (Optional) Add Claude Desktop integration by configuring an MCP entry that points to the start command, if you plan to use Claude Desktop or other MCP clients. Example MCP entry:
{
  "mcpServers": {
    "repocks": {
      "command": "repocks",
      "args": ["start"]
    }
  }
}
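To pin specific Ollama models for the server, the same entry can carry them through the standard `env` field of the MCP configuration (the model names below are examples, not required values):

```json
{
  "mcpServers": {
    "repocks": {
      "command": "repocks",
      "args": ["start"],
      "env": {
        "OLLAMA_LLM": "qwen3:4b",
        "OLLAMA_EMBEDDING_MODEL": "mxbai-embed-large"
      }
    }
  }
}
```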
Additional notes
Tips and common considerations:
- By default Repocks indexes ~/.repocks/**/*.md and ./docs/**/*.md. To customize, create repocks.config.json with a targets array.
- You can switch AI models by setting environment variables: OLLAMA_LLM for the language model and OLLAMA_EMBEDDING_MODEL for embeddings. You can also point to a remote Ollama instance with OLLAMA_BASE_URL.
- If you see 'No documents found', verify your repocks.config.json paths and ensure MD files exist in the specified locations, then re-run repocks index.
- When running locally, ensure Ollama is accessible from the same host and port configured by your environment.
- To integrate with Claude Desktop, add the MCP configuration under Developer > Model Context Protocol with the repocks start command, as shown in the example above.
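If the default locations don't fit your layout, a minimal repocks.config.json might look like the following. The targets array is the documented option; the paths are examples, and any schema beyond targets is an assumption:

```json
{
  "targets": [
    "./docs/**/*.md",
    "./notes/**/*.md"
  ]
}
```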