
qurio

Self-hosted RAG engine for AI coding assistants. Ingests technical docs & code repositories locally with structure-aware chunking. Serves grounded context via MCP to prevent hallucinations in software development workflows.

Installation
Run these commands in your terminal to start the Qurio stack and register its HTTP MCP endpoint with Claude Code.
Run in terminal:
Command
docker compose up -d
claude mcp add --transport http irahardianto-qurio http://localhost:8081/mcp

How to use

Qurio is a self-hosted, open-source ingestion and retrieval engine that serves as a local shared library for AI coding assistants via the MCP (Model Context Protocol). It runs as a microservices stack orchestrated by Docker Compose, exposing a stateless MCP endpoint that your MCP-enabled editors and agents can connect to.

With Qurio, you can ingest heterogeneous documentation (web pages, PDFs, Markdown), enable hybrid search combining BM25 and vector embeddings, and optionally rerank results using providers like JinaAI or Cohere.

The MCP endpoint is straightforward to consume: connect your agent to http://localhost:8081/mcp and issue JSON-RPC 2.0 requests against your indexed knowledge. Available tools:

  • qurio_search — search the knowledge base
  • qurio_list_sources — enumerate data sources
  • qurio_list_pages — explore documents within a source
  • qurio_read_page — fetch the full content of a document

This setup lets your AI agents query precise, trusted documentation locally, reducing privacy risk and latency.
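As a sketch of what such a request looks like, the snippet below builds a JSON-RPC 2.0 `tools/call` payload for qurio_search and POSTs it to the local endpoint. The argument schema (a `query` field) is an assumption for illustration; check the server's `tools/list` response for the real schema.

```python
import json
import urllib.request

MCP_URL = "http://localhost:8081/mcp"  # Qurio's stateless MCP endpoint


def build_tool_call(tool_name: str, arguments: dict, request_id: int = 1) -> dict:
    """Build a JSON-RPC 2.0 request that invokes an MCP tool."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }


def call_qurio(payload: dict) -> dict:
    """POST the request to the MCP endpoint and return the parsed response."""
    req = urllib.request.Request(
        MCP_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            # Streamable HTTP servers may answer with plain JSON or SSE:
            "Accept": "application/json, text/event-stream",
        },
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.loads(resp.read())


# The "query" argument name is an assumption, not confirmed by the docs.
search = build_tool_call("qurio_search", {"query": "how does ingestion chunk Markdown?"})

if __name__ == "__main__":
    try:
        print(call_qurio(search))
    except OSError as exc:  # stack not running locally
        print(f"Qurio not reachable: {exc}")
```

In practice, most MCP-enabled editors handle this handshake for you; the raw request is mainly useful for debugging the endpoint with curl or a script.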

How to install

Prerequisites:

  • Docker and Docker Compose installed on your machine
  • A Google Gemini API Key for embeddings (if you plan to use vector search)

Installation steps:

  1. Clone the repository and enter the project directory:

     git clone https://github.com/irahardianto/qurio.git
     cd qurio

  2. Configure environment: copy the example environment file, then edit .env to set GEMINI_API_KEY (and any other needed keys):

     cp .env.example .env

  3. Start the system: docker-compose up -d

    This will start Weaviate, PostgreSQL, Go backend, Python ingestion worker, and Vue frontend via Docker Compose. Give it a minute to initialize all services.

  4. Access the dashboard: Open http://localhost:3000 in your browser to manage sources and settings.

  5. Add API keys (optional but recommended): In the dashboard, go to Settings and add your Gemini (embeddings) key, and optionally JinaAI or Cohere keys for reranking.
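Pulling the steps above together, a configured .env might look like the following sketch. GEMINI_API_KEY and RERANK_PROVIDER are named elsewhere in this guide; the other variable names and accepted values are assumptions, so treat .env.example in the repository as authoritative.

```shell
# .env — illustrative sketch only; defer to .env.example for real variable names
GEMINI_API_KEY=your-gemini-api-key   # required for vector embeddings
RERANK_PROVIDER=jinaai               # optional reranking provider (value assumed)
JINAAI_API_KEY=your-jina-api-key     # only needed if reranking is enabled
```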

Optional:

  • If you modify environment variables, restart services with: docker-compose restart
  • To view logs: docker-compose logs -f

Additional notes

Tips and common issues:

  • Ensure Docker Desktop is running and that ports 3000 (UI) and 8081 (MCP endpoint) are accessible on localhost.
  • If the UI reports missing services, check docker-compose ps to verify all containers are running and inspect logs with docker-compose logs.
  • The MCP endpoint is stateless and supports streaming HTTP transport. Use a client that supports native MCP connections over HTTP.
  • For offline use, ensure data stores (Weaviate, PostgreSQL) initialize properly; give the system a couple of minutes after docker-compose up for full readiness.
  • Environment variables in .env control embeddings and reranking; verify GEMINI_API_KEY and RERANK_PROVIDER values before starting.
  • The available MCP tools (qurio_search, qurio_list_sources, qurio_list_pages, qurio_read_page) enable you to perform end-to-end data discovery and retrieval from your indexed corpus.
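The four tools above compose into a discovery-then-retrieval loop: list sources, list pages in a source, then read a page. The request sequence can be sketched as below; the argument names (source_id, page_id) and their values are assumptions for illustration, not confirmed parameter names.

```python
import json


def tool_call(name: str, arguments: dict, request_id: int) -> dict:
    """Build a JSON-RPC 2.0 tools/call request for the MCP endpoint."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    }


# Hypothetical discovery workflow; argument names are illustrative only.
workflow = [
    tool_call("qurio_list_sources", {}, 1),
    tool_call("qurio_list_pages", {"source_id": "docs"}, 2),
    tool_call("qurio_read_page", {"page_id": "getting-started"}, 3),
]

for request in workflow:
    print(json.dumps(request))
```

Each response feeds the next request: an agent picks a source from the first result, a page from the second, and grounds its answer in the full text returned by qurio_read_page.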
