ApeRAG
ApeRAG: Production-ready GraphRAG with multi-modal indexing, AI agents, MCP support, and scalable K8s deployment
docker compose up -d --pull always
How to use
ApeRAG is a production-ready retrieval-augmented generation (RAG) platform that combines hybrid retrieval (vector search, full-text search, and graph-based querying) with intelligent AI agents for enterprise knowledge management. Its MCP (Model Context Protocol) integration lets AI assistants interact with your knowledge base directly: browse collections, run hybrid searches, and query documents in natural language. Once started via Docker Compose, you can open the web UI and API documentation to manage data, configure AI agents, and test MCP-powered interactions. To enable MCP, point your MCP client at the ApeRAG MCP endpoint and authorize requests with an API key or an environment token, giving agents seamless tool access and knowledge retrieval during reasoning. ApeRAG can also orchestrate the MinerU-based advanced document parsing service alongside your retrieval pipelines for enhanced parsing of complex documents.
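To make the MCP integration concrete, the sketch below builds the kind of JSON-RPC request an MCP client sends over HTTP. This is an illustration only: real MCP clients (Claude Code and others) speak this protocol for you, and the endpoint URL and API key are the placeholders used elsewhere in this guide, not a tested request against ApeRAG.

```python
import json

# "tools/list" is a standard MCP method for discovering the tools a
# server exposes. Endpoint and key are placeholders from this guide.
endpoint = "https://rag.apecloud.com/mcp/"
headers = {
    "Authorization": "Bearer your-api-key-here",
    "Content-Type": "application/json",
}
payload = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}
body = json.dumps(payload)
print(endpoint)
print(body)
```

In practice you never construct this by hand; you register the server in your MCP client's configuration (see Step 5 below) and the client handles discovery and tool calls.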
How to install
Prerequisites:
- Docker and Docker Compose installed on your machine
- Git installed
- Optional: an API key for ApeRAG if you want to enable MCP access
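A quick sanity check before proceeding: the loop below only verifies that the required tools are on PATH (it does not pin versions; see the compatibility note under Additional notes).

```shell
# Check that the required tools are installed before cloning.
status=""
for tool in docker git; do
  if command -v "$tool" >/dev/null 2>&1; then
    status="$status $tool:ok"
  else
    status="$status $tool:missing"
  fi
done
echo "prerequisites:$status"
```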
Step 1: Clone the repository
git clone https://github.com/apecloud/ApeRAG.git
cd ApeRAG
Step 2: Prepare environment
- Copy the environment template and customize values (for example, API keys or service endpoints).
cp envs/env.template .env
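The sketch below shows one way to fill in a value non-interactively after copying the template. It runs in a scratch directory with a fabricated one-line template so it is self-contained; in the real repo you would operate on the copy of envs/env.template, whose actual variable names may differ (APERAG_API_KEY is the one used elsewhere in this guide).

```shell
# Scratch-directory sketch; substitute the real repo paths in practice.
workdir=$(mktemp -d)
printf 'APERAG_API_KEY=\n' > "$workdir/env.template"
cp "$workdir/env.template" "$workdir/.env"
# Set the key with sed instead of hand-editing the file:
sed -i 's/^APERAG_API_KEY=.*/APERAG_API_KEY=your-api-key/' "$workdir/.env"
cat "$workdir/.env"
```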
Step 3: Start ApeRAG with Docker Compose
docker compose up -d --pull always
This will pull and start the required services (web UI, API server, databases, etc.). Access the UI and docs after the containers are healthy.
Step 4: Verify MCP and endpoints
- Web Interface: http://localhost:3000/web/
- API Documentation: http://localhost:8000/docs
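A small poll loop can confirm both endpoints are reachable before you move on. This is a sketch using the default ports from this guide; it assumes curl is on PATH and that you run it after `docker compose up`.

```shell
# Poll a URL until it responds, or give up after N attempts (default 30).
wait_for() {
  url=$1
  tries=${2:-30}
  i=1
  while [ "$i" -le "$tries" ]; do
    if curl -fsS -o /dev/null "$url" 2>/dev/null; then
      echo "up: $url"
      return 0
    fi
    i=$((i + 1))
    sleep 1
  done
  echo "timed out: $url"
  return 1
}
# Usage (after the containers are started):
#   wait_for http://localhost:3000/web/ && wait_for http://localhost:8000/docs
```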
Step 5: Optional MCP configuration
If you plan to use MCP, create or update your MCP client configuration to point at the ApeRAG MCP endpoint (e.g., https://rag.apecloud.com/mcp/) and provide the API key via an Authorization header or the APERAG_API_KEY environment variable.
{
  "mcpServers": {
    "aperag-mcp": {
      "url": "https://rag.apecloud.com/mcp/",
      "headers": {
        "Authorization": "Bearer your-api-key-here"
      }
    }
  }
}
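To sanity-check a config like the one above before handing it to a client, you can load it and verify the two fields an MCP client needs. This is a minimal sketch, not part of ApeRAG; real MCP clients do their own validation. The server name "aperag-mcp" and URL come from the example config; substitute your own.

```python
import json

# The same config shown above, embedded for a self-contained check.
raw = """
{
  "mcpServers": {
    "aperag-mcp": {
      "url": "https://rag.apecloud.com/mcp/",
      "headers": {"Authorization": "Bearer your-api-key-here"}
    }
  }
}
"""
config = json.loads(raw)
for name, server in config["mcpServers"].items():
    # Per the notes below: the URL must be the full public endpoint,
    # and the API key travels in a Bearer Authorization header.
    assert server["url"].startswith("https://"), "use a full public URL"
    assert server["headers"]["Authorization"].startswith("Bearer "), \
        "API key goes in a Bearer Authorization header"
    print(f"{name}: OK ({server['url']})")
```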
Step 6: Stop and cleanup (optional)
docker compose down
Additional notes
Tips and common issues:
- Ensure Docker and Docker Compose versions are compatible with the repository's configuration. The Quick Start assumes docker compose v2 syntax.
- If you use a private API key for MCP, store it securely (e.g., via environment variables or a secrets manager) and do not commit keys to version control.
- The MCP endpoint URL in your configuration should be the full public URL of ApeRAG's MCP service; replace placeholder URLs with your actual deployment URL.
- The web UI runs on port 3000 and the API docs on port 8000; make sure those ports are accessible in your environment.
- If you enable MinerU-based parsing, you may need to start additional services or profiles (docray) as described in the Enhanced Document Parsing section of the docs.
- For production Kubernetes deployment, follow the Helm chart guidance under Kubernetes Deployment to achieve HA and scalability.