adk-mcp-rag
A Retrieval-Augmented Generation (RAG) system that leverages Google's Agent Development Kit (ADK) and Qdrant vector database via MCP server.
claude mcp add --transport stdio khoi03-adk-mcp-rag python main.py \
  --env QDRANT_URL="http://qdrant:6333" \
  --env FASTMCP_HOST="0.0.0.0" \
  --env FASTMCP_PORT="8888" \
  --env NETWORK_NAME="mcp-servers" \
  --env GOOGLE_API_KEY="YOUR_VALUE_HERE" \
  --env OPENAI_API_KEY="YOUR_VALUE_HERE" \
  --env COLLECTION_NAME="default_collection" \
  --env ANTHROPIC_API_KEY="YOUR_VALUE_HERE" \
  --env QDRANT_SEARCH_LIMIT="3" \
  --env QDRANT_CONTAINER_NAME="qdrant-mcp" \
  --env QDRANT_EMBEDDING_MODEL="sentence-transformers/all-MiniLM-L6-v2"
How to use
This MCP server implements a Retrieval-Augmented Generation workflow that uses Google's ADK for LLM capabilities and a Qdrant vector store accessed via the MCP interface. The server ingests documents, creates embeddings, and stores them in Qdrant, enabling semantic search to augment LLM responses with relevant context. The built-in ADK-UI offers tracing, testing, and debugging capabilities to help you observe tool interactions and agent behavior as you issue queries. To use it, start the Qdrant MCP stack (via Docker Compose) and run the Python-based MCP server entrypoint (main.py). Then you can push documents, test vector retrieval, and issue queries to obtain context-enriched answers from your LLM.
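The retrieval-augmentation flow described above can be sketched in plain Python. This is an illustrative toy, not the server's code: a bag-of-words similarity stands in for the sentence-transformers/all-MiniLM-L6-v2 embeddings, and the `retrieve` function stands in for the ranking Qdrant performs server-side.

```python
import math
import re
from collections import Counter

def embed(text):
    # Toy bag-of-words "embedding"; the real server embeds with
    # sentence-transformers and stores the vectors in Qdrant.
    return Counter(re.findall(r"\w+", text.lower()))

def cosine(a, b):
    dot = sum(a[t] * b.get(t, 0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, limit=3):
    # Qdrant does this ranking server-side and returns the top
    # QDRANT_SEARCH_LIMIT matches.
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:limit]

def build_prompt(query, docs):
    # Retrieved chunks are prepended as context before the LLM call.
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Context:\n{context}\n\nQuestion: {query}"

docs = [
    "Qdrant is a vector database for similarity search.",
    "The ADK provides agent tooling around Google's LLMs.",
    "Docker Compose starts the Qdrant MCP stack.",
]
print(build_prompt("What is Qdrant?", docs))
```

The same shape, query embedding, top-k vector search, prompt assembly, is what the server executes against the real vector store.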
How to install
Prerequisites:
- Docker and Docker Compose installed for Qdrant MCP stack
- Python 3.8+ and a Python virtual environment tool (uv as per project guidance)
- Access to required API keys (Google, OpenAI, Anthropic, etc.)
Installation steps:
- Clone the repository
git clone https://github.com/khoi03/adk-mcp-rag.git
cd adk-mcp-rag
- Set up Python environment using uv (as recommended by the project)
uv sync
- Activate the virtual environment
macOS/Linux
source .venv/bin/activate
Windows
.venv\Scripts\activate
- Install/update Python dependencies
```bash
uv add -r requirements.txt
```
- Configure environment variables for the MCP server (see docker/.env.example): create a .env file under docker/ with the following keys:
GOOGLE_API_KEY=YOUR_VALUE_HERE
OPENAI_API_KEY=YOUR_VALUE_HERE
ANTHROPIC_API_KEY=YOUR_VALUE_HERE
NETWORK_NAME=mcp-servers
QDRANT_CONTAINER_NAME=qdrant-mcp
QDRANT_URL=http://qdrant:6333
QDRANT_MCP_SSE=http://localhost:8888/sse
COLLECTION_NAME=default_collection
QDRANT_EMBEDDING_MODEL=sentence-transformers/all-MiniLM-L6-v2
QDRANT_SEARCH_LIMIT=3
FASTMCP_HOST=0.0.0.0
FASTMCP_PORT=8888
- Build and start the Qdrant MCP stack using Docker Compose
# Build and start services
docker compose -p qdrant-mcp up --build -d
# Check running services
docker compose ps
# View logs
docker compose logs -f
- Ingest documents and run the MCP server
# Document ingestion (local data to Qdrant)
python local_vector_store/prepare_corpus_and_data_locally.py
# Start the MCP server (uses main.py as entrypoint)
python main.py
- Optional: access the built-in ADK UI for debugging and tracing
adk web
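Before ingesting documents, it can help to confirm the Qdrant container is actually reachable. The sketch below is not part of the repo; `qdrant_ready` is a hypothetical helper that polls Qdrant's REST `/collections` endpoint and checks for a `"status": "ok"` response.

```python
import json
import urllib.request

def qdrant_ready(base_url, fetch=None):
    # Hypothetical helper: True once Qdrant's REST API answers on
    # /collections with status "ok". `fetch` is injectable for testing.
    fetch = fetch or (lambda url: urllib.request.urlopen(url, timeout=5).read())
    try:
        body = json.loads(fetch(f"{base_url}/collections"))
        return body.get("status") == "ok"
    except Exception:
        return False

if __name__ == "__main__":
    # QDRANT_URL inside the Docker network is http://qdrant:6333;
    # from the host, the published port is typically on localhost.
    print(qdrant_ready("http://localhost:6333"))
```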
Additional notes
Tips and common issues:
- Ensure the .env variables match your deployment (especially API keys and Qdrant endpoints).
- If Qdrant fails to start, verify that the Docker network name and container names are accessible from the MCP host.
- The MCP server relies on the Qdrant embeddings model specified (QDRANT_EMBEDDING_MODEL). If you change models, re-embed and re-ingest data.
- For large corpora, adjust QDRANT_SEARCH_LIMIT to balance latency and retrieval quality.
- When using ADK-UI, ensure the ADK components are reachable from your environment; check ADK version compatibility with your LLM backend.
- If you run into environment variable issues in uv, confirm you are inside the virtual environment before running python commands.
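As a sanity check on the variables discussed above, here is an illustrative settings loader. The `load_settings` helper is not the server's actual code; it simply reads the same keys docker/.env defines, with the documented defaults.

```python
import os

def load_settings(env=None):
    # Illustrative: reads the variables docker/.env defines,
    # falling back to the documented defaults.
    env = os.environ if env is None else env
    return {
        "qdrant_url": env.get("QDRANT_URL", "http://qdrant:6333"),
        "collection": env.get("COLLECTION_NAME", "default_collection"),
        "embedding_model": env.get(
            "QDRANT_EMBEDDING_MODEL", "sentence-transformers/all-MiniLM-L6-v2"
        ),
        # Cast once here so a malformed value fails fast at startup.
        "search_limit": int(env.get("QDRANT_SEARCH_LIMIT", "3")),
    }

print(load_settings({}))
```

Validating the whole configuration in one place like this makes a typo (e.g. a misspelled key silently falling back to its default) much easier to spot.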
Related MCP Servers
PPTAgent
An Agentic Framework for Reflective PowerPoint Generation
mcp-playground
A Streamlit-based chat app for LLMs with plug-and-play tool support via Model Context Protocol (MCP), powered by LangChain, LangGraph, and Docker.
gradio-hackathon
Our participation in the 2025 Gradio Agent MCP Hackathon
MIST
MCP server empowering AI assistants with real-world capabilities: Gmail, Calendar, Tasks, Git integration, and note management. Bridges AI assistants to external services through standardized protocol with secure authentication.
repo-stargazer
Talk to your starred github repositories
pearl_mcp_server
A Model Context Protocol (MCP) server implementation that exposes Pearl's AI and Expert services through a standardized interface