flexible-graphrag
Flexible GraphRAG: Python, LlamaIndex, Docker Compose: 8 Graph dbs, 10 Vector dbs, OpenSearch, Elasticsearch, Alfresco. 13 data sources (9 auto-sync), KG auto-building, schemas, LLMs, Docling or LlamaParse doc processing, GraphRAG, RAG only, Hybrid search, AI chat. React, Vue, Angular frontends, FastAPI backend, REST API, MCP Server. Please 🌟 Star
claude mcp add --transport stdio stevereiner-flexible-graphrag uvx flexible-graphrag-mcp \
  --env LOG_LEVEL="INFO (default) or DEBUG" \
  --env DATABASE_URL="Connection string for backend databases (if required)" \
  --env FGR_MCP_API_BASE_URL="URL base for MCP REST API (if applicable)"
How to use
Flexible GraphRAG includes an MCP server that exposes tools for working with documents, data sources, and AI-powered queries from Claude Desktop and other MCP clients. The MCP server provides full API parity for its available actions: ingesting documents from all 13 data sources, performing hybrid searches across vector, full-text, and graph databases, and running AI queries and chats. It supports both HTTP mode for debugging and stdio mode for production, so it can be integrated flexibly into existing pipelines.
The MCP client tools cover ingest_documents(), ingest_text(), search_documents(), query_documents(), system diagnostics, and health checks, along with specialized tools for processing and management. You can run the MCP server directly via uvx (no install) or install it via pipx for a persistent environment. The server works in tandem with the FastAPI backend and the UI clients (Angular, React, Vue) that consume the REST APIs exposed by the FastAPI layer.
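To make the tool interface above concrete, here is a minimal sketch of the JSON-RPC 2.0 message an MCP client sends to invoke a tool such as ingest_documents(). The tools/call method shape follows the MCP specification; the argument names (data_source, paths) are assumptions for illustration and may differ from the server's actual schema.

```python
import json

def build_tool_call(tool_name, arguments, request_id=1):
    """Build a JSON-RPC 2.0 message for an MCP tools/call request."""
    message = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }
    return json.dumps(message)

# Hypothetical arguments -- the real ingest_documents() schema may differ.
payload = build_tool_call("ingest_documents", {
    "data_source": "filesystem",
    "paths": ["/data/reports"],
})
print(payload)
```

In stdio mode this message would be written to the server's stdin; in HTTP mode it would be POSTed to the configured endpoint. Claude Desktop and other MCP clients construct these messages for you.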
How to install
Prerequisites:
- Python 3.8+ (for MCP server components if using Python-based execution)
- uvx (no-install execution) or pipx (installation-based)
- Optional: Docker for modular deployment (docker-compose includes vector, graph, search databases, and Alfresco)
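The prerequisite list above can be checked with a small sketch; it only assumes that the uvx, pipx, and docker executables are discoverable on PATH when installed.

```python
import shutil
import sys

def check_prerequisites():
    """Return the prerequisites from the list above that appear to be missing."""
    missing = []
    if sys.version_info < (3, 8):
        missing.append("Python 3.8+")
    # Either uvx (no-install) or pipx (persistent install) is sufficient.
    if not (shutil.which("uvx") or shutil.which("pipx")):
        missing.append("uvx or pipx")
    # Docker is optional, so it is flagged informationally rather than required.
    if shutil.which("docker") is None:
        missing.append("docker (optional)")
    return missing

print(check_prerequisites())
```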
Installation steps (choose one execution path):
Option A: No-install (uvx)
- Ensure uvx is installed on your system.
- Run the MCP server in no-install mode: uvx flexible-graphrag-mcp
- Verify the server starts and exposes the MCP endpoints at the configured HTTP transport or stdio channel.
Option B: Persistent environment (pipx)
- Install the MCP package with pipx: pipx install flexible-graphrag-mcp
- Run the MCP server via pipx: pipx run flexible-graphrag-mcp
- Confirm the server is up and listening on the expected ports.
Option C: Docker deployment (recommended for modular databases and quick start)
- Navigate to the docker/ directory and start the stack with docker-compose:
  cd docker
  docker-compose up -d
- The MCP server will be available through the configured API endpoints while databases (vector, graph, search) are brought up via the compose file.
Notes:
- The MCP server supports both HTTP mode for debugging and stdio mode for production; choose the invocation mode that fits your workflow.
- If you are using Docker, you can enable or disable components (vector, graph, and search engines, Alfresco) by commenting out or uncommenting the relevant lines in the docker-compose file.
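As a hedged illustration of that commenting pattern, a compose file might look like the fragment below; the service names and images here are assumptions and will differ from the actual docker-compose file shipped in the docker/ directory.

```yaml
services:
  neo4j:                 # graph database (enabled)
    image: neo4j:5
  # qdrant:              # vector database (disabled by commenting it out)
  #   image: qdrant/qdrant
  opensearch:            # search engine (enabled)
    image: opensearchproject/opensearch:2
```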
Additional notes
Tips and common issues:
- Environment variables: Set FGR_MCP_API_BASE_URL to your MCP REST API base URL if using HTTP transport; adjust LOG_LEVEL for troubleshooting; provide DATABASE_URL if your backend requires a separate connection string.
- Data sources: The MCP client tools support all 13 data sources. Use ingest_documents() with source-specific configs (filesystem, Alfresco, SharePoint, Box, CMIS, cloud storage, web, etc.). The skip_graph flag can disable GraphRAG processing for a given ingestion.
- Ingestion parameters: You can pass paths for filesystem/Alfresco/CMIS and, for Alfresco, nodeDetails for multi-select KG Spaces.
- Diagnostics: Use system diagnostics and health checks frequently when deploying to new environments or when you add new data sources or databases.
- Deployment mode: If you’re aiming for rapid experimentation, start with the HTTP/stdio dual transport in development, then move to a full Docker deployment for production stability.
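The environment variables and transport choice described in the tips above can be read with sensible fallbacks, as in this sketch. LOG_LEVEL, DATABASE_URL, and FGR_MCP_API_BASE_URL come from the install command; FGR_MCP_TRANSPORT is a hypothetical variable name used purely for illustration.

```python
import os

def load_mcp_settings(env=None):
    """Read MCP server settings with defaults suggested by the docs above."""
    env = os.environ if env is None else env
    transport = env.get("FGR_MCP_TRANSPORT", "stdio").lower()
    if transport not in ("stdio", "http"):
        raise ValueError(f"unknown transport: {transport}")
    return {
        "transport": transport,                           # stdio: production, http: debugging
        "log_level": env.get("LOG_LEVEL", "INFO"),        # INFO is the documented default
        "database_url": env.get("DATABASE_URL"),          # None unless the backend needs one
        "api_base_url": env.get("FGR_MCP_API_BASE_URL"),  # only meaningful in HTTP mode
    }

print(load_mcp_settings({}))  # empty env -> all defaults
```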
Related MCP Servers
ai-guide
Programmer Yupi's comprehensive AI resource collection plus a Vibe Coding tutorial for beginners: a guide to choosing large models (DeepSeek / GPT / Gemini / Claude), the latest AI news, a prompt collection, an AI knowledge encyclopedia (RAG / MCP / A2A), AI programming tutorials, AI tool usage (Cursor / Claude Code / OpenClaw / TRAE / Lovable / Agent Skills), AI development framework tutorials (Spring AI / LangChain), and a guide to monetizing AI products, helping you quickly master AI technology and stay at the cutting edge. This project is the open-source documentation version and has been upgraded into Yupi's AI navigation website.
koog
Koog is the official Kotlin framework for building predictable, fault-tolerant and enterprise-ready AI agents across all platforms – from backend services to Android and iOS, JVM, and even in-browser environments. Koog is based on our AI products expertise and provides proven solutions for complex LLM and AI problems
haiku.rag
Opinionated agentic RAG powered by LanceDB, Pydantic AI, and Docling
MCP2Lambda
Run any AWS Lambda function as a Large Language Model (LLM) tool without code changes using Anthropic's Model Context Protocol (MCP).
oura
Oura Ring Model Context Protocol (MCP) server.
mcp-tools
Tools for MCP (Model Context Protocol) written in Go