c4-genai-suite
c4 GenAI Suite
claude mcp add --transport stdio codecentric-c4-genai-suite -- docker run -i \
  --env MCP_PORT="8080" \
  --env REIS_HOST="reis" \
  --env REIS_PORT="8000" \
  --env MCP_BASE_URL="http://localhost:8080" \
  --env MCP_LOG_LEVEL="info" \
  codecentric/c4-genai-suite:latest
How to use
To use the MCP server, first ensure it is running in your environment (for example via Docker, as shown in the installation steps). Once it is running, you can configure assistants to use it as an extension: in the admin area of the c4 GenAI Suite, choose an assistant, add an extension, and select the MCP server as the tool provider. You can then define which MCP endpoints and capabilities are available to the assistant, such as a custom system prompt, retrieval-augmented generation (RAG) features, or additional MCP-based extensions.

The integration is modular: you can enable or disable individual extensions, swap model providers (OpenAI, Azure OpenAI, Bedrock, Ollama, etc.), and configure credentials per model. When you start a chat, the assistant queries the MCP server to discover the available capabilities and routes prompts and tool calls through the appropriate MCP endpoints. If you use the REI-S service for RAG, make sure its endpoint is reachable (as configured in the environment) so document and file content can be indexed and retrieved for context.
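As a rough sketch of the capability-discovery step described above: MCP clients exchange JSON-RPC 2.0 messages with the server, typically an `initialize` handshake followed by `tools/list`. The protocol version and client name below are illustrative assumptions, not values taken from the c4 GenAI Suite itself.

```python
import json

def jsonrpc(method, params=None, msg_id=1):
    """Build a JSON-RPC 2.0 message as used on the MCP wire protocol."""
    msg = {"jsonrpc": "2.0", "id": msg_id, "method": method}
    if params is not None:
        msg["params"] = params
    return json.dumps(msg)

# Capability discovery: the client first initializes the session,
# then asks the server which tools it exposes.
init = jsonrpc("initialize", {
    "protocolVersion": "2024-11-05",  # assumed protocol revision
    "clientInfo": {"name": "c4-assistant", "version": "0.1"},
    "capabilities": {},
}, msg_id=1)
list_tools = jsonrpc("tools/list", msg_id=2)

print(init)
print(list_tools)
```

In a real deployment these messages are sent over the transport configured for the server (stdio in the installation command above); the assistant then routes tool calls to whichever tools the `tools/list` response reports.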
How to install
Prerequisites:
- Docker and Docker Compose installed on your machine or in your deployment environment
- Basic familiarity with running containerized services
- Access to a supported MCP extension repository or image (for this server, we assume the image codecentric/c4-genai-suite:latest)
Installation steps:
- Clone the repository (if you are deploying from source):
  git clone https://github.com/codecentric/c4-genai-suite
  cd c4-genai-suite
- Pull and run the MCP server image via Docker:
  docker pull codecentric/c4-genai-suite:latest
  docker run -d --name c4-genai-mcp \
    -e MCP_PORT=8080 \
    -e MCP_LOG_LEVEL=info \
    -e MCP_BASE_URL=http://localhost:8080 \
    codecentric/c4-genai-suite:latest
- Verify the service is running by checking the logs or calling the health endpoint:
  docker logs -f c4-genai-mcp
  curl http://localhost:8080/health
- Optional: if orchestrating with Docker Compose, create a docker-compose.yml that defines the MCP server service and any dependencies (frontend/backend/REI-S), then start all services:
  docker-compose up -d
- Configure the MCP server as an extension in the c4 GenAI Suite admin UI:
  - Open the admin area
  - Select the desired assistant
  - Add Extension -> choose MCP server -> configure endpoints and capabilities
- Ensure networking and environment variables align with your deployment (MCP_BASE_URL, REIS_HOST/REIS_PORT, database connections for the backend, etc.).
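The verification step above can be automated in a deployment script. Below is a minimal readiness poller that retries a /health endpoint until the service responds with HTTP 200; the attempt count and delay are arbitrary choices, and for demonstration it polls a local stub server rather than a real container.

```python
import threading
import time
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

def wait_for_health(url, attempts=10, delay=0.2):
    """Poll a health endpoint until it returns HTTP 200 or attempts run out."""
    for _ in range(attempts):
        try:
            with urllib.request.urlopen(url, timeout=2) as resp:
                if resp.status == 200:
                    return True
        except (urllib.error.URLError, OSError):
            pass
        time.sleep(delay)
    return False

# Demonstration against a local stub instead of the real container:
class _Health(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200 if self.path == "/health" else 404)
        self.end_headers()
    def log_message(self, *args):  # silence request logging
        pass

server = HTTPServer(("127.0.0.1", 0), _Health)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

ok = wait_for_health(f"http://127.0.0.1:{port}/health")
print("healthy:", ok)
server.shutdown()
```

Against a real deployment you would point `wait_for_health` at the MCP_BASE_URL configured above (e.g. http://localhost:8080/health) instead of the stub.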
Additional notes
Environment variables and configuration options vary by deployment. Common variables include MCP_PORT, MCP_BASE_URL, and credentials for LLM providers. When running locally with Docker, the default port is 8080; adjust your frontend/backend configuration if you map this port elsewhere. For debugging, set MCP_LOG_LEVEL=debug to get more verbose logs.

If you encounter connectivity issues with REI-S, verify that it is reachable at the configured host/port and that the vector store (pgvector or Azure AI Search) endpoints are accessible. If you plan to scale, consider separating the MCP server from the REI-S service, and enforce proper authentication and network policies between components.
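The configuration described above can be centralized in a small helper. This is a sketch assuming only the variable names used in this README (MCP_PORT, MCP_BASE_URL, MCP_LOG_LEVEL, REIS_HOST, REIS_PORT) and the defaults shown in the installation steps; any other settings your deployment needs would be added the same way.

```python
import os
from dataclasses import dataclass

@dataclass
class McpConfig:
    """Deployment settings, with the defaults used in the installation steps."""
    port: int
    base_url: str
    log_level: str
    reis_host: str
    reis_port: int

def load_config(env=None):
    env = os.environ if env is None else env
    return McpConfig(
        port=int(env.get("MCP_PORT", "8080")),
        base_url=env.get("MCP_BASE_URL", "http://localhost:8080"),
        log_level=env.get("MCP_LOG_LEVEL", "info"),
        reis_host=env.get("REIS_HOST", "reis"),
        reis_port=int(env.get("REIS_PORT", "8000")),
    )

cfg = load_config({})          # empty env -> all defaults
print(cfg.base_url, cfg.port)  # http://localhost:8080 8080
```

Reading every variable in one place makes it easy to fail fast on a misconfigured deployment instead of discovering a bad REIS_HOST only when the first RAG query returns nothing.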
Known issues:
- Some MCP extensions may require additional permissions or credentials for certain LLM providers.
- Ensure the REI-S service is up before enabling RAG-backed extensions to avoid empty search results.
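To avoid the empty-search-results issue above, a startup script can block until REI-S accepts TCP connections before enabling RAG-backed extensions. A minimal sketch follows; the host/port would come from the REIS_HOST/REIS_PORT variables used in this README, and the demonstration connects to a local listening socket rather than a real REI-S instance.

```python
import socket
import time

def wait_for_port(host, port, attempts=20, delay=0.25):
    """Return True once a TCP connection to host:port succeeds."""
    for _ in range(attempts):
        try:
            with socket.create_connection((host, port), timeout=1):
                return True
        except OSError:
            time.sleep(delay)
    return False

# Demonstration against a local listening socket instead of a real REI-S:
listener = socket.socket()
listener.bind(("127.0.0.1", 0))
listener.listen(1)
port = listener.getsockname()[1]

print("REI-S ready:", wait_for_port("127.0.0.1", port))
listener.close()
```

A plain TCP check only proves the port is open; if REI-S exposes a health endpoint in your deployment, polling that instead gives a stronger readiness signal.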
Related MCP Servers
repomix
📦 Repomix is a powerful tool that packs your entire repository into a single, AI-friendly file. Perfect for when you need to feed your codebase to Large Language Models (LLMs) or other AI tools like Claude, ChatGPT, DeepSeek, Perplexity, Gemini, Gemma, Llama, Grok, and more.
AstrBot
Agentic IM chatbot infrastructure that integrates many IM platforms, LLMs, plugins, and AI features, and can be your openclaw alternative. ✨
ai-guide
Programmer Yupi's comprehensive AI resource collection + zero-to-hero Vibe Coding tutorial, covering a guide to choosing large models (DeepSeek / GPT / Gemini / Claude), the latest AI news, a prompt collection, an AI knowledge encyclopedia (RAG / MCP / A2A), AI programming tutorials, AI tool usage (Cursor / Claude Code / OpenClaw / TRAE / Lovable / Agent Skills), AI development framework tutorials (Spring AI / LangChain), and a guide to monetizing AI products, to help you quickly master AI technology and stay at the forefront. This project is the open-source documentation edition and has been upgraded into Yupi's AI navigation website.
archestra
Secure cloud-native MCP registry, gateway & orchestrator
papersgpt-for-zotero
A powerful Zotero AI and MCP plugin with ChatGPT, Gemini 3.1, Claude, Grok, DeepSeek, OpenRouter, Kimi 2.5, GLM 5, SiliconFlow, GPT-oss, Gemma 3, Qwen 3.5
py-gpt
Desktop AI Assistant powered by GPT-5, GPT-4, o1, o3, Gemini, Claude, Ollama, DeepSeek, Perplexity, Grok, Bielik; chat, vision, voice, RAG, image and video generation, agents, tools, MCP, plugins, speech synthesis and recognition, web search, memory, presets, assistants, and more. Linux, Windows, Mac