Connapse
Open-source AI-powered knowledge management platform for AI agents. Transform documents into searchable knowledge with hybrid vector + keyword search. Built with .NET 10 Blazor.
claude mcp add --transport stdio destrayon-connapse docker run -i destrayon/connapse-mcp:latest
How to use
Connapse exposes an MCP server for Claude Desktop integration, enabling you to drive knowledge management actions via a conversational AI interface. The MCP server exposes seven tools to manage containers (projects) and documents: container_create, container_list, container_delete, upload_file, list_files, delete_file, and search_knowledge. After configuring Claude Desktop to point to your Connapse instance and providing the agent's API key, you can invoke these tools through the MCP protocol to create projects, upload files, search across your documents, and retrieve structured results. This makes it easy to plug Connapse’s knowledge base into AI-assisted workflows without leaving your AI environment.
To use the MCP server effectively, set up your Connapse instance (via Docker Compose as shown in the Quick Start) and obtain an agent API key from the admin UI. Then configure Claude Desktop (or any MCP consumer) to direct requests to the MCP server endpoint and pass the agent API key in the X-Api-Key header. The tools will handle actions such as creating a container, uploading files to a container, listing and deleting files, and performing searches against your indexed documents. The server is designed to work with the same authentication and access control model as the main Connapse app, ensuring your agent access respects roles and permissions.
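To illustrate the wire format, here is a minimal sketch of what a `search_knowledge` tool call looks like over MCP's JSON-RPC framing. The argument names (`query`, `container`) are assumptions for illustration only; the real parameter schema is whatever the server reports via `tools/list`.

```python
import json

# Hypothetical MCP JSON-RPC request invoking the search_knowledge tool.
# The argument names ("query", "container") are assumptions; consult the
# schema returned by the server's tools/list response for the real ones.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search_knowledge",
        "arguments": {
            "query": "database migration steps",
            "container": "my-project",
        },
    },
}

# Over the stdio transport, each message is sent as a single line of JSON.
wire = json.dumps(request)
print(wire)
```

Claude Desktop builds and sends these messages for you; the sketch is only meant to show what the MCP consumer is doing under the hood when you ask it to search your documents.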
How to install
Prerequisites:
- Docker and Docker Compose installed on your machine
- Optional: .NET 10 SDK if you want to run the web app locally or contribute to the code
Install and run (docker-based quick start):
- Clone the repository and navigate to it (as per the project docs).
- Set required environment variables for admin access (examples shown; replace with real values):
  export CONNAPSE_ADMIN_EMAIL=admin@example.com
  export CONNAPSE_ADMIN_PASSWORD=YourSecurePassword123!
  export Identity__Jwt__Secret=$(openssl rand -base64 64)
- Start the services with Docker Compose (the project's Quick Start includes the full compose setup):
  docker-compose up -d
- Open the web UI at http://localhost:5001 and log in with the admin credentials you set.
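The JWT secret above is generated with openssl. If openssl is unavailable on your machine, an equivalent 64-byte base64 secret can be produced with Python's standard library:

```python
import base64
import os

# Generate 64 cryptographically random bytes and base64-encode them,
# mirroring `openssl rand -base64 64` for the Identity__Jwt__Secret value.
secret = base64.b64encode(os.urandom(64)).decode("ascii")
print(secret)
```

Either approach works; the only requirement is a long, unpredictable value, since this secret signs the identity JWTs for your instance.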
Running the MCP server for Claude Desktop (Docker):
- The MCP server is dockerized and can be launched alongside your Connapse stack. Run the MCP server container (as demonstrated in the mcp_config):
  docker run -i destrayon/connapse-mcp:latest
If you prefer building from source or running locally, follow the project’s standard development workflow for .NET apps (dotnet run in the appropriate project), and ensure the MCP server is accessible at its configured endpoint for Claude Desktop integration.
Additional notes
Tips and common considerations:
- Ensure your environment variables for admin access and security (Identity__Jwt__Secret) are properly set in production.
- The MCP server requires an agent API key from Connapse; store this securely and pass it via X-Api-Key when configuring Claude Desktop.
- If you run into connectivity issues, verify that the MCP server endpoint is reachable from Claude Desktop and that firewalls allow the necessary traffic.
- For production deployments, use a persistent volume for PostgreSQL/MinIO in the main Connapse stack and ensure proper backup strategies.
- The MCP tools mirror the core knowledge management capabilities: you can automate container lifecycle and file indexing through conversational prompts in Claude Desktop.
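If you script against the server directly rather than going through Claude Desktop, the agent API key travels in the X-Api-Key header, as noted above. A small sketch of building (not sending) such an authenticated request, where the endpoint URL is an assumption and must be replaced with your instance's real MCP endpoint:

```python
import urllib.request

# Hypothetical endpoint; substitute your Connapse MCP server's actual URL.
# The X-Api-Key value is the agent API key obtained from the admin UI.
req = urllib.request.Request(
    "http://localhost:5001/mcp",
    headers={"X-Api-Key": "your-agent-api-key"},
)

# The Request object carries the header; nothing is sent until urlopen().
# Note: urllib normalizes header names to capitalized form ("X-api-key").
print(req.get_header("X-api-key"))
```

Keep the key out of source control: load it from an environment variable or a secrets manager rather than hardcoding it as in this illustration.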