mcp_hub
Multi-user MCP gateway for Slack, Teamwork, and Telegram, enabling secure AI tool access through a single endpoint.
```shell
claude mcp add --transport stdio vangardo-mcp_hub -- docker run -i vangardo/mcp_hub
```
How to use
MCP Hub is a self-hosted AI operations gateway that provides a unified entrypoint for dozens of tools and integrations. It routes semantic queries to the most relevant tools instead of loading every schema into the AI context, enabling efficient, on-demand tool usage across 130+ tools in 12 integrations. Through the hub endpoints, you can search for tools, call them, and orchestrate multi-step flows using the built-in ReAct engine, memory, scheduler, and multi-user access. Connect clients like Claude Desktop, ChatGPT, Cursor, or Telegram to interact with the gateway and leverage semantic routing, persistent vector memory, and audit trails for all tool activity.
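As a concrete illustration of the search-then-call flow, the sketch below builds the request bodies a client might send to the hub. The endpoint names `hub.tools.search` and `hub.tools.call` come from this document, but the payload fields shown here (`query`, `limit`, `tool`, `arguments`) are assumptions for illustration, not a documented schema.

```python
import json

def build_search_request(query: str, limit: int = 5) -> str:
    """Build a JSON body for a semantic tool search (field names are illustrative)."""
    return json.dumps({"method": "hub.tools.search",
                       "params": {"query": query, "limit": limit}})

def build_call_request(tool: str, arguments: dict) -> str:
    """Build a JSON body invoking a tool returned by the search step."""
    return json.dumps({"method": "hub.tools.call",
                       "params": {"tool": tool, "arguments": arguments}})

# Typical two-step flow: search for relevant tools, then call one of the hits.
search_body = build_search_request("create a task in Teamwork")
call_body = build_call_request("teamwork.create_task", {"title": "Review PR"})
print(search_body)
```

The point of the two-step shape is token economy: only the handful of tools matched by the search are ever surfaced to the model, rather than all 130+ schemas.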
How to install
Prerequisites:
- Docker installed and running on your host
- Basic familiarity with containerized applications
- Internet access to pull the MCP Hub image
Installation steps:
- Pull and run the MCP Hub Docker container

```shell
# Pull and run the MCP Hub image (example image name)
docker pull vangardo/mcp_hub:latest
docker run -d --name mcp_hub -p 8000:8000 vangardo/mcp_hub:latest
```
- Verify the container is healthy

```shell
docker ps
```
- Configure environment variables (optional). If you need to set API keys, memory backends, or integration credentials, supply them via environment variables at runtime. Example placeholders:

```shell
docker run -d --name mcp_hub -p 8000:8000 \
  -e MEMORY_STORAGE=sqlite \
  -e MEMORY_DB_PATH=/data/memory.db \
  -e HUB_API_KEY=your_api_key \
  vangardo/mcp_hub:latest
```
- Optional: Persist data to a host directory

```shell
docker run -d --name mcp_hub -p 8000:8000 \
  -v /path/to/data:/data \
  vangardo/mcp_hub:latest
```
Notes:
- The MCP Hub image is designed to run in Docker as a single container with SQLite by default, requiring no external services for basic setups.
- If you need to scale or customize, refer to the repository for additional environment variables and configuration options.
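Beyond `docker ps`, a quick HTTP probe confirms the hub actually answers on the mapped port. The sketch below assumes the hub listens on localhost:8000 as in the `docker run` example above; the probe path `/` is a placeholder, since this document does not specify a health route.

```python
from urllib.request import urlopen
from urllib.error import HTTPError

def hub_url(host: str, port: int, path: str = "/") -> str:
    """Compose a URL for the hub from host, port, and path."""
    return f"http://{host}:{port}{path}"

def probe(url: str, timeout: float = 3.0) -> bool:
    """True if anything answers at the URL, False if the host is unreachable."""
    try:
        with urlopen(url, timeout=timeout):
            return True
    except HTTPError:
        return True   # the server answered, even if with an error status
    except OSError:
        return False  # connection refused, timeout, DNS failure, etc.

if __name__ == "__main__":
    url = hub_url("localhost", 8000)
    print(f"{url} reachable: {probe(url)}")
```

Treating an HTTP error status as "reachable" is deliberate here: a 404 from an unknown path still proves the container is up and listening.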
Additional notes
Tips and considerations:
- Follow the authentication and credential guidance for each integrated provider (OAuth2, API keys, MTProto, etc.).
- Use the semantic routing features to minimize token usage by presenting only the relevant tools to the AI per task.
- Enable the automation scheduler to run periodic llm_tool_agent payloads or mcp_tool calls.
- Memory items have defined lifetimes; adjust TTLs if you need longer-term persistence across sessions.
- Check the audit logs for tool calls to understand AI actions and for debugging complex workflows.
- If you encounter routing issues, ensure the hub.tools.search and hub.tools.call endpoints are reachable and that embeddings are up-to-date.
- When upgrading, preserve memory stores and tool configurations to maintain continuity in long-running automations.
Related MCP Servers
- flyto-core: The open-source execution engine for AI agents. 412 modules, MCP-native, triggers, queue, versioning, metering.
- mikrotik: MCP server for Mikrotik.
- bitbucket: A Model Context Protocol (MCP) server for integrating with Bitbucket Cloud and Server APIs.
- oxylabs: Official Oxylabs MCP integration.
- pubmed: A Model Context Protocol (MCP) server enabling AI agents to search, retrieve, and analyze biomedical literature from PubMed via NCBI E-utilities. Includes a research agent scaffold. STDIO & HTTP.
- mcp-batchit: 🚀 MCP aggregator for batching multiple tool calls into a single request. Reduces overhead, saves tokens, and simplifies complex operations in AI agent workflows.