Magg: The MCP Aggregator
claude mcp add --transport stdio sitbon-magg python -m magg serve \
  --env MAGG_CONFIG_DIR="<custom Magg config directory (default: ~/.magg)>" \
  --env MAGG_PRIVATE_KEY="<optional RSA private key for HTTP authentication>"
How to use
Magg is a meta MCP server that acts as a central hub for discovering, configuring, and proxying multiple MCP servers. It provides a unified interface to search for new MCP tools, install and configure them, and expose their capabilities under consolidated prefixes. With Magg, you can dynamically add or disable MCP servers, automatically aggregate their tools, and persist configuration across sessions. It also includes built-in tools such as magg_status and magg_check to monitor health and readiness, making it easier to manage a growing toolkit of LLM capabilities.
To use Magg, install it as a Python tool and run the CLI. By default, Magg runs in stdio mode (for integration with desktop LLM clients); you can also serve over HTTP to enable system-wide access. Once running, Magg scans the configured MCP servers, proxies their tools, and exposes them under a unified namespace. The CLI supports multiple transports and real-time MCP notifications, so you can add new servers and keep configurations in sync without restarting the service.
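The aggregation idea described above, exposing each backend server's tools under a consolidated prefix, can be sketched in a few lines. This is purely illustrative (it is not Magg's actual implementation, and the server and tool names are made up):

```python
def aggregate(servers: dict[str, list[str]], sep: str = "_") -> dict[str, tuple[str, str]]:
    """Build a routing table mapping prefixed tool names back to
    (server prefix, original tool name), as an aggregator might do.

    servers: mapping of server prefix -> list of tool names it exposes.
    """
    routing: dict[str, tuple[str, str]] = {}
    for prefix, tools in servers.items():
        for tool in tools:
            # Prefixing avoids name collisions between servers.
            routing[f"{prefix}{sep}{tool}"] = (prefix, tool)
    return routing

# Hypothetical backends: a calculator server and a web-search server.
table = aggregate({"calc": ["add", "mul"], "search": ["query"]})
print(table["calc_add"])  # ('calc', 'add')
print(sorted(table))      # ['calc_add', 'calc_mul', 'search_query']
```

When a client calls a prefixed tool, the aggregator looks up the origin server in this table and proxies the call there, which is what lets clients without dynamic-update support see one stable namespace.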
How to install
Prerequisites:
- Python 3.12 or higher (3.13+ recommended)
- uv (optional but recommended) for installing and running Magg as a tool
Install Magg as a Python package and run the CLI:
# Prerequisite: ensure Python 3.12+ is installed
python3 --version
# Option 1: Install Magg as a tool via uv (recommended way per docs)
# Install uv if you don't have it yet (on macOS/Linux):
# curl -LsSf https://astral.sh/uv/install.sh | sh
uv tool install magg
# Run Magg in stdio mode (default)
magg serve
# Run Magg with HTTP transport (exposes an HTTP API)
magg serve --http
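For MCP clients configured through a JSON file (for example, Claude Desktop's mcpServers section), a stdio entry for Magg might look like the following sketch; the config-directory path is only an example:

```json
{
  "mcpServers": {
    "magg": {
      "command": "python",
      "args": ["-m", "magg", "serve"],
      "env": {
        "MAGG_CONFIG_DIR": "/path/to/.magg"
      }
    }
  }
}
```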
Alternative: Run Magg directly from GitHub without installation:
# Run with stdio transport (default)
uvx --from git+https://github.com/sitbon/magg.git magg serve
# Run with HTTP transport
uvx --from git+https://github.com/sitbon/magg.git magg serve --http
Local development (clone and install in editable mode):
git clone https://github.com/sitbon/magg.git
cd magg
# Install in development mode with dev dependencies
uv sync --dev
# or with poetry
poetry install --with dev
# Run the CLI to verify (from within the project environment)
uv run magg --help
# or: poetry run magg --help
Docker (optional):
# Run production image (default port 8000)
docker run -p 8000:8000 ghcr.io/sitbon/magg:latest
Additional notes
Tips and common considerations:
- Magg stores configuration in .magg/config.json by default; you can customize the location via MAGG_CONFIG_DIR (see environment variables).
- If you enable the HTTP transport, consider setting MAGG_PRIVATE_KEY to enable RSA-based JWT authentication and secure HTTP access.
- Use the built-in magg_status and magg_check tools to monitor health and performance.
- Docker users can mount a custom config directory into /home/magg/.magg or use the MAGG_CONFIG_DIR environment variable to point Magg to a specific directory.
- When aggregating tools from multiple MCP servers, Magg proxies them under unified prefixes to simplify access for clients that don’t support dynamic updates.
- Ensure Python 3.12+ is used to take full advantage of Magg’s features and compatibility.
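The Docker tips above (a custom config directory mounted into /home/magg/.magg) amount to adding a bind mount to the docker run command from the install section. The small helper below, which is illustrative and not part of Magg, assembles that command line:

```python
import os

def magg_docker_cmd(config_dir: str, port: int = 8000) -> list[str]:
    """Build a `docker run` argv that publishes the HTTP port and
    mounts a host config dir into the container's default location."""
    return [
        "docker", "run",
        "-p", f"{port}:{port}",
        # Bind-mount the host config dir over /home/magg/.magg
        "-v", f"{os.path.abspath(config_dir)}:/home/magg/.magg",
        "ghcr.io/sitbon/magg:latest",
    ]

print(" ".join(magg_docker_cmd("/tmp/magg-config")))
```

Because the configuration lives on the host, servers added or disabled through Magg's tools persist across container restarts.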
Related MCP Servers
TrendRadar
⭐ AI-driven public opinion & trend monitor with multi-platform aggregation, RSS, and smart alerts. 🎯 Say goodbye to information overload: an AI public-opinion monitoring assistant and trending-topic filter. Aggregates trending topics across platforms plus RSS feeds, with precise keyword filtering. AI translation and AI analysis briefings are pushed straight to your phone, and MCP integration enables natural-language conversational analysis, sentiment insight, and trend prediction. Supports Docker, with data self-hosted locally or in the cloud. Smart push notifications via WeChat, Feishu, DingTalk, Telegram, email, ntfy, bark, Slack, and more.
AstrBot
Agentic IM chatbot infrastructure that integrates many IM platforms, LLMs, plugins, and AI features, and can serve as your openclaw alternative. ✨
MCP-PostgreSQL-Ops
🔍Professional MCP server for PostgreSQL operations & monitoring: 30+ extension-independent tools for performance analysis, table bloat detection, autovacuum monitoring, schema introspection, and database management. Supports PostgreSQL 12-17.
rohlik
MCP server that lets you shop groceries across the Rohlik Group platforms (Rohlik.cz, Knuspr.de, Gurkerl.at, Kifli.hu, Sezamo.ro)
gtm
An MCP server for Google Tag Manager. Connect it to your LLM, authenticate once, and start managing GTM through natural language.
Python-Runtime-Interpreter
PRIMS is a lightweight, open-source Model Context Protocol (MCP) server that lets LLM agents safely execute arbitrary Python code in a secure, throw-away sandbox.