mcp-proxy
Aggregating MCP proxy with ~95% context reduction through progressive tool disclosure
claude mcp add --transport stdio iamsamuelrodda-mcp-proxy docker run -i iamsamuelrodda/mcp-proxy
How to use
mcp-proxy is an aggregating MCP proxy that exposes a minimal surface for Claude to interact with a large collection of MCP servers. Rather than surfacing every tool from every server, it provides two meta-tools: get_tools_in_category, which browses a hierarchical catalog of available tools, and execute_tool, which runs a selected tool by its path. The proxy preloads tools in the background to reduce latency, keeps the context window small, and gracefully disables servers that fail. To use it, start the proxy (via Docker in this configuration) and point Claude’s integration at its config. You can then navigate the tool tree with get_tools_in_category and execute specific tools through execute_tool, getting broad functionality at low token cost.
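Under the MCP protocol, both meta-tools are invoked as ordinary tools/call requests. A rough sketch of the two calls follows; the argument names (category, tool_path) and the example tool path are assumptions for illustration only, so check the repository docs for the actual schema:

```json
{"jsonrpc": "2.0", "id": 1, "method": "tools/call",
 "params": {"name": "get_tools_in_category",
            "arguments": {"category": "filesystem"}}}

{"jsonrpc": "2.0", "id": 2, "method": "tools/call",
 "params": {"name": "execute_tool",
            "arguments": {"tool_path": "filesystem/read_file",
                          "arguments": {"path": "/tmp/example.txt"}}}}
```

The first request returns the tools under one branch of the catalog; the second dispatches a concrete tool by its path, so Claude never needs every tool definition in context at once.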
How to install
Prerequisites:
- Docker installed and running
- Access to the MCP proxy image (iamsamuelrodda/mcp-proxy) on Docker Hub
Install steps:
- Install Docker if not already installed (follow platform-specific instructions from the Docker site).
- Pull and run the MCP proxy container using the provided configuration:
  docker run -d --name mcp-proxy -p 8080:8080 -v /path/to/config.json:/config.json iamsamuelrodda/mcp-proxy
- Prepare your MCP server configuration as described in the repository’s Quick Start: place your config.json where the container can read it and adjust the command/args as needed for your setup. The config should include mcpProxy settings and a list of mcpServers with sources and transport types.
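As a rough illustration of that shape (field names are inferred from the description above; consult the repository’s Quick Start for the authoritative schema):

```json
{
  "mcpProxy": {
    "name": "mcp-proxy",
    "preload": true
  },
  "mcpServers": {
    "filesystem": {
      "transport": "stdio",
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "${MCP_PROXY_DIR}/data"]
    }
  }
}
```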
- If you are integrating with Claude, update ~/.claude.json to point at the running mcp-proxy instance, ensuring the proxy command and arguments reference the containerized endpoint or exposed port (e.g., http://localhost:8080).
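One possible shape for that entry, assuming the stdio Docker invocation from the quick-add command at the top of this page (verify the field names against your Claude Code version):

```json
{
  "mcpServers": {
    "mcp-proxy": {
      "type": "stdio",
      "command": "docker",
      "args": ["run", "-i", "iamsamuelrodda/mcp-proxy"]
    }
  }
}
```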
- Restart Claude Code or refresh its MCP connections to begin using the proxy.
Additional notes
- The proxy exposes two meta-tools only; most interactions should go through get_tools_in_category and execute_tool to minimize token usage.
- Ensure your config.json uses portable variable expansion (e.g., ${MCP_PROXY_DIR}) as described in the docs, so deployments are machine-independent.
- If a server becomes unavailable, the proxy will disable it gracefully and continue serving other tools.
- For security, consider enabling the secured setup options (OpenBao/Bitwarden) if you are deploying across multiple machines or in production.
- When updating configurations, re-run the bootstrap or redeploy the container to ensure the proxy loads the latest tool set and hierarchy.
Related MCP Servers
ollama
An MCP Server for Ollama
claude-code-open
Open source AI coding platform with Web IDE, multi-agent system, 37+ tools, MCP protocol. MIT licensed.
workflowy
Powerful CLI and MCP server for WorkFlowy: reports, search/replace, backup support, and AI integration (Claude, LLMs)
gtm
An MCP server for Google Tag Manager. Connect it to your LLM, authenticate once, and start managing GTM through natural language.
claude-vigil
🏺 An MCP server for checkpointing and file recovery in Claude Code
local-skills
Universal MCP server enabling any LLM or AI agent to utilize expert skills from your local filesystem. Reduces context consumption through lazy loading. Works with Claude, Cline, and any MCP-compatible client.