mcp-copilot
A meta MCP server that seamlessly scales LLMs to 1000+ MCP servers through automatic routing.
claude mcp add --transport stdio tshu-w-mcp-copilot uvx mcp-server-copilot --config ~/.config/mcp-server-copilot/config.json
How to use
The MCP Server Copilot is a meta coordinator that scales large language models across many MCP servers. It exposes three core tools: router-servers, which locates the MCP servers relevant to a user query; route-tools, which identifies pertinent tools across the discovered servers; and execute-tool, which runs a chosen tool on a specified server with the provided parameters. This lets you discover relevant capabilities, aggregate responses, and orchestrate tool usage without exposing every underlying server directly to the LLM. Copilot is implemented in Python and distributed as mcp-server-copilot, so it can be installed via pip or run with uvx for streamlined startup.
To use Copilot, install it via pip or run it with uvx. Once it is running, you interact with it through the three tools in sequence: router-servers to locate target servers for a user query, route-tools to select pertinent tools on the discovered servers, and execute-tool to invoke a chosen tool on a specified server with the provided parameters. For both server and tool discovery you can configure how many results to return (top_k), giving precise control over routing breadth. Because routing happens dynamically, you can compose workflows across multiple MCP servers with minimal manual wiring.
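The three-step flow above can be sketched in plain Python. This is only an illustration of the top_k selection behavior: the scoring function and the server/tool registries below are invented stand-ins, not Copilot's actual retrieval logic, which runs behind the MCP tools router-servers, route-tools, and execute-tool.

```python
def score(query: str, description: str) -> int:
    # Naive relevance score: number of words shared by query and description.
    return len(set(query.lower().split()) & set(description.lower().split()))

def route(query: str, candidates: dict[str, str], top_k: int = 5) -> list[str]:
    # Rank candidate names by score and keep the top_k best matches,
    # mirroring the top_k parameter Copilot accepts for servers and tools.
    ranked = sorted(candidates, key=lambda name: score(query, candidates[name]), reverse=True)
    return ranked[:top_k]

# Step 1: router-servers — pick servers relevant to the query.
servers = {
    "weather": "forecasts and current weather conditions",
    "github": "repositories, issues and pull requests",
}
best_servers = route("what is the weather in Paris", servers, top_k=1)

# Step 2: route-tools — pick tools across the discovered servers.
tools = {
    "get_forecast": "return the weather forecast for a city",
    "list_repos": "list repositories for a user",
}
best_tools = route("what is the weather in Paris", tools, top_k=1)

# Step 3: execute-tool — call the selected tool with parameters.
print(best_servers, best_tools)  # ['weather'] ['get_forecast']
```

Raising top_k widens each step's search (more servers or tools considered); lowering it narrows the routing, as described under the configuration notes.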
How to install
Prerequisites:
- Python 3.10–3.12
- pip (or use uv/uvx as an alternative runtime)
Installation options:
- Install via pip (recommended for Python users):
pip install mcp-server-copilot
- Run directly as a module after installation:
python -m mcp_server_copilot
- If you prefer uv, run the package with uvx instead of a separate Python invocation:
uvx mcp-server-copilot --config ~/.config/mcp-server-copilot/config.json
Configure the server (see next section for config details). You can copy the sample config to ~/.config/mcp-server-copilot/config.json and populate the mcpServers section as shown in the README.
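A minimal config.json might look like the sketch below. The file follows the standard MCP client format for its mcpServers section; the single entry shown here (the reference "everything" server launched via npx) is just a placeholder example, and you should list the servers you actually want Copilot to route across.

```json
{
  "mcpServers": {
    "everything": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-everything"]
    }
  }
}
```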
Additional notes
Tips and considerations:
- The configuration uses an MCP client format that directs uvx to run mcp-server-copilot with a config file path. Ensure ~/.config/mcp-server-copilot/config.json exists and is readable.
- Default routing behavior includes top_k limits (default 5) for both server and tool discovery. Adjust these values in the client config if you want broader or narrower searches.
- Tools exposed by Copilot include the core components: router-servers (to locate MCP servers), route-tools (to locate tools across servers), and execute-tool (to run a specific tool on a chosen server). Provide appropriate parameters in the execute-tool step to influence tool execution.
- If you encounter issues with config loading, verify the path to the config.json is correct and that the JSON is valid. Logs typically indicate if the UV runtime or Python module cannot be found or if configuration keys are missing.
- The project is Python-based; there is no npm package requirement. If you’re integrating client-side MCP configuration, you can adjust the mcpServers entry in your MCP client settings to point to Copilot via uvx as shown in the configuration examples.
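When debugging config-loading issues, a quick sanity check of the JSON before launching the server can save a round-trip through the logs. The sketch below is a standalone helper, not part of mcp-server-copilot; the checks it performs (valid JSON, a non-empty mcpServers section, a command per server) are assumptions based on the client config format described above.

```python
import json
from pathlib import Path

# Default location referenced in the configuration notes above.
CONFIG_PATH = Path.home() / ".config" / "mcp-server-copilot" / "config.json"

def validate_config(text: str) -> list[str]:
    """Return a list of problems found in a Copilot config document."""
    try:
        data = json.loads(text)
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc}"]
    problems = []
    servers = data.get("mcpServers")
    if not isinstance(servers, dict) or not servers:
        problems.append("missing or empty 'mcpServers' section")
    else:
        for name, entry in servers.items():
            if not isinstance(entry, dict) or "command" not in entry:
                problems.append(f"server '{name}' has no 'command'")
    return problems

print(validate_config('{"mcpServers": {"demo": {"command": "npx"}}}'))  # []
```

Run it against the file with `validate_config(CONFIG_PATH.read_text())`; an empty list means the basic structure looks sound.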
Related MCP Servers
aws-cost-explorer
MCP server for understanding AWS spend
mcp-lite-dev
Companion project code for the group-study course "MCP 极简开发" (Minimalist MCP Development)
LLaMa-Streamlit
AI assistant built with Streamlit, NVIDIA NIM (LLaMa 3.3:70B) / Ollama, and Model Control Protocol (MCP).
mcp-playground
A Streamlit-based chat app for LLMs with plug-and-play tool support via Model Context Protocol (MCP), powered by LangChain, LangGraph, and Docker.
mcp-aoai-web-browsing
A minimal Model Context Protocol 🖥️ server/client 🧑‍💻 with Azure OpenAI and 🌐 web browser control via Playwright.
mcp-manager
CLI tool for managing Model Context Protocol (MCP) servers in one place and using them across different clients