mcp-research-router
Intelligent MCP Research Router - Tool selection, search strategy planning, and parallel execution
```shell
claude mcp add --transport stdio spiritherb-mcp-research-router npx mcp-research-router
```
How to use
MCP Research Router acts as an aggregator and intelligent router for multiple MCP servers. It lets you connect once and access all integrated MCP tools, while automatically recommending the most suitable tool based on your request. It also supports batch parallel execution, enabling faster cross-tool results when you need to combine outputs. You can connect via MCP-compatible clients by loading a small configuration that points at the router, and you can enable an LLM-assisted workflow to get tool recommendations tailored to your query.
Typical usage is via the MCP client configuration, which can be provided through environment variables or a JSON config. At minimum, you connect by spawning the router with npx mcp-research-router (or via your orchestrator) and pointing your client at the spawned process, or at the router's URL if you expose it over an HTTP transport. When LLM capabilities are configured, you can request the list of available tools, ask for intelligent recommendations, and then batch-execute multiple tools in parallel to speed up complex tasks.
How to install
Prerequisites:
- Node.js (recommended latest LTS) and npm/yarn installed
- Internet access to install packages
Install and run (one of the following):
- Global install via npm:

  ```shell
  npm install -g mcp-research-router
  # then run
  mcp-research-router
  ```

- Quick start using npx (no global install):

  ```shell
  npx mcp-research-router
  ```
Configure the MCP client to connect to the router (see mcp_config example below). If you need to enable LLM support or point at custom servers, set the appropriate environment variables as described in the configuration notes.
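A minimal mcp_config for an MCP-compatible client could look like the following sketch. The `mcpServers` layout follows the common convention used by clients such as Claude Desktop; the server key and env values shown are placeholders, not prescribed names:

```json
{
  "mcpServers": {
    "mcp-research-router": {
      "command": "npx",
      "args": ["mcp-research-router"],
      "env": {
        "MCP_LLM_ENABLED": "true",
        "MCP_LLM_API_KEY": "your-api-key"
      }
    }
  }
}
```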
Note: If you are integrating into a larger deployment, you can containerize the router or run it as a service behind a reverse proxy as needed.
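As one sketch of such a deployment, a minimal Dockerfile might look like this (the base image tag is an assumption; since the router speaks stdio by default, a containerized setup is typically paired with an orchestrator or an MCP-over-HTTP bridge rather than exposed directly):

```dockerfile
# Minimal container sketch; node:20-alpine is an assumed base image
FROM node:20-alpine
RUN npm install -g mcp-research-router
# Runs over stdio by default; supervise it with your orchestrator
# or front it with an MCP-over-HTTP bridge / reverse proxy as needed
ENTRYPOINT ["mcp-research-router"]
```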
Additional notes
Tips and notes:
- Recommended environment variables:
  - For enabling LLM support: MCP_LLM_ENABLED, MCP_LLM_BASE_URL, MCP_LLM_API_KEY, MCP_LLM_MODEL
  - For connecting to a server: MCP_SERVER_ENABLED, MCP_SERVER_URL, MCP_SERVER_NAME, MCP_SERVER_HEADERS (JSON string)
- MCP_SERVER_HEADERS must be a valid JSON string, e.g. {"Authorization": "Bearer token"}.
- When specifying MCP_SERVER_URL, provide the endpoint where the router exposes MCP tools for your group.
- To avoid infinite nesting, do not configure the router as a member of the same MCP group it aggregates (for example, in MCPHub); adding the router to its own group causes it to route requests to itself.
- You can still use the router without LLM features; you will have access to tool lists and tool execution, but tool recommendation will be disabled.
- For deployment, consider using a process manager and a reverse proxy to expose the router securely to your clients.
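For example, the variables listed above could be set like this before launching the router. All values are placeholders, and the JSON sanity check via python3 is purely illustrative:

```shell
# Enable LLM-assisted tool recommendation (placeholder values)
export MCP_LLM_ENABLED=true
export MCP_LLM_BASE_URL="https://api.example.com/v1"
export MCP_LLM_API_KEY="your-api-key"
export MCP_LLM_MODEL="your-model-name"

# Connect an upstream MCP server (placeholder values)
export MCP_SERVER_ENABLED=true
export MCP_SERVER_URL="https://example.com/mcp"
export MCP_SERVER_NAME="research-group"
# MCP_SERVER_HEADERS must be a single valid JSON string
export MCP_SERVER_HEADERS='{"Authorization": "Bearer token"}'

# Sanity-check that the headers parse as JSON before starting the router
python3 -c 'import json, os; json.loads(os.environ["MCP_SERVER_HEADERS"]); print("headers OK")'

# then launch: npx mcp-research-router
```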
Related MCP Servers
mcp-agent
Build effective agents using Model Context Protocol and simple workflow patterns
sdk-typescript
A model-driven approach to building AI agents in just a few lines of code.
mongodb-lens
🍃🔎 MongoDB Lens: Full Featured MCP Server for MongoDB Databases
openapi
OpenAPI definitions, converters and LLM function calling schema composer.
neurolink
Universal AI Development Platform with MCP server integration, multi-provider support, and professional CLI. Build, test, and deploy AI applications with multiple ai providers.
goai
AI SDK for building AI-powered applications in Go