mcp-batchit
🚀 MCP aggregator for batching multiple tool calls into a single request. Reduces overhead, saves tokens, and simplifies complex operations in AI agent workflows.
claude mcp add --transport stdio ryanjoachim-mcp-batchit node dist/index.js
How to use
MCP BatchIt exposes a single tool, batch_execute, that consolidates multiple MCP tool calls into one request. This reduces token usage and network chatter by executing sub-operations against a target MCP server in parallel, up to a configurable maxConcurrent limit. BatchIt handles spawning or connecting to the downstream MCP server behind the scenes and returns a single consolidated JSON result for the whole batch. It also provides controls such as stopOnError, which halts the remaining operations if one fails, and timeoutMs, which bounds each sub-operation.

You interact with BatchIt by sending a tools/call request with name=batch_execute. The server expects a structure with targetServer information, a list of operations (each specifying a tool and its arguments), and options for parallelism and error handling.

Typical usage includes batching filesystem-style tasks, API calls to other MCP servers, or any sequence of MCP tool invocations that can run in parallel.
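As an illustration, a batch_execute request body might look like the sketch below. The field names (targetServer, operations, options, maxConcurrent, timeoutMs, stopOnError) follow the description above, but the exact schema and the downstream tool names (read_file here) are assumptions; check the server's tool listing for the authoritative shape.

```typescript
// Illustrative batch_execute request body. Field names follow the README's
// description; the downstream filesystem server and its read_file tool are
// hypothetical stand-ins.
const batchRequest = {
  targetServer: {
    command: "npx",
    args: ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"],
  },
  operations: [
    { tool: "read_file", arguments: { path: "/tmp/a.txt" } },
    { tool: "read_file", arguments: { path: "/tmp/b.txt" } },
  ],
  options: {
    maxConcurrent: 2,   // run up to 2 sub-ops in parallel
    timeoutMs: 5000,    // bound each sub-op at 5 seconds
    stopOnError: false, // keep going if one sub-op fails
  },
};

console.log(JSON.stringify(batchRequest, null, 2));
```

Both sub-operations here are independent, so they can safely run in parallel within one batch.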
How to install
Prerequisites:
- Node.js (v14+ recommended) and npm installed
- Git installed
Installation steps:
1. Clone the repository: git clone https://github.com/ryanjoachim/mcp-batchit.git
2. Navigate into the project and install dependencies: cd mcp-batchit && npm install
3. Build the project (if a build step is defined): npm run build
4. Start the MCP BatchIt server: npm start
5. Confirm the server is running: it should print a ready message and listen on STDIO for MCP tool calls.
Additional notes
Tips and notes:
- BatchIt’s design intentionally does not pass data between sub-operations within the same batch_execute call. If a later sub-op depends on earlier results, chain multiple batch_execute calls instead.
- You can tune batch behavior with options like maxConcurrent, timeoutMs, and stopOnError. A higher maxConcurrent increases parallelism but may put more load on the downstream MCP server.
- Typical downstream targets are filesystem servers or other MCP services. BatchIt will spawn or connect to the chosen downstream server and execute all sub-ops in parallel up to the configured limits.
- Ensure the targetServer configuration for each operation correctly identifies the downstream MCP server type and transport, otherwise sub-ops may fail with tool not found or transport errors.
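Since data cannot flow between sub-ops inside a single batch, chaining works by issuing a second batch_execute after the first resolves. The sketch below assumes a generic callTool helper standing in for your MCP client's tools/call method; the downstream tools (read_file, write_file) are hypothetical.

```typescript
// Sketch: chaining two batch_execute calls when a later step depends on an
// earlier result. `callTool` is a hypothetical stand-in for an MCP client's
// tools/call method; it is not part of BatchIt.
async function chainedBatches(
  callTool: (name: string, args: object) => Promise<any>,
) {
  // Batch 1: independent reads run in parallel.
  const first = await callTool("batch_execute", {
    targetServer: { /* downstream server config */ },
    operations: [
      { tool: "read_file", arguments: { path: "/tmp/in1.txt" } },
      { tool: "read_file", arguments: { path: "/tmp/in2.txt" } },
    ],
    options: { maxConcurrent: 2 },
  });

  // Batch 2: consumes batch 1's results. Data cannot pass between sub-ops
  // inside one batch, so the dependent write needs its own batch_execute.
  return callTool("batch_execute", {
    targetServer: { /* same downstream server config */ },
    operations: [
      {
        tool: "write_file",
        arguments: { path: "/tmp/out.txt", content: JSON.stringify(first) },
      },
    ],
    options: { stopOnError: true },
  });
}
```

Keeping each batch limited to mutually independent operations is what makes the parallelism safe.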
Related MCP Servers
langchainjs-adapters
** THIS REPO HAS MOVED TO https://github.com/langchain-ai/langchainjs/tree/main/libs/langchain-mcp-adapters ** Adapters for integrating Model Context Protocol (MCP) tools with LangChain.js applications, supporting both stdio and SSE transports.
mcp-gm
wanna develop an app ❓
IoT-Edge
MCP server for Industrial IoT, SCADA and PLC systems. Unifies MQTT sensors, Modbus devices and industrial equipment into a single AI-orchestrable API. Features real-time monitoring, alarms, time-series storage and actuator control.
cadre-ai
Your AI agent squad for Claude Code. 17 specialized agents, persistent memory, desktop automation, and a common sense engine.
agent-orchestration
Agent Orchestration: MCP server enabling multi-agent collaboration with shared memory, task queue, resource locks, Cursor rules, and AGENTS.md workflows.
xcode
MCP server for Xcode - enables AI assistants to create, build, test, and manage iOS/macOS projects programmatically