
mcp-batchit

🚀 MCP aggregator for batching multiple tool calls into a single request. Reduces overhead, saves tokens, and simplifies complex operations in AI agent workflows.

Installation
Run this command in your terminal to add the MCP server to Claude Code (run it from the project directory after building, so that dist/index.js exists):

claude mcp add --transport stdio ryanjoachim-mcp-batchit node dist/index.js

How to use

MCP BatchIt exposes a single tool, batch_execute, that consolidates multiple MCP tool calls into one request. Sub-operations run in parallel against a target MCP server, up to a configurable maxConcurrent limit, which reduces token usage and network chatter. BatchIt spawns or connects to the downstream MCP server behind the scenes and returns a single consolidated JSON result for the whole batch. It also provides controls such as stopOnError, to halt remaining ops if one fails, and timeoutMs, to bound each sub-op.

You interact with BatchIt by sending a tools/call request with name=batch_execute. The request carries targetServer information, a list of operations (each specifying a tool and its arguments), and options for parallelism and error handling. Typical uses include batching filesystem-like tasks, API calls to other MCP servers, or any sequence of MCP tool invocations that can run in parallel.
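To make the request shape concrete, here is a sketch of a batch_execute payload built from the fields named above (targetServer, operations, maxConcurrent, stopOnError, timeoutMs). The exact schema is defined by mcp-batchit itself, and the filesystem-server target is only an illustrative example; treat the nesting here as an assumption, not the authoritative format.

```typescript
// Hypothetical shape of a batch_execute request, based on the fields
// described above. Check mcp-batchit's own docs for the real schema.
interface BatchOperation {
  tool: string;                       // downstream tool to invoke
  arguments: Record<string, unknown>; // arguments for that tool
}

interface BatchExecuteRequest {
  targetServer: {
    command: string;  // how BatchIt spawns/connects to the downstream server
    args?: string[];
  };
  operations: BatchOperation[];
  options?: {
    maxConcurrent?: number; // parallelism cap for sub-ops
    stopOnError?: boolean;  // halt remaining ops on first failure
    timeoutMs?: number;     // per-sub-op timeout
  };
}

// Example: read two files in parallel via a filesystem MCP server.
const request: BatchExecuteRequest = {
  targetServer: {
    command: "npx",
    args: ["@modelcontextprotocol/server-filesystem", "/tmp"],
  },
  operations: [
    { tool: "read_file", arguments: { path: "/tmp/a.txt" } },
    { tool: "read_file", arguments: { path: "/tmp/b.txt" } },
  ],
  options: { maxConcurrent: 2, stopOnError: true, timeoutMs: 5000 },
};
```

Both sub-ops target the same downstream server, so BatchIt can run them concurrently and return one consolidated result instead of two round trips.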

How to install

Prerequisites:

  • Node.js (v14+ recommended) and npm installed
  • Git installed

Installation steps:

  1. Clone the repository:
     git clone https://github.com/ryanjoachim/mcp-batchit.git

  2. Navigate to the project and install dependencies:
     cd mcp-batchit
     npm install

  3. Build the project (if a build step is defined):
     npm run build

  4. Start the MCP BatchIt server:
     npm start

  5. Confirm the server is running (it should print a ready message and listen on STDIO for MCP tool calls).
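One way to confirm the server responds: MCP speaks JSON-RPC 2.0 over STDIO, so after the usual initialize handshake, a tools/list request should return batch_execute in the tool list. The sketch below only builds the wire message; piping it to the server process's stdin and reading stdout is left out.

```typescript
// JSON-RPC 2.0 envelope for an MCP tools/list request. Each message is
// sent to the server's stdin as a single line of JSON (after the
// initialize handshake, which is omitted here for brevity).
const toolsListRequest = {
  jsonrpc: "2.0" as const,
  id: 1,
  method: "tools/list",
};

const wire = JSON.stringify(toolsListRequest);
// The server's response should list batch_execute among its tools.
```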

Additional notes

Tips and notes:

  • BatchIt’s design intentionally does not pass data between sub-ops within the same batch_execute call. If a later sub-op depends on an earlier result, chain multiple batch_execute calls instead.
  • You can tune batch behavior with options like maxConcurrent, timeoutMs, and stopOnError. A higher maxConcurrent increases parallelism but may put more load on the downstream MCP server.
  • Typical downstream targets are filesystem servers or other MCP services. BatchIt will spawn or connect to the chosen downstream server and execute all sub-ops in parallel up to the configured limits.
  • Ensure the targetServer configuration for each batch correctly identifies the downstream MCP server type and transport; otherwise sub-ops may fail with “tool not found” or transport errors.
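Since results never flow between sub-ops inside one batch, dependent work is split across calls: run batch 1, inspect its consolidated result, then build batch 2 from it. A minimal sketch of that pattern follows; the helper and result shape are illustrative, not part of BatchIt's API.

```typescript
// Sketch of chaining two batch_execute calls when the second depends on
// results of the first. The result shape ({ path }) and this helper are
// hypothetical; adapt them to what your downstream server returns.
type Op = { tool: string; arguments: Record<string, unknown> };

function buildFollowUpOps(firstBatchResults: { path: string }[]): Op[] {
  // Each result from batch 1 (e.g. a directory listing) becomes one
  // sub-op in batch 2.
  return firstBatchResults.map((r) => ({
    tool: "read_file",
    arguments: { path: r.path },
  }));
}

// Example: pretend batch 1 discovered two paths.
const followUpOps = buildFollowUpOps([
  { path: "/tmp/a.txt" },
  { path: "/tmp/b.txt" },
]);
// followUpOps now holds one read_file sub-op per discovered path,
// ready to send as the operations array of a second batch_execute call.
```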
