
programmatic-tool-calling-ai-sdk

⚡ Cut LLM inference costs by up to 80% with Programmatic Tool Calling. Instead of N tool-call round-trips, the model generates JavaScript that orchestrates tools inside a Vercel Sandbox. Supports Anthropic, OpenAI, and 100+ models via the AI Gateway, plus a novel MCP Bridge for external service integration.

Installation
Run this command in your terminal to add the MCP server to Claude Code:

```bash
claude mcp add --transport stdio cameronking4-programmatic-tool-calling-ai-sdk node server.js \
  --env PORT="3000" \
  --env MCP_HOST="http://localhost:3000" \
  --env AI_SDK_KEY="your-ai-sdk-key" \
  --env VERCEL_SANDBOX="true"
```

How to use

This MCP server provides a programmatic tool calling workflow built on top of the Vercel AI SDK. It enables LLMs to generate executable JavaScript code that orchestrates multiple tools in a single code execution, then runs that code in a sandbox and returns only the final result to the model. The approach reduces latency and token usage by avoiding multi-round tool calls and by parallelizing tool executions inside a secure sandbox. The server integrates MCP, the Vercel Sandbox environment, and the tool-wrapping utilities to support dynamic tool schemas and runtime safety checks.

You can use this server to orchestrate both local tools and MCP-compatible tools via generated code. It includes defensive helpers like safeGet, toArray, and isSuccess, along with an MCP Bridge that handles communication between the sandbox and MCP endpoints over HTTP, SSE, or stdio. Depending on your model provider, you can use direct providers (OpenAI, Anthropic) or the Vercel AI Gateway to access 100+ models, while benefiting from the code-driven orchestration pattern that minimizes context growth and improves latency.
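As a sketch of the pattern (the tool names, result shapes, and helper implementations below are illustrative stand-ins, not the server's actual code), a generated orchestration script might fan out several tool calls in parallel and return only the final value:

```javascript
// Illustrative sketch of the programmatic tool-calling pattern: one generated
// script calls several tools concurrently and returns a single result, instead
// of N sequential model round-trips. safeGet/isSuccess mimic the server's
// defensive helpers; the tools object mocks wrapped local/MCP tools.
const isSuccess = (r) => r && r.ok === true;
const safeGet = (obj, key, fallback) =>
  obj && obj[key] !== undefined ? obj[key] : fallback;

// Mock tools standing in for the server's wrapped tools.
const tools = {
  search: async (query) => ({ ok: true, text: `result for ${query}` }),
  summarize: async (texts) => ({ ok: true, text: texts.join('; ') }),
};

async function main() {
  // Parallel fan-out: both searches run concurrently inside the sandbox.
  const results = await Promise.all([tools.search('alpha'), tools.search('beta')]);
  // Defensive extraction: skip failed calls, never throw on missing fields.
  const texts = results.filter(isSuccess).map((r) => safeGet(r, 'text', ''));
  const summary = await tools.summarize(texts);
  return safeGet(summary, 'text', '');
}

main().then((out) => console.log(out)); // prints: result for alpha; result for beta
```

Only the final string reaches the model's context; the intermediate search results stay inside the sandbox.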

How to install

Prerequisites:

  • Node.js (≥ 18) and npm/yarn installed
  • Git installed
  • Access to the MCP endpoint or backend where the server will run
1. Clone the repository

```bash
git clone https://github.com/cameronking4/programmatic-tool-calling-ai-sdk.git
cd programmatic-tool-calling-ai-sdk
```

2. Install dependencies

```bash
npm install
# or
yarn install
```

3. Configure the environment (example)

Create a .env file or export the variables directly:

```bash
export PORT=3000
export MCP_HOST="http://localhost:3000"
export VERCEL_SANDBOX=true
export AI_SDK_KEY="your-ai-sdk-key"
```

4. Run the server

```bash
node server.js
# or with an npm script (if provided):
npm run start
```

5. Verify the MCP endpoint is reachable (default: http://localhost:3000)

```bash
curl http://localhost:3000/health
```

Additional notes

Tips and caveats:

  • Ensure your AI SDK key and MCP endpoint are properly configured to enable model access and tool invocation.
  • The sandbox runs a Node.js 22 runtime; ensure your generated code adheres to the sandbox's supported APIs and async patterns.
  • When debugging, enable verbose logging in the server to inspect code generation, tool wrappers, and MCP responses.
  • If using MCP Bridge, confirm proper transport configuration (HTTP, SSE, or Stdio) in your client integration.
  • For tooling, ensure your local tools are exposed with stable schemas (e.g., Zod or JSON Schema) to facilitate safe wrapping inside the generated code.
  • Consider setting a reasonable timeout for code execution inside the sandbox to avoid runaway code.
  • Review error handling helpers like isSuccess, safeGet, and extractText to gracefully manage irregular MCP responses.
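To illustrate the stable-schema point above, here is a minimal sketch of wrapping a local tool so generated code can call it safely. The wrapper shape, field names, and schema format are hypothetical; in practice you would define schemas with Zod or JSON Schema as noted in the tips.

```javascript
// Hypothetical sketch: wrap a local tool behind a declared input schema and a
// uniform { ok, value | error } result shape, so generated code can branch on
// success without try/catch around every call.
function wrapTool(name, schema, fn) {
  return async (input) => {
    // Validate each declared field's type before invoking the tool.
    for (const [key, type] of Object.entries(schema)) {
      if (typeof input?.[key] !== type) {
        return { ok: false, error: `${name}: expected ${key} to be ${type}` };
      }
    }
    try {
      return { ok: true, value: await fn(input) };
    } catch (err) {
      // Errors are normalized into the same result shape.
      return { ok: false, error: String(err) };
    }
  };
}

// Example local tool with a { city: string } input schema (mock data).
const getTemp = wrapTool('getTemp', { city: 'string' }, async ({ city }) =>
  city === 'Berlin' ? 18 : 20
);

getTemp({ city: 'Berlin' }).then((r) => console.log(r)); // { ok: true, value: 18 }
getTemp({}).then((r) => console.log(r.ok)); // false (validation failed)
```

The uniform result shape is what makes helpers like isSuccess and safeGet effective: generated code can treat every tool call the same way regardless of which tool failed or how.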
