programmatic-tool-calling-ai-sdk
⚡ Cut LLM inference costs 80% with Programmatic Tool Calling. Instead of N tool call round-trips, generate JavaScript to orchestrate tools in Vercel Sandbox. Supports Anthropic, OpenAI, 100+ models via AI Gateway. Novel MCP Bridge for external service integration.
```bash
claude mcp add --transport stdio cameronking4-programmatic-tool-calling-ai-sdk node server.js \
  --env PORT="3000" \
  --env MCP_HOST="http://localhost:3000" \
  --env AI_SDK_KEY="your-ai-sdk-key" \
  --env VERCEL_SANDBOX="true"
```
How to use
This MCP server provides a programmatic tool calling workflow built on top of the Vercel AI SDK. It enables LLMs to generate executable JavaScript code that orchestrates multiple tools in a single code execution, then runs that code in a sandbox and returns only the final result to the model. The approach reduces latency and token usage by avoiding multi-round tool calls and by parallelizing tool executions inside a secure sandbox. The server integrates MCP, the Vercel Sandbox environment, and the tool-wrapping utilities to support dynamic tool schemas and runtime safety checks.
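The pattern can be illustrated with a minimal sketch. The tool names below (`fetchUser`, `fetchOrders`, `fetchCredits`) are hypothetical stand-ins, not tools this server ships; the point is that one generated script replaces several model round-trips:

```javascript
// Mock stand-ins for wrapped tools the server would inject into the sandbox.
const tools = {
  fetchUser: async (id) => ({ id, name: `user-${id}` }),
  fetchOrders: async (userId) => [{ userId, total: 42 }, { userId, total: 8 }],
  fetchCredits: async (userId) => ({ userId, credits: 100 }),
};

// Shape of the code an LLM might generate: independent tool calls run
// in parallel inside the sandbox instead of as sequential round-trips,
// and only the compact final result is returned to the model's context.
async function generatedOrchestration(userId) {
  const user = await tools.fetchUser(userId);
  const [orders, credits] = await Promise.all([
    tools.fetchOrders(user.id),
    tools.fetchCredits(user.id),
  ]);
  const spent = orders.reduce((sum, o) => sum + o.total, 0);
  return { name: user.name, spent, credits: credits.credits };
}

generatedOrchestration(7).then((result) => console.log(JSON.stringify(result)));
// → {"name":"user-7","spent":50,"credits":100}
```

With classic tool calling, each of the three fetches would cost a full model round-trip and its intermediate output would land in the context window; here the model sees only the final summary object.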
You can use this server to orchestrate both local tools and MCP-compatible tools via generated code. It includes defenses like safeGet, toArray, and isSuccess helpers, along with an MCP Bridge that handles communication between the sandbox and MCP endpoints over HTTP, SSE, or stdio. Depending on your model provider, you can leverage direct providers (OpenAI, Anthropic) or the Vercel AI Gateway to access 100+ models, while benefiting from the code-driven orchestration pattern that minimizes context growth and improves latency.
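As a rough illustration of the HTTP transport side of the bridge, sandboxed code could call back to an MCP endpoint with a helper like the one below. The `/tools/call` path and the request payload shape are assumptions for illustration only, not this server's actual wire format:

```javascript
// Hypothetical sketch: invoke an MCP tool over an HTTP bridge endpoint.
// The path and JSON shape are illustrative assumptions.
async function callMcpTool(host, name, args) {
  const res = await fetch(`${host}/tools/call`, {
    method: "POST",
    headers: { "content-type": "application/json" },
    body: JSON.stringify({ name, arguments: args }),
  });
  if (!res.ok) throw new Error(`MCP bridge error: HTTP ${res.status}`);
  return res.json();
}
```

The SSE and stdio transports follow the same request/response contract; only the framing differs.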
How to install
Prerequisites:
- Node.js (≥ 18) and npm/yarn installed
- Git installed
- Access to the MCP endpoint or backend where the server will run
1) Clone the repository

```bash
git clone https://github.com/cameronking4-programmatic-tool-calling-ai-sdk.git
cd cameronking4-programmatic-tool-calling-ai-sdk
```

2) Install dependencies

```bash
npm install
# or
yarn install
```

3) Configure the environment

Create a `.env` file or export the variables directly:

```bash
PORT=3000
MCP_HOST=http://localhost:3000
VERCEL_SANDBOX=true
AI_SDK_KEY=your-ai-sdk-key
```

4) Run the server

```bash
node server.js
# or with npm script (if provided):
npm run start
```

5) Verify the MCP endpoint is reachable (default: http://localhost:3000)

```bash
curl http://localhost:3000/health
```
Additional notes
Tips and caveats:
- Ensure your AI SDK key and MCP endpoint are properly configured to enable model access and tool invocation.
- The sandbox runs Node.js 22 runtime; ensure your generated code adheres to the sandbox's supported APIs and async patterns.
- When debugging, enable verbose logging in the server to inspect code generation, tool wrappers, and MCP responses.
- If using MCP Bridge, confirm proper transport configuration (HTTP, SSE, or Stdio) in your client integration.
- For tooling, ensure your local tools are exposed with stable schemas (e.g., Zod or JSON Schema) to facilitate safe wrapping inside the generated code.
- Consider setting a reasonable timeout for code execution inside the sandbox to avoid runaway code.
- Review error handling helpers like isSuccess, safeGet, and extractText to gracefully manage irregular MCP responses.
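The tips above mention an execution timeout and the defensive helpers; the sketches below show plausible shapes for them. These are illustrative versions written for this README, and the repository's actual implementations may differ:

```javascript
// withTimeout: race a tool/code execution against a deadline so runaway
// sandbox code cannot block the request indefinitely.
function withTimeout(promise, ms) {
  let timer;
  const timeout = new Promise((_, reject) => {
    timer = setTimeout(() => reject(new Error(`timed out after ${ms}ms`)), ms);
  });
  return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
}

// safeGet: walk a dotted path without throwing on missing intermediates.
function safeGet(obj, path, fallback = undefined) {
  const value = path.split(".").reduce(
    (acc, key) => (acc != null && key in Object(acc) ? acc[key] : undefined),
    obj
  );
  return value ?? fallback;
}

// toArray: normalize null, a scalar, or an array into an array.
function toArray(value) {
  if (value == null) return [];
  return Array.isArray(value) ? value : [value];
}

// isSuccess: treat an MCP-style response as successful unless it is
// missing or explicitly flagged as an error.
function isSuccess(response) {
  return response != null && response.isError !== true;
}

// extractText: pull text parts out of an MCP-style content array.
function extractText(response) {
  return toArray(safeGet(response, "content", []))
    .filter((part) => part && part.type === "text")
    .map((part) => part.text)
    .join("\n");
}
```

Generated orchestration code can then guard every MCP response, e.g. `if (isSuccess(res)) summary.push(extractText(res));`, so one malformed tool result does not abort the whole script.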
Related MCP Servers
openmcp
Turn any openapi file into an mcp server, with just the tools you need.
modex
Modex is a Clojure MCP Library to augment your AI models with Tools, Resources & Prompts using Clojure (Model Context Protocol). Implements MCP Server & Client.
pluggedin-app
The Crossroads for AI Data Exchanges. A unified, self-hostable web interface for discovering, configuring, and managing Model Context Protocol (MCP) servers—bringing together AI tools, workspaces, prompts, and logs from multiple MCP sources (Claude, Cursor, etc.) under one roof.
rust-schema
A type-safe implementation of the official Model Context Protocol (MCP) schema in Rust.
create-app
A CLI tool for quickly scaffolding Model Context Protocol (MCP) server applications with TypeScript support and modern development tooling
LLaMa-Streamlit
AI assistant built with Streamlit, NVIDIA NIM (LLaMa 3.3:70B) / Ollama, and the Model Context Protocol (MCP).