claude-worker-proxy
Converts the Gemini and OpenAI APIs to a Claude-compatible format and serves them via Cloudflare Workers. 🚀 Zero-config, streaming-ready, with tool calls and one-click deployment.
claude mcp add --transport stdio darwin200420-claude-worker-proxy node src/worker.js \
  --env LOG_LEVEL="info" \
  --env CLAUDE_API_KEY="Your Claude API key" \
  --env CLAUDE_API_BASE="https://api.claude.example" \
  --env PROXY_TIMEOUT_MS="10000" \
  --env PROXY_ALLOWED_ORIGINS="https://your-app.example.com"
How to use
Claude Worker Proxy is a lightweight Cloudflare Worker-based proxy that routes requests to Claude Code services at the edge. It is designed to minimize latency by handling routing, header normalization, and optional transformations close to users, while keeping configuration simple and maintainable. The proxy supports configuring allowed origins, timeouts, and logging, so you can tailor it to your deployment environment. After deployment, you can point your front-end to the proxy URL, which will forward requests to Claude's API and return responses with minimal processing.
To use the proxy, ensure you have a Claude API key and base URL configured. The worker accepts standard Claude Code API requests and transparently forwards them, applying any defined routing rules and header normalization. You can adjust runtime options such as allowed origins, request timeouts, and log verbosity to suit your security and performance needs. The setup is designed for quick deployment via Wrangler, so you can publish to Cloudflare Workers with minimal boilerplate.
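As a rough sketch of what this forwarding looks like, the snippet below shows a Worker-style fetch handler that normalizes headers and applies a timeout. It is illustrative only: the real logic lives in src/worker.js, and names like normalizeHeaders, the stripped header list, and the anthropic-version value are assumptions, not the project's actual API.

```javascript
// Hypothetical sketch of the proxy's forwarding path (not the project's code).
function normalizeHeaders(incoming, apiKey) {
  const headers = new Headers(incoming);
  // Drop edge-specific and hop-by-hop headers before forwarding upstream.
  for (const name of ["host", "connection", "cf-connecting-ip", "cf-ray"]) {
    headers.delete(name);
  }
  // Inject Claude-style auth headers.
  headers.set("x-api-key", apiKey);
  headers.set("anthropic-version", "2023-06-01");
  return headers;
}

const worker = {
  // In a real Worker, this object is `export default`-ed from src/worker.js.
  async fetch(request, env) {
    const url = new URL(request.url);
    // Rebase the incoming path and query onto the configured Claude base URL.
    const upstream = new URL(url.pathname + url.search, env.CLAUDE_API_BASE);
    const timeoutMs = Number(env.PROXY_TIMEOUT_MS ?? "10000");
    return fetch(upstream, {
      method: request.method,
      headers: normalizeHeaders(request.headers, env.CLAUDE_API_KEY),
      body: request.body,
      // Abort the upstream call if it exceeds the configured timeout.
      signal: AbortSignal.timeout(timeoutMs),
    });
  },
};
```

The handler returns the upstream Response as-is, which is what lets streaming responses pass through without buffering.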
How to install
Prerequisites:
- Node.js (LTS) installed
- npm (comes with Node.js)
- Cloudflare Wrangler CLI installed
- A Cloudflare account with a zone and route to attach the worker
- Claude API credentials (API key and base URL)
Step-by-step installation:
- Install Wrangler globally: npm install -g wrangler
- Authenticate with Cloudflare: wrangler login
- Create or configure your project:
  - If starting from an existing repo, navigate to the project directory.
  - Ensure wrangler.toml is present and configured (account_id, zone_id, route, etc.).
- Install dependencies (if needed): npm install
- Configure environment variables:
  - In wrangler.toml, or via Wrangler environment settings, set CLAUDE_API_KEY, CLAUDE_API_BASE, PROXY_ALLOWED_ORIGINS, PROXY_TIMEOUT_MS, and LOG_LEVEL.
- Build and publish to Cloudflare Workers: wrangler publish
- Validate the deployment:
  - Open the route URL in a browser or run curl against the route to confirm requests are proxied to Claude.
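The validation step can be exercised with curl. Everything below is a placeholder: the route URL, the model name, and the /v1/messages path all assume the proxy forwards Claude-style requests; substitute your own values after deploying.

```shell
# Placeholder route; substitute your own workers.dev subdomain or custom route.
PROXY_URL="https://claude-worker-proxy.example.workers.dev"

# Send a minimal Claude-style messages request through the proxy.
# The placeholder host will not resolve until you substitute a real route,
# so report failure instead of aborting.
curl -sS -m 15 "$PROXY_URL/v1/messages" \
  -H "content-type: application/json" \
  -d '{"model":"claude-3-5-haiku-latest","max_tokens":32,"messages":[{"role":"user","content":"ping"}]}' \
  || echo "proxy unreachable: check the route URL and wrangler deployment"
```

A successful response is a Claude-format JSON message body; an error body from the upstream API usually points at a missing or invalid CLAUDE_API_KEY.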
Notes:
- The repository layout includes src/ with the worker logic and a wrangler.toml for deployment. If you need to customize the entry point, modify src/worker.js (or change the entry point configured in wrangler.toml).
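The layout above maps to a wrangler.toml roughly along these lines (a sketch, not the project's actual file; the name, compatibility date, and values are placeholders). Keep CLAUDE_API_KEY out of the file and set it as a secret with `wrangler secret put CLAUDE_API_KEY` instead:

```toml
name = "claude-worker-proxy"
main = "src/worker.js"
compatibility_date = "2024-01-01"

# account_id and route/zone settings go here when attaching the worker to a zone.

[vars]
CLAUDE_API_BASE = "https://api.claude.example"
PROXY_ALLOWED_ORIGINS = "https://your-app.example.com"
PROXY_TIMEOUT_MS = "10000"
LOG_LEVEL = "info"
```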
Additional notes
Helpful tips:
- Keep CLAUDE_API_KEY secure; avoid embedding in public repositories.
- Use PROXY_TIMEOUT_MS to prevent long hangs; adjust based on your latency expectations.
- Set PROXY_ALLOWED_ORIGINS to restrict who can call the proxy; use a comma-separated list.
- Raise LOG_LEVEL as needed for debugging, but reduce it in production to minimize log volume.
- Regularly update to the latest release to incorporate security fixes and feature improvements from the upstream project.
- If you encounter deployment issues, check wrangler logs and Cloudflare Workers’ live logs for error messages.
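The PROXY_ALLOWED_ORIGINS tip above can be sketched as a small check. The function names are illustrative, not the project's actual API; the real worker may match origins differently.

```javascript
// Parse the comma-separated PROXY_ALLOWED_ORIGINS value once at startup,
// then compare each request's Origin header against the resulting list.
function parseAllowedOrigins(value) {
  return (value ?? "")
    .split(",")
    .map((s) => s.trim())
    .filter((s) => s.length > 0);
}

function isAllowedOrigin(origin, allowed) {
  // An empty allow-list means no cross-origin caller is accepted.
  return allowed.includes(origin);
}
```

For example, with PROXY_ALLOWED_ORIGINS set to "https://a.example.com, https://b.example.com", a request whose Origin is https://b.example.com passes and any other origin is rejected.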
Related MCP Servers
- claude-code-guide: Claude Code Guide covering setup, commands, workflows, agents, skills, and tips & tricks.
- codecompanion-history.nvim: A history-management extension for the codecompanion AI chat plugin that enables saving, browsing, and restoring chat sessions.
- Wazuh: AI-powered security operations for the Wazuh SIEM; use any MCP-compatible client to ask security questions in plain English. Faster threat detection, incident triage, and compliance checks with real-time monitoring and anomaly spotting. A production-ready MCP server for conversational SOC workflows.
- Agent-Fusion: A local RAG semantic search engine that gives AI agents instant access to your code and documentation (Markdown, Word, PDF). Query your codebase from code agents without hallucinations. Runs 100% locally, includes a lightweight embedding model and optional multi-agent task orchestration, and deploys as a single JAR.
- sugar: 🍰 The autonomous layer for AI coding agents.
- mcp-tasks: A comprehensive and efficient MCP server for task management with multi-format support (Markdown, JSON, YAML).