cloudflare-workers

npx machina-cli add skill tenequm/claude-plugins/cloudflare-workers --openclaw

Cloudflare Workers

Overview

Cloudflare Workers is a serverless execution environment that runs JavaScript, TypeScript, Python, and Rust code on Cloudflare's global network. Workers execute in milliseconds, scale automatically, and integrate with Cloudflare's storage and compute products through bindings.

Key Benefits:

  • Zero cold starts - Workers run in V8 isolates, not containers
  • Global deployment - Code runs in 300+ cities worldwide
  • Rich ecosystem - Bindings to D1, KV, R2, Durable Objects, Queues, Containers, Workflows, and more
  • Full-stack capable - Build APIs and serve static assets in one project
  • Standards-based - Uses Web APIs (fetch, crypto, streams, WebSockets)

When to Use This Skill

Use Cloudflare Workers for:

  • APIs and backends - RESTful APIs, GraphQL, tRPC, WebSocket servers
  • Full-stack applications - React, Next.js, Remix, Astro, Vue, Svelte with static assets
  • Edge middleware - Authentication, rate limiting, A/B testing, routing
  • Background processing - Scheduled jobs (cron), queue consumers, webhooks
  • Data transformation - ETL pipelines, real-time data processing
  • AI applications - RAG systems, chatbots, image generation with Workers AI
  • Durable workflows - Multi-step long-running tasks with automatic retries (Workflows)
  • Container workloads - Run Docker containers alongside Workers (Containers)
  • MCP servers - Host remote Model Context Protocol servers
  • Proxy and gateway - API gateways, content transformation, protocol translation

Quick Start Workflow

1. Install Wrangler CLI

npm install -g wrangler

# Log in to Cloudflare
wrangler login

2. Create a New Worker

# Using C3 (create-cloudflare) - recommended
npm create cloudflare@latest my-worker

# Or create manually
wrangler init my-worker
cd my-worker

3. Write Your Worker

Basic HTTP API (TypeScript):

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const url = new URL(request.url);

    if (url.pathname === "/api/hello") {
      return Response.json({ message: "Hello from Workers!" });
    }

    return new Response("Not found", { status: 404 });
  },
};

With environment variables and KV:

interface Env {
  MY_VAR: string;
  MY_KV: KVNamespace;
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    // Access environment variable
    const greeting = env.MY_VAR;

    // Read from KV
    const value = await env.MY_KV.get("my-key");

    return Response.json({ greeting, value });
  },
};

4. Develop Locally

# Start local development server with hot reload
wrangler dev

# Access at http://localhost:8787

5. Deploy to Production

# Deploy to workers.dev subdomain
wrangler deploy

# Deploy to custom domain (configure in wrangler.toml)
wrangler deploy

Core Concepts

Workers Runtime

Workers use the V8 JavaScript engine with Web Standard APIs:

  • Execution model: Isolates (not containers) - instant cold starts
  • CPU time limit: 10ms (Free), 30s (Paid) per request
  • Memory limit: 128 MB per isolate
  • Languages: JavaScript, TypeScript, Python, Rust
  • APIs: fetch, crypto, streams, WebSockets, WebAssembly

Supported APIs:

  • Fetch API (HTTP requests)
  • URL API (URL parsing)
  • Web Crypto (encryption, hashing)
  • Streams API (data streaming)
  • WebSockets (real-time communication)
  • Cache API (edge caching)
  • HTMLRewriter (streaming HTML transformation)

Handlers

Workers respond to events through handlers:

Fetch Handler (HTTP requests):

export default {
  async fetch(request: Request, env: Env, ctx: ExecutionContext) {
    return new Response("Hello!");
  },
};

Scheduled Handler (cron jobs):

export default {
  async scheduled(event: ScheduledEvent, env: Env, ctx: ExecutionContext) {
    // Runs on schedule defined in wrangler.toml
    await env.MY_KV.put("last-run", new Date().toISOString());
  },
};

Queue Handler (message processing):

export default {
  async queue(batch: MessageBatch<any>, env: Env, ctx: ExecutionContext) {
    for (const message of batch.messages) {
      await processMessage(message.body);
      message.ack();
    }
  },
};
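In practice you usually want per-message error handling so one bad message doesn't fail the whole batch. A sketch of that logic, written against a minimal structural type — `QueueMessage` and `processMessage` here are illustrative stand-ins, not Workers API names:

```typescript
// Per-message error handling for a queue batch: ack successes, retry failures.
// `QueueMessage` is a minimal structural stand-in for the runtime's Message type.
interface QueueMessage<T> {
  body: T;
  ack(): void;
  retry(): void;
}

async function handleBatch<T>(
  messages: QueueMessage<T>[],
  processMessage: (body: T) => Promise<void>,
): Promise<{ acked: number; retried: number }> {
  let acked = 0;
  let retried = 0;
  for (const message of messages) {
    try {
      await processMessage(message.body);
      message.ack();
      acked++;
    } catch {
      // Retried messages are redelivered, up to the queue's configured retry limit.
      message.retry();
      retried++;
    }
  }
  return { acked, retried };
}
```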

Bindings

Bindings connect your Worker to Cloudflare resources. Configure in wrangler.toml:

KV (Key-Value Storage):

[[kv_namespaces]]
binding = "MY_KV"
id = "your-kv-namespace-id"

// Usage in the Worker
await env.MY_KV.put("key", "value");
const value = await env.MY_KV.get("key");

D1 (SQL Database):

[[d1_databases]]
binding = "DB"
database_name = "my-database"
database_id = "your-database-id"

// Usage in the Worker
const result = await env.DB.prepare(
  "SELECT * FROM users WHERE id = ?"
).bind(userId).all();

R2 (Object Storage):

[[r2_buckets]]
binding = "MY_BUCKET"
bucket_name = "my-bucket"

// Usage in the Worker
await env.MY_BUCKET.put("file.txt", "contents");
const object = await env.MY_BUCKET.get("file.txt");
const text = await object?.text();

Environment Variables:

[vars]
API_KEY = "development-key"

Secrets (sensitive data):

# Set via CLI (not in wrangler.toml)
wrangler secret put API_KEY

Context (ctx)

The ctx parameter provides control over request lifecycle:

export default {
  async fetch(request: Request, env: Env, ctx: ExecutionContext) {
    // Run tasks after response is sent
    ctx.waitUntil(
      env.MY_KV.put("request-count", String(Date.now()))
    );

    // Pass through to origin on exception
    ctx.passThroughOnException();

    return new Response("OK");
  },
};

Top-level Environment Access

Since March 2025, you can import env at the module level instead of passing it through handlers:

import { env } from "cloudflare:workers";

// Access bindings outside of handlers
const apiClient = new ApiClient({ apiKey: env.API_KEY });

export default {
  async fetch(request: Request): Promise<Response> {
    // env is also available here without the parameter
    const data = await env.MY_KV.get("config");
    return Response.json({ data });
  },
};

This eliminates prop-drilling env through function signatures and enables module-level initialization.

Rapid Development Patterns

Wrangler Configuration

Essential wrangler.toml:

name = "my-worker"
main = "src/index.ts"
compatibility_date = "2025-09-01"

# Custom domain
routes = [
  { pattern = "api.example.com/*", zone_name = "example.com" }
]

# Or workers.dev subdomain
workers_dev = true

# Environment variables
[vars]
ENVIRONMENT = "production"

# Bindings
[[kv_namespaces]]
binding = "CACHE"
id = "your-kv-id"

[[d1_databases]]
binding = "DB"
database_name = "production-db"
database_id = "your-db-id"

[[r2_buckets]]
binding = "ASSETS"
bucket_name = "my-assets"

# Cron triggers
[triggers]
crons = ["0 0 * * *"]  # Daily at midnight

Environment Management

Use environments for staging/production:

[env.staging]
vars = { ENVIRONMENT = "staging" }

[[env.staging.d1_databases]]
binding = "DB"
database_name = "staging-db"
database_id = "staging-db-id"

[env.production]
vars = { ENVIRONMENT = "production" }

[[env.production.d1_databases]]
binding = "DB"
database_name = "production-db"
database_id = "production-db-id"

# Deploy to staging
wrangler deploy --env staging

# Deploy to production
wrangler deploy --env production

Common Patterns

JSON API with Error Handling:

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    try {
      const url = new URL(request.url);

      if (url.pathname === "/api/users" && request.method === "GET") {
        const users = await env.DB.prepare("SELECT * FROM users").all();
        return Response.json(users.results);
      }

      if (url.pathname === "/api/users" && request.method === "POST") {
        const body = await request.json();
        await env.DB.prepare(
          "INSERT INTO users (name, email) VALUES (?, ?)"
        ).bind(body.name, body.email).run();
        return Response.json({ success: true }, { status: 201 });
      }

      return Response.json({ error: "Not found" }, { status: 404 });
    } catch (error) {
      // `error` is `unknown` in TypeScript; narrow it before reading .message
      const message = error instanceof Error ? error.message : "Internal error";
      return Response.json({ error: message }, { status: 500 });
    }
  },
};
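As routes accumulate, the if/else chain above gets unwieldy. One common refactor is a small route table keyed by method and path — this is an illustrative pattern, not a Workers API (for anything larger, a router like Hono is a better fit):

```typescript
// Minimal route table: maps "METHOD /path" keys to handlers.
type Handler = (request: Request) => Promise<Response> | Response;

function createRouter(routes: Record<string, Handler>) {
  return (request: Request): Promise<Response> | Response => {
    const { pathname } = new URL(request.url);
    const handler = routes[`${request.method} ${pathname}`];
    if (!handler) {
      return Response.json({ error: "Not found" }, { status: 404 });
    }
    return handler(request);
  };
}

const route = createRouter({
  "GET /api/hello": () => Response.json({ message: "Hello from Workers!" }),
});
```

This covers exact-match paths only; path parameters like `/users/:id` need pattern matching, which is where a framework earns its keep.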

Authentication Middleware:

async function authenticate(request: Request, env: Env): Promise<string | null> {
  const authHeader = request.headers.get("Authorization");
  if (!authHeader?.startsWith("Bearer ")) {
    return null;
  }

  const token = authHeader.substring(7);
  const userId = await env.SESSIONS.get(token);
  return userId;
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const userId = await authenticate(request, env);

    if (!userId) {
      return Response.json({ error: "Unauthorized" }, { status: 401 });
    }

    // Proceed with authenticated request
    return Response.json({ userId });
  },
};

CORS Headers:

const corsHeaders = {
  "Access-Control-Allow-Origin": "*",
  "Access-Control-Allow-Methods": "GET, POST, PUT, DELETE, OPTIONS",
  "Access-Control-Allow-Headers": "Content-Type, Authorization",
};

export default {
  async fetch(request: Request): Promise<Response> {
    if (request.method === "OPTIONS") {
      return new Response(null, { headers: corsHeaders });
    }

    const response = await handleRequest(request);

    // Clone the response first: headers on a response returned by fetch()
    // are immutable, so set CORS headers on the copy
    const withCors = new Response(response.body, response);
    Object.entries(corsHeaders).forEach(([key, value]) => {
      withCors.headers.set(key, value);
    });

    return withCors;
  },
};

Static Assets (Full-Stack Apps)

Serve static files alongside your Worker code:

[assets]
directory = "./public"
binding = "ASSETS"

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const url = new URL(request.url);

    // API routes
    if (url.pathname.startsWith("/api/")) {
      return handleAPI(request, env);
    }

    // Serve static assets via the ASSETS binding
    return env.ASSETS.fetch(request);
  },
};
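For single-page applications, the assets configuration can also handle client-side routing fallback. The `not_found_handling` option below is a Wrangler static-assets setting; verify the supported values against the docs for your Wrangler version:

```toml
[assets]
directory = "./public"
binding = "ASSETS"
not_found_handling = "single-page-application"  # serve index.html for unmatched paths
```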

Testing

Using Vitest:

import { env, createExecutionContext, waitOnExecutionContext } from "cloudflare:test";
import { describe, it, expect } from "vitest";
import worker from "./index";

describe("Worker", () => {
  it("responds with JSON", async () => {
    const request = new Request("http://example.com/api/hello");
    const ctx = createExecutionContext();
    const response = await worker.fetch(request, env, ctx);
    await waitOnExecutionContext(ctx); // let ctx.waitUntil() tasks finish

    expect(response.status).toBe(200);
    expect(await response.json()).toEqual({ message: "Hello from Workers!" });
  });
});

Framework Integration

Workers supports major frameworks with adapters:

  • Next.js - Full App Router and Pages Router support
  • Remix / React Router - Native Cloudflare adapter
  • Astro - Server-side rendering on Workers
  • SvelteKit - Cloudflare adapter available
  • Hono - Lightweight web framework built for Workers
  • tRPC - Type-safe APIs with full Workers support

Example with Hono:

import { Hono } from "hono";

const app = new Hono<{ Bindings: Env }>();

app.get("/", (c) => c.text("Hello!"));
app.get("/api/users/:id", async (c) => {
  const id = c.req.param("id");
  const user = await c.env.DB.prepare(
    "SELECT * FROM users WHERE id = ?"
  ).bind(id).first();
  return c.json(user);
});

export default app;

Advanced Topics

For detailed information on advanced features, see the reference files:

  • Complete Bindings Guide: references/bindings-complete-guide.md - All binding types (D1, KV, R2, Durable Objects, Queues, Workers AI, Vectorize, Workflows, Containers, Secrets Store, Pipelines, AutoRAG)
  • Deployment & CI/CD: references/wrangler-and-deployment.md - Wrangler v4 migration, commands, GitHub Actions, GitLab CI/CD, gradual rollouts, remote bindings
  • Development Best Practices: references/development-patterns.md - Testing, debugging, error handling, performance, top-level env access patterns
  • Advanced Features: references/advanced-features.md - Containers, Workflows, MCP servers, Workers for Platforms, WebSockets, Node.js compat, streaming
  • Observability: references/observability.md - Logging (tail, Logpush, Workers Logs), metrics, traces, debugging


Source

git clone https://github.com/tenequm/claude-plugins
# Skill file: cloudflare-workers/SKILL.md
