Getting Started with OpenClaw on Cloudflare Workers

Deploy your first AI agent in under 5 minutes. No servers, no complexity.

January 28, 2025

OpenClaw runs on Cloudflare Workers, which means you get a globally distributed AI agent without managing any infrastructure. Here's how to get started.

Why Cloudflare Workers?

Cloudflare Workers is an edge computing platform that runs your code in data centers around the world. For OpenClaw, this means your AI agent responds from the location closest to wherever you (or your users) are.

Pros

  • No server management: No VPS to maintain, no OS updates, no security patches. Cloudflare handles all of it.
  • Global edge network: Your agent runs in 300+ locations worldwide. Requests route to the nearest data center automatically.
  • Generous free tier: 100,000 requests per day free. Most personal users never pay anything.
  • Built-in DDoS protection: Cloudflare's network absorbs attacks before they reach your worker.
  • Near-zero cold starts: Workers run on V8 isolates instead of containers, so startup overhead is a few milliseconds rather than seconds. Your agent responds with no noticeable spin-up delay.
  • Simple deployment: One command (wrangler deploy) and you're live.
  • Automatic HTTPS: SSL certificates are handled automatically on any custom domain.

Cons

  • Execution time limits: Workers have a 30-second CPU time limit on paid plans (10ms on the free tier). Long-running tasks need to be broken up or use Durable Objects.
  • No persistent filesystem: You can't store files locally. Use R2 (Cloudflare's S3-compatible storage) or external services.
  • Memory limits: 128MB memory limit. Fine for most AI agent tasks, but heavy data processing may hit limits.
  • Vendor lock-in: Worker-specific APIs mean some code won't port directly to other platforms.
  • Debugging can be tricky: No traditional server logs. You'll use wrangler tail for real-time logs or Logpush for persistence.

When Cloudflare Workers is the right choice

  • You want zero infrastructure headaches
  • You're building a personal AI assistant or small team tool
  • You need global availability without complexity
  • Your workload is request-response (not long-running batch jobs)

When to consider alternatives

  • You need long-running processes (use a VPS instead)
  • You want to run local AI models (Workers can't run LLMs locally)
  • You're processing large files or need significant compute (consider a dedicated server)

Prerequisites

You'll need a few things before we begin:

  • A Cloudflare account (free tier works)
  • A domain or subdomain (free workers.dev subdomain available)
  • An API key from your preferred AI provider (Claude, OpenAI, etc.)
  • Node.js installed locally

Step 1: Clone the Repository

Start by cloning the OpenClaw worker template:

```zsh
git clone https://github.com/openclaw/openclaw-worker
cd openclaw-worker
npm install
```

Step 2: Configure Your Secrets

Add your API keys to Cloudflare:

```zsh
npx wrangler secret put ANTHROPIC_API_KEY
```

Enter your API key when prompted. You can also set up OpenRouter for multi-model routing:

```zsh
npx wrangler secret put OPENROUTER_API_KEY
```
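Inside the Worker, secrets set this way arrive on the `env` binding that Cloudflare passes to your handler at request time; they are not available via `process.env`. A minimal sketch of how code might read the key, assuming the template forwards requests to the Anthropic API (the function name is illustrative, not OpenClaw's actual API):

```typescript
// Sketch only: OpenClaw's real code may structure this differently.
interface Env {
  ANTHROPIC_API_KEY: string; // set via `wrangler secret put`
}

// Build the headers the Anthropic API expects, reading the key from
// the env binding rather than a process environment variable.
function anthropicHeaders(env: Env): Record<string, string> {
  return {
    "x-api-key": env.ANTHROPIC_API_KEY,
    "anthropic-version": "2023-06-01",
    "content-type": "application/json",
  };
}
```

In a Worker, `env` is the second argument of the `fetch` handler, so the secret is scoped to each request rather than loaded at startup.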

Step 3: Deploy

Deploy to Cloudflare with a single command:

```zsh
npx wrangler deploy
```

Your AI agent is now live at your workers.dev subdomain.
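Under the hood, what you just deployed is a module that exports a `fetch` handler, which Cloudflare invokes for every incoming request. A minimal sketch of that shape (the `/health` route and response bodies are illustrative additions, not part of the OpenClaw template):

```typescript
// Sketch of a Worker fetch handler. The real OpenClaw handler does
// much more; this only shows the entry-point contract.
const handler = {
  async fetch(request: Request): Promise<Response> {
    const url = new URL(request.url);
    if (url.pathname === "/health") {
      // A simple liveness route is handy for smoke-testing a deploy.
      return new Response(JSON.stringify({ status: "ok" }), {
        headers: { "content-type": "application/json" },
      });
    }
    // All other paths would be routed to the agent logic.
    return new Response("OpenClaw agent", { status: 200 });
  },
};

export default handler;
```

After deploying, hitting your workers.dev URL invokes this handler at the edge location nearest the caller.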

Step 4: Connect Your Domain (Optional)

If you want a custom domain instead of *.workers.dev:

  1. Add your domain to Cloudflare (if not already)
  2. Update wrangler.toml with your route
  3. Redeploy

```toml
routes = [
  { pattern = "ai.yourdomain.com/*", zone_name = "yourdomain.com" }
]
```

Note the trailing `/*` in the pattern: without it, the route matches only the bare root URL rather than every path on the subdomain.

What's Next?

Now that your agent is running, you can:

  • Add skills: Install pre-built automations from the skills directory
  • Connect messaging apps: Link Telegram, Slack, or Discord
  • Configure workflows: Set up triggers and scheduled tasks
  • Set up Durable Objects: For stateful conversations and persistent memory
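
The Durable Objects item above deserves a sketch: the idea is that each conversation gets its own object whose state survives across requests. The class below models that contract with an in-memory Map so it runs anywhere; in a real Worker you would persist to the Durable Object's `state.storage`, and all names here are illustrative, not OpenClaw's actual API.

```typescript
// Illustrative sketch, not OpenClaw's real Durable Object class.
// A production version would write to `state.storage` so history
// survives restarts; the interface is the same.
class ConversationMemory {
  private history = new Map<string, string[]>();

  // Append one message to a conversation's history.
  append(conversationId: string, message: string): void {
    const messages = this.history.get(conversationId) ?? [];
    messages.push(message);
    this.history.set(conversationId, messages);
  }

  // Return the full history for a conversation (empty if none).
  recall(conversationId: string): string[] {
    return this.history.get(conversationId) ?? [];
  }
}
```

Because each conversation maps to its own object, two users chatting with your agent never see each other's context.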

Check out the skills directory to see what your AI can do, or read our VPS deployment guide if you need more compute power than Workers can provide.
