Cloudflare Workers let you deploy serverless APIs globally in seconds. No Docker, no Kubernetes, no cold starts. Here is how to go from zero to a production API in under an hour.
## Why Cloudflare Workers?
- 0ms cold starts — V8 isolates, not containers
- Global by default — runs in 300+ edge locations
- Free tier — 100K requests/day
- Simple deployment — `wrangler deploy` and done
## Setup (5 Minutes)
```bash
# Install Wrangler CLI
npm install -g wrangler

# Log in to Cloudflare
wrangler login

# Create a new project
wrangler init my-api
cd my-api
```
## Build a REST API (20 Minutes)
```javascript
// src/index.js
export default {
  async fetch(request, env) {
    const url = new URL(request.url);
    const path = url.pathname;

    // CORS headers
    const corsHeaders = {
      'Access-Control-Allow-Origin': '*',
      'Content-Type': 'application/json'
    };

    if (request.method === 'OPTIONS') {
      return new Response(null, { headers: corsHeaders });
    }

    // Routes
    if (path === '/api/health') {
      return Response.json({ status: 'ok', region: request.cf?.colo }, { headers: corsHeaders });
    }

    if (path === '/api/data') {
      const data = await env.MY_KV.get('cached-data', 'json');
      return Response.json({ data }, { headers: corsHeaders });
    }

    if (path === '/api/process' && request.method === 'POST') {
      const body = await request.json();
      const result = processData(body);
      return Response.json(result, { headers: corsHeaders });
    }

    return Response.json({ error: 'Not found' }, { status: 404, headers: corsHeaders });
  }
};

function processData(input) {
  return {
    processed: true,
    timestamp: new Date().toISOString(),
    input_size: JSON.stringify(input).length
  };
}
```
## Add KV Storage (10 Minutes)
```bash
# Create a KV namespace
wrangler kv:namespace create MY_KV
```

Add the namespace binding to `wrangler.toml`:

```toml
[[kv_namespaces]]
binding = "MY_KV"
id = "your-namespace-id"
```

```javascript
// Write and read from KV
await env.MY_KV.put('key', JSON.stringify({ value: 42 }));
const data = await env.MY_KV.get('key', 'json');
```
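A common pattern on top of these two calls is cache-aside: check KV first, compute the value on a miss, and store it with a TTL. A minimal sketch; the `getCached` helper and its `fetcher` callback are illustrative names, not part of the Workers API:

```javascript
// Cache-aside helper (illustrative): read from KV, fall back to a fetcher on a
// miss, and cache the fresh value with a TTL. `kv` is any object exposing the
// Workers KV get/put interface, e.g. env.MY_KV.
async function getCached(kv, key, fetcher, ttlSeconds = 300) {
  const cached = await kv.get(key, 'json');
  if (cached !== null) return cached;  // cache hit
  const fresh = await fetcher();       // cache miss: compute the value
  await kv.put(key, JSON.stringify(fresh), { expirationTtl: ttlSeconds });
  return fresh;
}
```

Inside the `/api/data` route this would look like `await getCached(env.MY_KV, 'cached-data', fetchUpstream)`, where `fetchUpstream` is whatever produces the data on a miss.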
## Deploy (2 Minutes)
```bash
# Deploy to production
wrangler deploy

# Your API is live at: https://my-api.your-subdomain.workers.dev
```
## Add a Custom Domain (5 Minutes)
In `wrangler.toml`:

```toml
routes = [
  { pattern = "api.yourdomain.com/*", zone_name = "yourdomain.com" }
]
```

Then redeploy:

```bash
wrangler deploy
```
## Production Patterns
### Rate Limiting
```javascript
async function rateLimit(request, env) {
  const ip = request.headers.get('CF-Connecting-IP');
  const key = `rate:${ip}`;
  const count = parseInt(await env.MY_KV.get(key) || '0');

  if (count > 100) {
    return Response.json({ error: 'Rate limited' }, { status: 429 });
  }

  // Note: KV is eventually consistent, so this counter is approximate.
  // Fine for coarse limits; use Durable Objects if you need exact counting.
  await env.MY_KV.put(key, String(count + 1), { expirationTtl: 3600 });
  return null; // Continue processing
}
```
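To wire the guard into the handler, call it before routing and return early on a non-null result. A standalone sketch (with `rateLimit` stubbed so it runs on its own; the KV-backed version above replaces the stub):

```javascript
// Stub so this sketch is self-contained; the real KV-backed rateLimit
// returns a 429 Response when the caller is over the limit.
async function rateLimit(request, env) {
  return null;
}

// In a Worker this object would be the module's `export default`.
const worker = {
  async fetch(request, env) {
    const limited = await rateLimit(request, env);
    if (limited) return limited;         // rate-limited: return the 429 immediately
    return Response.json({ ok: true });  // otherwise fall through to normal routing
  }
};
```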
### Error Handling
```javascript
try {
  const result = await processRequest(request);
  return Response.json(result);
} catch (err) {
  return Response.json(
    { error: 'Internal error', message: err.message },
    { status: 500 }
  );
}
```
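Rather than repeating the try/catch in every route, it can be factored into a wrapper. A minimal sketch; the `withErrorHandling` name is ours, not a Workers API:

```javascript
// Wraps a (request, env) => Response handler so any thrown error becomes a
// clean 500 JSON response instead of crashing the request.
function withErrorHandling(handler) {
  return async (request, env) => {
    try {
      return await handler(request, env);
    } catch (err) {
      return Response.json(
        { error: 'Internal error', message: err.message },
        { status: 500 }
      );
    }
  };
}
```

Route handlers can then throw freely and still produce a well-formed error response.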
## Real Example: AI Spend Tracker
We built the AI Spend API entirely on Cloudflare Workers. It handles thousands of requests daily with sub-50ms latency globally.
```bash
# Try it
curl "https://api.lazy-mac.com/ai-spend/calculate?model=gpt-4&input_tokens=1000&output_tokens=500"
```
The entire stack: Workers + KV + D1 (SQLite). No servers to manage.
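D1 exposes a prepared-statement API (`prepare`/`bind`/`run`/`first`) on a binding from `env`, much like KV. A hedged sketch of what the D1 side of a spend tracker might look like; the `DB` binding name and `spend` table are illustrative, not the actual AI Spend schema:

```javascript
// Illustrative D1 usage: insert a spend record, then read back a per-model
// total. Assumes a `DB` binding in wrangler.toml and a `spend(model, cost)`
// table — both hypothetical here.
async function logSpend(env, model, cost) {
  await env.DB
    .prepare('INSERT INTO spend (model, cost) VALUES (?1, ?2)')
    .bind(model, cost)
    .run();

  const row = await env.DB
    .prepare('SELECT SUM(cost) AS total FROM spend WHERE model = ?1')
    .bind(model)
    .first();
  return row.total;
}
```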