Running an API usually means paying for servers. But Cloudflare Workers lets you deploy production-ready APIs on a generous free tier: 100,000 requests per day, zero cold starts, and global edge deployment. Here's how to build one from scratch.
## What You Get for Free

The Cloudflare Workers free tier includes:

- 100,000 requests/day
- 10 ms of CPU time per request (enough for most APIs)
- KV storage: 100,000 reads/day, 1,000 writes/day
- Global deployment across 300+ data centers
- Custom domains via Cloudflare DNS

For personal projects, internal tools, and moderate-traffic APIs, this is more than enough.
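To put the request quota in perspective, a quick back-of-envelope calculation:

```javascript
// Back-of-envelope: sustained request rate that fits in 100,000 requests/day
const perDay = 100_000;
const perSecond = perDay / (24 * 60 * 60);
console.log(perSecond.toFixed(2)); // roughly 1.16 requests/second, sustained
```

Traffic is rarely that even in practice, but it gives a sense of the headroom.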
## Setting Up

### Install Wrangler

Wrangler is Cloudflare's CLI for Workers development:

```shell
npm install -g wrangler
wrangler login
```

### Create a New Project

```shell
wrangler init my-api
cd my-api
```

This generates a basic project with `wrangler.toml` and `src/index.js`.
### Configure wrangler.toml

```toml
name = "my-api"
main = "src/index.js"
compatibility_date = "2026-03-01"

[vars]
API_VERSION = "1.0.0"

[[kv_namespaces]]
binding = "CACHE"
id = "your-kv-namespace-id"
```

Create the KV namespace:

```shell
wrangler kv namespace create CACHE
# Copy the output id into wrangler.toml
```
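If you also want `wrangler dev` to use a dedicated preview namespace, you can add a `preview_id` to the same binding (the ids below are placeholders; a preview namespace can be created with `wrangler kv namespace create CACHE --preview`):

```toml
[[kv_namespaces]]
binding = "CACHE"
id = "your-kv-namespace-id"
preview_id = "your-preview-namespace-id"  # used by `wrangler dev`
```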
## Building the Router

Workers receive a `fetch` event. Here's a clean routing pattern with no external dependencies:

```javascript
// src/index.js
export default {
  async fetch(request, env, ctx) {
    const url = new URL(request.url);
    const path = url.pathname;
    const method = request.method;

    // Handle CORS preflight requests
    if (method === 'OPTIONS') {
      return handleCors();
    }

    try {
      // Route matching
      if (path === '/api/health' && method === 'GET') {
        return json({ status: 'ok', version: env.API_VERSION });
      }
      if (path === '/api/data' && method === 'GET') {
        return await handleGetData(url, env);
      }
      if (path === '/api/data' && method === 'POST') {
        return await handlePostData(request, env);
      }
      return json({ error: 'Not found' }, 404);
    } catch (err) {
      return json({ error: 'Internal server error' }, 500);
    }
  },
};
```
```javascript
function json(data, status = 200) {
  return new Response(JSON.stringify(data), {
    status,
    headers: {
      'Content-Type': 'application/json',
      'Access-Control-Allow-Origin': '*',
    },
  });
}

function handleCors() {
  return new Response(null, {
    headers: {
      'Access-Control-Allow-Origin': '*',
      'Access-Control-Allow-Methods': 'GET, POST, OPTIONS',
      'Access-Control-Allow-Headers': 'Content-Type, X-API-Key',
      'Access-Control-Max-Age': '86400',
    },
  });
}
```
## Using KV Store for Caching

Workers KV is a globally replicated key-value store, well suited to caching API responses:

```javascript
async function handleGetData(url, env) {
  const query = url.searchParams.get('q');
  if (!query) {
    return json({ error: 'Missing query parameter: q' }, 400);
  }

  // Check cache first
  const cacheKey = `data:${query}`;
  const cached = await env.CACHE.get(cacheKey, { type: 'json' });
  if (cached) {
    return json({ ...cached, cached: true });
  }

  // Generate or fetch data
  const result = await fetchExternalData(query);

  // Cache for 1 hour (3600 seconds)
  await env.CACHE.put(cacheKey, JSON.stringify(result), {
    expirationTtl: 3600,
  });

  return json({ ...result, cached: false });
}
```
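One easy win with this pattern is normalizing the query before building the cache key, so that trivially different spellings share a single KV entry. A small illustrative helper (`cacheKeyFor` is a sketch, not part of the code above):

```javascript
// Normalize a query so variants like "Foo " and "foo" hit the same KV entry
function cacheKeyFor(query) {
  return `data:${query.trim().toLowerCase()}`;
}

cacheKeyFor(' Foo '); // 'data:foo'
```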
```javascript
async function handlePostData(request, env) {
  const body = await request.json();

  // Check for undefined explicitly so falsy values like 0 are still accepted
  if (!body.name || body.value === undefined) {
    return json({ error: 'Missing required fields: name, value' }, 400);
  }

  const id = crypto.randomUUID();
  const record = { id, ...body, createdAt: new Date().toISOString() };
  await env.CACHE.put(`record:${id}`, JSON.stringify(record));

  return json(record, 201);
}
```
## Implementing Rate Limiting

Protect your API from abuse with a sliding-window counter stored in KV. Keep in mind that KV is eventually consistent, so this is best-effort limiting; for strict guarantees, use Durable Objects.

```javascript
async function rateLimit(request, env, limit = 60, window = 60) {
  // CF-Connecting-IP is set by Cloudflare; fall back for local dev
  const ip = request.headers.get('CF-Connecting-IP') || 'unknown';
  const key = `ratelimit:${ip}`;
  const current = await env.CACHE.get(key, { type: 'json' });
  const now = Math.floor(Date.now() / 1000);

  // Keep only hits inside the current window
  const windowStart = now - window;
  const hits = (current?.timestamps || []).filter((t) => t > windowStart);

  if (hits.length >= limit) {
    return {
      limited: true,
      response: json({
        error: 'Rate limit exceeded',
        retryAfter: window - (now - hits[0]),
      }, 429),
    };
  }

  hits.push(now);
  await env.CACHE.put(key, JSON.stringify({ timestamps: hits }), {
    expirationTtl: window * 2,
  });
  return { limited: false };
}
```
Integrate it into your main handler:

```javascript
// Inside the fetch handler, before routing:
const { limited, response } = await rateLimit(request, env);
if (limited) return response;
```
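The window arithmetic is the fiddly part, so it can help to see it as a pure function with no KV involved (an illustrative sketch; `checkWindow` does not appear in the Worker code):

```javascript
// Sliding-window decision in isolation; timestamps are in seconds
function checkWindow(timestamps, now, limit, window) {
  const windowStart = now - window;
  const hits = timestamps.filter((t) => t > windowStart); // drop expired hits
  if (hits.length >= limit) {
    // The oldest surviving hit leaves the window at hits[0] + window
    return { allowed: false, retryAfter: window - (now - hits[0]) };
  }
  return { allowed: true, hits: [...hits, now] };
}

checkWindow([100, 130], 150, 2, 60); // { allowed: false, retryAfter: 10 }
```

At `now = 150` with a 60-second window, both hits are still inside the window, so the request is refused until the hit at `t = 100` expires at `t = 160`.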
## Adding API Key Authentication

For APIs that need access control, implement API key validation:

```javascript
function authenticate(request, env) {
  const apiKey = request.headers.get('X-API-Key');
  if (!apiKey) {
    return { authenticated: false, error: 'Missing X-API-Key header' };
  }

  // Valid keys are stored as a comma-separated secret
  const validKeys = (env.API_KEYS || '').split(',');
  if (!validKeys.includes(apiKey)) {
    return { authenticated: false, error: 'Invalid API key' };
  }

  return { authenticated: true };
}
```
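`validKeys.includes(apiKey)` uses ordinary string equality, which in principle leaks timing information about how much of a key matched. For most APIs the risk is negligible, but if you want a constant-time check, here is a sketch (this helper is illustrative, not a built-in Workers API):

```javascript
// Compare two strings in time that depends only on their lengths, not on
// where the first mismatching character occurs
function constantTimeEqual(a, b) {
  if (a.length !== b.length) return false;
  let diff = 0;
  for (let i = 0; i < a.length; i++) {
    diff |= a.charCodeAt(i) ^ b.charCodeAt(i);
  }
  return diff === 0;
}
```

You would then test the supplied key against each valid key with this helper instead of `includes`.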
Set your API keys as secrets (never in `wrangler.toml`):

```shell
wrangler secret put API_KEYS
# Enter: key1,key2,key3
```

Apply authentication to protected routes:

```javascript
if (path.startsWith('/api/admin')) {
  const auth = authenticate(request, env);
  if (!auth.authenticated) {
    return json({ error: auth.error }, 401);
  }
}
```
## Local Development

Wrangler provides a local development server that simulates the Workers environment:

```shell
# Start local dev server
wrangler dev

# Test your endpoints
curl http://localhost:8787/api/health
curl -X POST http://localhost:8787/api/data \
  -H "Content-Type: application/json" \
  -d '{"name": "test", "value": 42}'
```
## Deploying

One command to deploy globally:

```shell
wrangler deploy
```

Your API is now live at `https://my-api.<your-subdomain>.workers.dev`.

### Custom Domain Setup

If your domain is on Cloudflare DNS, add a route in `wrangler.toml`:

```toml
routes = [
  { pattern = "api.yourdomain.com/*", zone_name = "yourdomain.com" }
]
```
## Staying Within Free Tier Limits

Here are practical strategies to maximize the free tier:

| Strategy | Impact |
|---|---|
| Cache aggressively with KV | Reduces external API calls |
| Use `Cache-Control` headers | Lets the Cloudflare CDN handle repeat requests |
| Rate limit per IP | Prevents one user from burning your quota |
| Return early for invalid requests | Saves CPU time on bad input |
| Use `ctx.waitUntil()` for non-blocking writes | Keeps response times fast |

```javascript
// ctx.waitUntil example: log without blocking the response
ctx.waitUntil(
  env.CACHE.put(`log:${Date.now()}`, JSON.stringify({
    path: url.pathname,
    ip: request.headers.get('CF-Connecting-IP'),
    timestamp: new Date().toISOString(),
  }))
);
return json({ data: result });
```
## Production Checklist

Before going live, verify:

- [ ] CORS headers configured for your frontend domain
- [ ] Rate limiting enabled
- [ ] Input validation on all POST/PUT endpoints
- [ ] Secrets stored via `wrangler secret`, not in code
- [ ] Error responses don't leak internal details
- [ ] Cache TTLs set appropriately
- [ ] Monitoring set up in the Cloudflare dashboard
## Wrapping Up
Cloudflare Workers is a powerful platform for building APIs with zero infrastructure cost. The combination of edge deployment, KV storage, and a generous free tier makes it ideal for developer tools, webhooks, and lightweight services.
If you want to see a real-world example of this approach, check out OG API - an open-source API built on Cloudflare Workers that generates Open Graph images dynamically, using the exact patterns covered in this guide.
What are you building with Cloudflare Workers? Share your projects in the comments!