Want to make your Worker 10x faster?
The Cache API is your friend. Here's how to use it in 5 lines:
The Code
export default {
  async fetch(request: Request): Promise<Response> {
    const cache = caches.default;

    // Try to get from cache first
    let response = await cache.match(request);

    if (!response) {
      // Cache miss - fetch from origin
      response = await fetch(request);

      // Cache for 1 hour
      response = new Response(response.body, response);
      response.headers.set('Cache-Control', 'max-age=3600');

      // Store in cache
      await cache.put(request, response.clone());
    }

    return response;
  }
};
That's it. 5 lines (the important ones).
What This Does
- Checks if response is in cache
- If yes → returns cached version (super fast)
- If no → fetches from origin
- Stores in cache for next time (a non-blocking variant is sketched right after this list)
- Returns response
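Bonus: in the snippet above, await cache.put() holds the response until the cache write finishes. Workers can hand that write to ctx.waitUntil() so the response goes out immediately and the write completes in the background. A minimal sketch of that variant (the ctx parameter is part of the standard module handler signature, it just wasn't needed in the snippet above):

export default {
  async fetch(request: Request, env: unknown, ctx: ExecutionContext): Promise<Response> {
    const cache = caches.default;

    let response = await cache.match(request);
    if (!response) {
      response = await fetch(request);
      response = new Response(response.body, response);
      response.headers.set('Cache-Control', 'max-age=3600');

      // Queue the cache write; the response isn't delayed while it completes
      ctx.waitUntil(cache.put(request, response.clone()));
    }
    return response;
  }
};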
The Impact
Before caching:
- Every request hits your origin
- Response time: 200-500ms
- High CPU usage
After caching:
- Cached requests return in <10ms
- 70-90% cache hit rate is common
- Minimal CPU usage
Real Numbers
On my API:
- P50 latency: 450ms → 15ms
- P95 latency: 800ms → 25ms
- Cache hit rate: 85%
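Those numbers come from my own logging. If you want to measure your hit rate, one simple approach is to tag each response with a custom HIT/MISS header (my own naming here, not anything built into Workers) and count the values in your logs or analytics:

export default {
  async fetch(request: Request): Promise<Response> {
    const cache = caches.default;

    const cached = await cache.match(request);
    if (cached) {
      // Rebuild the response so headers are mutable, then mark it as a hit
      const hit = new Response(cached.body, cached);
      hit.headers.set('X-Cache-Status', 'HIT');
      return hit;
    }

    let response = await fetch(request);
    response = new Response(response.body, response);
    response.headers.set('Cache-Control', 'max-age=3600');
    response.headers.set('X-Cache-Status', 'MISS');
    await cache.put(request, response.clone());
    return response;
  }
};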
Pro Tips
- Only cache GET requests
- Set appropriate TTL (don't cache forever)
- Use cache keys for different variants
- Invalidate cache when data changes (all four tips are sketched in code below)
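Here's roughly what those tips look like in code. The stripped query param, the example.com origin, and the invalidate() helper are hypothetical stand-ins; swap in whatever matches your routes:

export default {
  async fetch(request: Request, env: unknown, ctx: ExecutionContext): Promise<Response> {
    const cache = caches.default;

    // Only cache GET requests (cache.put() rejects other methods anyway)
    if (request.method !== 'GET') {
      return fetch(request);
    }

    // Build a cache key that ignores query params that don't change the response
    const url = new URL(request.url);
    url.searchParams.delete('utm_source'); // example param to strip; pick your own
    const cacheKey = new Request(url.toString(), request);

    let response = await cache.match(cacheKey);
    if (!response) {
      response = await fetch(request);
      response = new Response(response.body, response);
      // A bounded TTL, not "cache forever"
      response.headers.set('Cache-Control', 'max-age=3600');
      ctx.waitUntil(cache.put(cacheKey, response.clone()));
    }
    return response;
  }
};

// Invalidate when the underlying data changes (hypothetical helper; call it from
// your write path). Note that Cache API deletes only apply in the data center where
// the Worker runs, so treat this as best-effort on top of a sane TTL.
async function invalidate(path: string): Promise<void> {
  const cache = caches.default;
  await cache.delete(new Request(`https://example.com${path}`)); // must match the cache key URL
}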
The Cache API is built into Workers. No setup. No cost. Just speed.
Want more performance patterns? I've got caching strategies, database optimization, and more in my complete guide: https://appybot.gumroad.com/l/oatoe