# Caching Strategies for APIs: TTL, Stale-While-Revalidate, and Cache Invalidation
There are two hard problems in computer science: cache invalidation and naming things. Here's how to solve the first one.
## Why Cache?
- Database queries are slow (1-50ms each)
- Some data doesn't change often (product catalog, user profiles)
- Traffic spikes shouldn't hit your DB directly
A cache hit serves the same response in <1ms that otherwise costs 20ms. At scale, that's the difference between your system staying up and falling over.
## Strategy 1: Simple TTL
Cache the result for N seconds, then expire:
```typescript
async function getProduct(productId: string): Promise<Product | null> {
  const cacheKey = `product:${productId}`;

  // Check cache first
  const cached = await redis.get(cacheKey);
  if (cached) return JSON.parse(cached);

  // Cache miss: fetch from DB
  const product = await db.products.findUnique({ where: { id: productId } });

  // Only cache real hits; caching null would hide newly created products
  // until the TTL expires
  if (product) {
    // Store with 5-minute TTL
    await redis.setex(cacheKey, 300, JSON.stringify(product));
  }
  return product;
}
```
Good for: data that can tolerate being slightly stale (product descriptions, user preferences)
Bad for: real-time data (stock prices, unread counts)
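One refinement worth knowing about: when many keys are written at the same moment (say, after a deploy that flushes the cache), a fixed TTL makes them all expire at the same moment too, and the resulting burst of misses hits the database at once. Adding random jitter to the TTL spreads the expirations out. A minimal sketch, with a hypothetical `jitteredTtl` helper (not part of the code above):

```typescript
// Hypothetical helper: spread a base TTL (in seconds) by +/- `spread`
// so cache keys written together do not all expire simultaneously.
function jitteredTtl(baseSeconds: number, spread = 0.1): number {
  const delta = baseSeconds * spread;
  // Uniformly distributed in [base - delta, base + delta]
  return Math.round(baseSeconds - delta + Math.random() * 2 * delta);
}

// Usage: await redis.setex(cacheKey, jitteredTtl(300), JSON.stringify(product));
```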
## Strategy 2: Stale-While-Revalidate
Serve stale data immediately, refresh in background:
```typescript
async function getWithSWR<T>(
  key: string,
  fetchFn: () => Promise<T>,
  ttlSeconds: number,
  staleWindowSeconds: number
): Promise<T> {
  const cached = await redis.get(key);
  const metadata = await redis.get(`${key}:meta`);

  if (cached) {
    const { cachedAt } = JSON.parse(metadata || '{}');
    const ageSeconds = (Date.now() - cachedAt) / 1000;

    if (ageSeconds > ttlSeconds) {
      // Stale but still inside the stale window (Redis evicts anything
      // older): serve it, refresh in background (don't await)
      refreshCache(key, fetchFn, ttlSeconds, staleWindowSeconds).catch(console.error);
    }
    return JSON.parse(cached);
  }

  // No cache: fetch and store
  return refreshCache(key, fetchFn, ttlSeconds, staleWindowSeconds);
}

async function refreshCache<T>(
  key: string,
  fetchFn: () => Promise<T>,
  ttlSeconds: number,
  staleWindowSeconds: number
): Promise<T> {
  const data = await fetchFn();
  // The Redis expiry covers the full stale window, so Redis itself
  // evicts entries too old to serve even as stale
  await redis.setex(key, ttlSeconds + staleWindowSeconds, JSON.stringify(data));
  await redis.setex(
    `${key}:meta`,
    ttlSeconds + staleWindowSeconds,
    JSON.stringify({ cachedAt: Date.now() })
  );
  return data;
}
```
Good for: dashboards, feeds — users get instant response, data is almost always fresh
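The freshness logic is independent of Redis. To make the three states concrete (fresh: serve; stale: serve and refresh in the background; expired or missing: fetch), here is a minimal in-memory sketch; the `MemorySWR` class and its field names are illustrative, not from the code above:

```typescript
type Entry<T> = { value: T; cachedAt: number };

// Illustrative in-memory version of the stale-while-revalidate states.
class MemorySWR<T> {
  private store = new Map<string, Entry<T>>();

  constructor(private ttlMs: number, private staleWindowMs: number) {}

  async get(key: string, fetchFn: () => Promise<T>): Promise<T> {
    const entry = this.store.get(key);
    const age = entry ? Date.now() - entry.cachedAt : Infinity;

    // Fresh: serve directly
    if (entry && age < this.ttlMs) return entry.value;

    // Stale but inside the window: serve, refresh in background
    if (entry && age < this.ttlMs + this.staleWindowMs) {
      void this.refresh(key, fetchFn);
      return entry.value;
    }

    // Expired or missing: fetch synchronously
    return this.refresh(key, fetchFn);
  }

  private async refresh(key: string, fetchFn: () => Promise<T>): Promise<T> {
    const value = await fetchFn();
    this.store.set(key, { value, cachedAt: Date.now() });
    return value;
  }
}
```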
## Strategy 3: Event-Based Invalidation
Instead of TTLs, invalidate the cache when data actually changes:
```typescript
// When a product is updated
async function updateProduct(productId: string, updates: Partial<Product>) {
  // Update the database first
  const updated = await db.products.update({
    where: { id: productId },
    data: updates,
  });

  // Then invalidate every cache key derived from this product
  await Promise.all([
    redis.del(`product:${productId}`),
    redis.del(`category:${updated.categoryId}:products`),
    redis.del('products:featured'),
  ]);

  return updated;
}
```
Good for: data that must be immediately consistent after writes
Challenge: you need to track all cache keys that depend on a piece of data
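One common answer to that tracking problem is tag-based invalidation: every cache write registers its key under one or more tags, and a data change invalidates everything carrying the relevant tag, so the write path no longer needs to enumerate dependent keys by hand. A minimal in-memory sketch (the `TaggedCache` name and API are illustrative):

```typescript
// Illustrative tag-based invalidation: each key registers under tags,
// and invalidating a tag deletes every key that carries it.
class TaggedCache {
  private values = new Map<string, string>();
  private tagToKeys = new Map<string, Set<string>>();

  set(key: string, value: string, tags: string[]): void {
    this.values.set(key, value);
    for (const tag of tags) {
      if (!this.tagToKeys.has(tag)) this.tagToKeys.set(tag, new Set());
      this.tagToKeys.get(tag)!.add(key);
    }
  }

  get(key: string): string | undefined {
    return this.values.get(key);
  }

  invalidateTag(tag: string): void {
    for (const key of this.tagToKeys.get(tag) ?? []) this.values.delete(key);
    this.tagToKeys.delete(tag);
  }
}
```

With Redis, the same idea is usually built from a Set per tag: `SADD` the key into the tag's set on write, then `SMEMBERS` plus `DEL` on invalidation.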
## HTTP Caching Headers
Don't forget browser/CDN caching — it's free performance:
```typescript
// Express response with cache headers
app.get('/api/products/:id', async (req, res) => {
  const product = await getProduct(req.params.id);
  if (!product) return res.status(404).end();

  res.set({
    // Fresh for 60s; clients/CDNs may serve stale for another 300s while revalidating
    'Cache-Control': 'public, max-age=60, stale-while-revalidate=300',
    // ETag values must be quoted; hashObject is a stand-in for your hash of choice
    'ETag': `"${hashObject(product)}"`,
    'Last-Modified': product.updatedAt.toUTCString(),
  });

  res.json(product);
});
```
## Caching and MCP APIs
When building MCP tools that proxy external APIs, caching is critical: tool calls happen frequently during AI agent workflows, and external rate limits are brutal.
The Workflow Automator MCP includes built-in Redis caching for all external API calls so your automations don't hammer rate limits or slow down agent workflows.