Caching: The Secret Weapon Behind Fast, Scalable Systems
What Is Caching?
Caching means storing frequently accessed or expensive-to-compute data in a faster medium (in-memory, browser, CDN, Redis) so future requests are served instantly.
How it works:
- Check the cache → if the data is found → Cache Hit → return instantly
- If not found → Cache Miss → fetch from DB/API → store in cache → return
Caching = trading RAM for speed.
Where Is Data Cached? (Cache Layers)
1. Application Cache (In your code)
- Memoization
- Function-level caching
- Example: caching expensive calculations (see the memoization sketch after this list)
2. Server-Side / Distributed Cache
- Redis, Memcached
- For sharing cache across multiple backend instances
3. Database Cache
- Query caching
- Materialized views
- Example: PostgreSQL serving repeated reads from its shared-buffer cache instead of hitting disk
4. Browser Cache
- Uses HTTP headers
- localStorage, sessionStorage, IndexedDB
5. CDN Cache (Global Edge Servers)
- Cloudflare, CloudFront
- Caches images, CSS, JS, static HTML at edge
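To make the application-cache layer above concrete, here is a minimal memoization sketch in plain JavaScript (the memoize helper and slowSquare function are illustrative names, not from any library):
// Minimal in-memory memoization: cache results of an expensive, pure function.
// The cache lives in process memory and disappears on restart.
function memoize(fn) {
  const cache = new Map();
  return (arg) => {
    if (cache.has(arg)) return cache.get(arg); // cache hit
    const result = fn(arg); // cache miss: compute once
    cache.set(arg, result); // store for next time
    return result;
  };
}

// Example: an expensive calculation we only want to run once per input.
const slowSquare = (n) => {
  for (let i = 0; i < 1e8; i++); // simulate heavy work
  return n * n;
};

const fastSquare = memoize(slowSquare);
fastSquare(12); // computed
fastSquare(12); // served from the in-memory cache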
Why Do We Use Caching? (The Real Benefits)
Performance Boost
- Database query: ~100–300 ms
- Redis fetch: ~0.5–2 ms
- CDN fetch: ~20–50 ms
- Browser cache: ~0–1 ms
Scalability
Caching can reduce database load by 80–90% in read-heavy systems.
Cost Savings
Fewer DB reads, less compute, fewer servers.
Reliability & Fallback
If the DB is slow or temporarily unavailable, the cache can still serve data.
Better User Experience
Faster pages → lower bounce rates → higher conversion.
Real-World Examples
Twitter (X)
Trending topics are recalculated every few seconds and stored in Redis, so millions of users see them instantly.
Profile and feed metadata are cached, which reduces reads on the sharded DB.
YouTube
Video metadata is cached at the edge, so videos load almost instantly worldwide.
Caching Patterns (VERY Important for Interviews + Real Systems)
1. Cache-Aside (Lazy Loading)
The most common pattern in backend systems (Redis + Node.js); the Express example later in this post implements it.
Flow:
Check cache → if miss → fetch from DB → store → return.
2. Read-Through Cache
The application always queries the cache.
The cache itself fetches from the DB on a miss.
3. Write-Through Cache
Write to the cache → the cache writes to the DB.
(Data is always fresh, but writes are slower; see the sketch after this list.)
4. Write-Back (Write-Behind)
Write to the cache → return immediately → the cache writes to the DB later.
(Fast writes, but riskier: data can be lost if the cache fails before it flushes to the DB.)
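To illustrate write-through, here is a minimal sketch assuming ioredis and a Mongoose-style User model (the same names the Node.js example later in this post uses); since Redis does not write to your database by itself, the application performs both writes synchronously, which is how write-through is usually approximated with Redis:
import Redis from "ioredis";
import User from "./models/User.js"; // assumed Mongoose-style model

const redis = new Redis();

// Write-through: update the DB and the cache in the same operation,
// so the next read always finds fresh data in the cache.
async function updateUserWriteThrough(id, changes) {
  const user = await User.findByIdAndUpdate(id, changes, { new: true }); // write to DB
  await redis.set(`user:${id}`, JSON.stringify(user), "EX", 300); // write to cache (5 min TTL)
  return user;
}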
Cache Invalidation (One of the hardest problems)
This decides whether cache stays fresh or stale.
Invalidation Techniques:
- TTL (time-to-live): auto-expire entries after X seconds.
- Manual invalidation: redis.del('user:123')
- Event-based: on a DB update → publish an event → invalidate the distributed caches (see the sketch after this list).
- Versioning: add version numbers to keys, e.g. posts:v2:latest
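Here is a minimal sketch of event-based invalidation using Redis pub/sub with ioredis; the channel name, key names, and the per-instance localCache map are assumptions for illustration:
import Redis from "ioredis";

const redis = new Redis(); // normal connection for commands
const subscriber = new Redis(); // a subscribed connection cannot issue other commands

const localCache = new Map(); // hypothetical per-instance in-memory cache

// Every backend instance listens for invalidation events.
subscriber.subscribe("cache-invalidation");
subscriber.on("message", (channel, key) => {
  if (channel === "cache-invalidation") {
    localCache.delete(key); // drop this instance's stale copy
  }
});

// Call this right after a successful DB update.
async function invalidate(key) {
  await redis.del(key); // remove the shared Redis entry
  await redis.publish("cache-invalidation", key); // tell every instance to drop its local copy
}

// Usage: await invalidate("user:123");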
Common Issues & Solutions
1. Stale Data
Fix using TTL or events.
2. Cache Stampede / Thundering Herd
Millions of requests hit the same "cold" key at once, creating a sudden burst of load on the DB.
Fix with:
- Locking (Redis SETNX; see the sketch after this section)
- Stale-while-revalidate
- Request coalescing
3. Memory Pressure
Fix via eviction policies:
- LRU (least recently used)
- LFU (least frequently used)
- FIFO (first in, first out)
4. Cold Start
Warm the cache with popular keys on boot.
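Here is a minimal sketch of the locking fix for a stampede, using Redis's SET with the NX flag as a short-lived lock via ioredis (the getWithLock helper, lock TTL, and retry delay are illustrative):
import Redis from "ioredis";

const redis = new Redis();

// Only one caller rebuilds a cold key; the others wait briefly and re-check the cache.
async function getWithLock(key, ttlSeconds, loadFromDb) {
  const cached = await redis.get(key);
  if (cached) return JSON.parse(cached); // normal cache hit

  // Try to take a short-lived lock; "NX" = only set if the lock does not exist yet.
  const gotLock = await redis.set(`lock:${key}`, "1", "EX", 10, "NX");
  if (gotLock) {
    const fresh = await loadFromDb(); // a single DB hit rebuilds the key
    await redis.set(key, JSON.stringify(fresh), "EX", ttlSeconds);
    await redis.del(`lock:${key}`);
    return fresh;
  }

  // Someone else holds the lock: back off briefly, then retry (simplified, no retry limit).
  await new Promise((resolve) => setTimeout(resolve, 100));
  return getWithLock(key, ttlSeconds, loadFromDb);
}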
How to Use Redis Caching in a Backend (Node.js / Express Example)
This is the simplest and most widely used pattern:
import express from "express";
import Redis from "ioredis";
import User from "./models/User.js";
const app = express();
const redis = new Redis();
app.get("/user/:id", async (req, res) => {
  const id = req.params.id;
  const key = `user:${id}`;

  // Check cache
  const cached = await redis.get(key);
  if (cached) {
    return res.json(JSON.parse(cached));
  }

  // Fetch DB
  const user = await User.findById(id);
  if (!user) return res.status(404).json({ message: "User not found" });

  // Save to cache
  await redis.set(key, JSON.stringify(user), "EX", 300); // 5 min TTL
  return res.json(user);
});

app.put("/user/:id", async (req, res) => {
  const id = req.params.id;

  // Update DB logic here...

  // Invalidate cache
  await redis.del(`user:${id}`);
  res.json({ message: "Updated + Cache invalidated" });
});

app.listen(3000);
Frontend Caching (React / Next.js)
Use React Query / SWR for app-level caching.
Example with React Query:
import { useQuery } from "@tanstack/react-query";
import axios from "axios";

const { data } = useQuery({
  queryKey: ["user", id],
  queryFn: () => axios.get(`/api/user/${id}`).then((res) => res.data),
  staleTime: 5 * 60 * 1000, // 5 min fresh
});
Browser-level caching via HTTP headers:
- Cache-Control: public, max-age=3600
- ETag
- Last-Modified
Client-side storage and caching options (a small localStorage sketch follows this list):
- localStorage (persistent)
- sessionStorage (session only)
- IndexedDB (large data)
- Service Workers (PWA offline caching)
- Next.js built-in image caching
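As a small sketch of browser storage caching, the hypothetical setCached/getCached helpers below use only the standard localStorage API and attach an expiry timestamp so stale entries count as misses:
// Cache a value in localStorage with a TTL.
function setCached(key, value, ttlMs) {
  localStorage.setItem(key, JSON.stringify({ value, expiresAt: Date.now() + ttlMs }));
}

// Return the cached value, or null if it is missing or expired.
function getCached(key) {
  const raw = localStorage.getItem(key);
  if (!raw) return null; // miss
  const { value, expiresAt } = JSON.parse(raw);
  if (Date.now() > expiresAt) {
    localStorage.removeItem(key); // expired: clean up and treat as a miss
    return null;
  }
  return value; // hit
}

// Usage:
// setCached("user:123", userData, 5 * 60 * 1000); // keep for 5 minutes
// const user = getCached("user:123");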
CDN Caching
Used for:
- Static assets (JS, CSS, images)
- API responses (if allowed)
- Edge caching for SSR (Next.js + Cloudflare)
Cache-Control example for static assets:
Cache-Control: public, max-age=31536000, immutable
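In an Express backend, for instance, this kind of header can be emitted for fingerprinted assets through express.static (a minimal sketch; the /static mount path and public folder are illustrative):
import express from "express";

const app = express();

// Serve fingerprinted assets (e.g. app.3f9a1c.js) with a one-year, immutable cache,
// so browsers and CDNs never re-request them until the filename changes.
app.use("/static", express.static("public", { maxAge: "1y", immutable: true }));

app.listen(3000);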
When Should You Use Caching?
Use caching when:
✔️ The same data is requested repeatedly
✔️ The DB/API becomes a bottleneck
✔️ Read-heavy systems
✔️ Expensive computations
✔️ Static/semi-static data
✔️ Large traffic spikes
Avoid caching:
❌ Highly volatile data (stock prices, payments)
❌ Data requiring strong consistency
❌ User-sensitive content on shared caches
If you're following along with this Architecture Series and happened to miss the earlier parts, no worries, you can dive into them anytime. Each part builds your foundation step by step. Here are the previous topics we covered:
1️⃣ Pagination – Architecture Series: Part 1
2️⃣ Indexing – Architecture Series: Part 2
3️⃣ Virtualization – Architecture Series: Part 3