Akash, the founder of ShopStream, was feeling good. He had broken his Monolith into Microservices and set up a smart Load Balancer to handle the traffic. On paper, the app should have been flying.
But it wasn't.
Users were complaining again. "The 'Trending Blogs' page takes forever to load!" they screamed in the reviews.
Akash looked at the metrics. The servers were fine. The network was fine. The bottleneck was the Database.
Chapter 1: The Slow Librarian (The Problem)
Every time a user opened the "Trending Blogs" page, the system did this:
- Fetch Data: It asked the MongoDB database for the top 10 blogs. (Time: 500ms)
- Calculate: The backend formatted the dates and sorted the list. (Time: 100ms)
- Response: The user finally got the page. (Total Time: 600ms)
This 600ms delay happened for every single user. If 10,000 people visited the page, the poor database had to answer the exact same question 10,000 times. It was like a librarian running to the basement to fetch the same book over and over again.
Akash realized: "Why am I calculating this every time? The trending blogs don't change every second!"
Chapter 2: The Cheat Sheet (Enter Caching)
Akash introduced a new layer to his architecture: The Cache.
He set up a Redis instance—a super-fast, in-memory storage. It acted like a "Cheat Sheet" or a sticky note on the librarian's desk.
The New Workflow:
- User A visits the page. The system checks the Cache. It's empty (Cache Miss).
- The system goes to the Database (500ms) + Calculates (100ms) = 600ms.
- Crucially, before sending the response, the system saves a copy of this final result in Redis.
The Magic:
- User B visits the page 1 second later.
- The system checks the Cache. It finds the data! (Cache Hit).
- It serves the data directly from RAM. Total Time: 20ms.
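This miss-then-populate flow is the classic cache-aside pattern. Here is a minimal sketch in Python, using a plain dict to stand in for Redis and a stub for the slow MongoDB query (all names are hypothetical, not ShopStream's actual code):

```python
import time

cache = {}  # stands in for Redis: key -> cached response

def fetch_trending_from_db():
    """Stub for the expensive MongoDB query plus formatting (~600ms in the story)."""
    time.sleep(0.01)  # simulate slow work
    return ["Blog A", "Blog B", "Blog C"]

def get_trending_blogs():
    key = "trending:blogs"
    if key in cache:                   # Cache Hit: serve straight from memory
        return cache[key]
    result = fetch_trending_from_db()  # Cache Miss: do the slow work once
    cache[key] = result                # save the final result before responding
    return result

get_trending_blogs()   # User A: miss, populates the cache
get_trending_blogs()   # User B: hit, served from the dict
```

With a real Redis client the dict operations become GET and SET calls, but the decision logic stays exactly the same.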
Akash had just reduced the load time by 96%. The database stopped sweating, and the users were happy.
Chapter 3: The Ghost of Old Data (Cache Invalidation)
Monday morning, Akash posted a breaking news blog: "50% OFF SALE!"
He refreshed the app. The new blog wasn't there.
He refreshed again. Still the old list.
The Problem: The Cache was doing its job too well. It was still serving the "Trending Blogs" list it saved yesterday. It didn't know a new blog existed.
Akash learned about Cache Invalidation. He had to tell the cache when to update. He implemented two strategies:
- The "Time to Live" (TTL): He told Redis: "Only keep this data for 24 hours."
- Day 1: Cache serves fast data.
- Day 2 (24 hrs later): Redis automatically deletes the data.
- First request after expiry: The system is forced to go back to the Database, get fresh data (including the new blog), and save it again.
- Explicit Invalidation: For critical things like "Product Price," 24 hours is too long. Akash wrote code so that whenever he updated a price in the admin panel, the system immediately nuked that specific key from the cache.
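Both strategies can be sketched with a dict standing in for Redis (in a real Redis deployment, TTL is the EXPIRE/SETEX mechanism and explicit invalidation is a DEL on the key; the helper names below are hypothetical):

```python
import time

cache = {}  # key -> (value, expires_at); stands in for Redis

def cache_set(key, value, ttl_seconds):
    # Strategy 1: Time to Live -- every entry carries an expiry time.
    cache[key] = (value, time.time() + ttl_seconds)

def cache_get(key):
    entry = cache.get(key)
    if entry is None:
        return None
    value, expires_at = entry
    if time.time() >= expires_at:  # TTL elapsed: drop it, forcing a DB refetch
        del cache[key]
        return None
    return value

def invalidate(key):
    # Strategy 2: Explicit invalidation -- nuke the key the moment the
    # underlying data changes (e.g. a price update in the admin panel).
    cache.pop(key, None)

cache_set("trending:blogs", ["Old list"], ttl_seconds=0.05)
assert cache_get("trending:blogs") == ["Old list"]  # still fresh
time.sleep(0.06)
assert cache_get("trending:blogs") is None          # expired automatically

cache_set("price:sku-42", 999, ttl_seconds=86400)   # 24 hours is too long...
invalidate("price:sku-42")                          # ...so delete on write
assert cache_get("price:sku-42") is None
```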
Chapter 4: The Layers of Speed (Types of Caching)
Akash realized he could cache things in more places than just the server. He built a "Defense in Depth" strategy for speed.
1. Client-Side Cache (The User's Pocket)
Akash realized the browser was re-downloading logo.png and style.css every time a user refreshed.
- The Fix: He told the user's browser (via HTTP headers): "Keep this logo for a year. Don't ask me for it again."
- Result: Zero network requests. Instant load.
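In HTTP terms, "keep this for a year" is the Cache-Control response header. A minimal sketch of the values involved (the exact policy here is an illustrative assumption, not ShopStream's real config):

```python
static_asset_headers = {
    # One year in seconds (31536000 s = 365 days); "immutable" tells modern
    # browsers not to revalidate the file even when the user hits refresh.
    "Cache-Control": "public, max-age=31536000, immutable",
}

html_headers = {
    # The HTML page itself stays revalidated, so users still pick up
    # new versions of the assets when filenames change.
    "Cache-Control": "no-cache",
}
```

A common companion to long max-age values is putting a version hash in the filename (logo.abc123.png), so a new file simply gets a new URL.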
2. CDN Cache (The Delivery Trucks)
ShopStream had users in London, but the main server was in Mumbai. Light takes time to travel that distance.
- The Fix: Akash used a CDN (Content Delivery Network) like Cloudflare. He stored copies of his static files (images, videos) on servers all over the world.
- Result: A user in London downloaded the images from a London server, not Mumbai.
3. Server-Side Cache (The RAM)
This was his Redis setup.
- The Fix: Storing the results of heavy database queries or complex calculations in memory.
- Result: The database could relax.
4. Application-Level Cache
Inside the code, Akash had a complex function that calculated shipping costs based on weight and distance.
- The Fix: He used memoization, an in-process lookup table keyed by the inputs, to store the result of calculateShipping(10kg, 5km). If the code saw those inputs again, it just returned the saved number.
- Result: The heavy calculation ran only once per unique set of inputs.
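Python ships this pattern in the standard library as functools.lru_cache; the pricing formula below is a hypothetical stand-in for Akash's real calculation:

```python
from functools import lru_cache

calls = 0  # track how often the heavy calculation actually runs

@lru_cache(maxsize=128)
def calculate_shipping(weight_kg, distance_km):
    global calls
    calls += 1
    # Hypothetical pricing: flat fee + per-kg and per-km components
    return 50 + 10 * weight_kg + 2 * distance_km

first = calculate_shipping(10, 5)   # computed: 50 + 100 + 10 = 160
second = calculate_shipping(10, 5)  # same inputs -> returned from the cache
assert first == second == 160
assert calls == 1                   # the function body ran only once
```

Note this only works cleanly for pure functions: same inputs must always mean the same output, which is exactly why shipping math is a good candidate and a live product price is not.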
The Moral of the Story
By the end of the month, Akash looked at his dashboard.
- Performance: Latency dropped from 600ms to 60ms.
- Cost: He actually downgraded his database server because it had so little work to do, saving money.
- Scalability: When traffic spiked, the Cache absorbed the hits, protecting the fragile database.