Big e‑commerce apps like Amazon, Flipkart, and Myntra handle millions of users daily. Their product listing pages load quickly, searches feel instant, and checkout flows rarely break—even on slower networks.
1. Progressive Data Fetching (Streaming + Partial Hydration)
Scenario:
You’re building a product listing page with filters, offers, banners, and 1000+ products. Showing a blank page until all data is ready will kill user experience.
Solution: Stream & hydrate parts of the UI in stages.
Code (Next.js 14 App Router):
import { Suspense } from 'react';

export default function Page() {
  return (
    <>
      {/* Critical categories stream first */}
      <Categories />
      {/* Products stream later */}
      <Suspense fallback={<ProductSkeleton />}>
        <Products />
      </Suspense>
    </>
  );
}
Result: Users see categories and skeleton loaders in <1s, while products keep loading in the background.
Real‑world: Flipkart’s homepage shows banners & categories first, while product carousels stream later.
Prompt for ChatGPT:
"Show me a real-world example of Progressive Data Fetching using Next.js 14 with React Suspense and streaming. Include a product listing page where categories render first and products stream later."
2. Smart Caching at UI Level
Caching isn’t a single trick—it’s multiple layers working together.
a. In‑Memory Cache (Fastest)
Use React Query or SWR for automatic client‑side caching:
const { data: products } = useQuery({
  queryKey: ['products'],
  queryFn: fetchProducts,
  staleTime: 5 * 60 * 1000, // data is "fresh" for 5 mins
  gcTime: 15 * 60 * 1000    // keep unused cache for 15 mins (called cacheTime before TanStack Query v5)
});
Scenario: Flipkart's home page reloads instantly because it shows cached products while fetching fresh ones in the background (stale‑while‑revalidate).
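Under the hood, stale‑while‑revalidate is simple to picture. Here is a framework‑free sketch of the idea — the `createSwrCache` name and `fetcher` signature are made up for illustration, not React Query's actual API:

```javascript
// Minimal stale-while-revalidate cache (illustrative sketch, not React Query).
// The injectable `now` clock only exists to make the behaviour easy to verify.
function createSwrCache(fetcher, { staleTime, now = Date.now } = {}) {
  const entries = new Map(); // key -> { data, updatedAt }

  return {
    async get(key) {
      const entry = entries.get(key);
      const fresh = entry && now() - entry.updatedAt < staleTime;
      if (fresh) return entry.data; // serve from cache, no network

      if (entry) {
        // Stale: return cached data immediately, refresh in the background
        fetcher(key).then((data) =>
          entries.set(key, { data, updatedAt: now() })
        );
        return entry.data;
      }

      // Cache miss: fetch and store
      const data = await fetcher(key);
      entries.set(key, { data, updatedAt: now() });
      return data;
    },
  };
}
```

The key property: once data exists, `get` never makes the user wait — stale data renders instantly while the refresh happens off the critical path.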
b. Persistent Cache (Offline + Instant Reload)
Cache critical data in IndexedDB or LocalStorage using React Query’s persister:
import { persistQueryClient } from '@tanstack/react-query-persist-client';
import { createSyncStoragePersister } from '@tanstack/query-sync-storage-persister';

const persister = createSyncStoragePersister({ storage: window.localStorage });
persistQueryClient({ queryClient, persister });
Result: Even after closing the tab, your app restores state in milliseconds.
c. API Caching with ETags or SWR
On the server, send proper headers:
Cache-Control: public, max-age=60
ETag: "hash"
Then fetch only when data changes:
const res = await fetch('/products', { headers: { 'If-None-Match': etag } });
if (res.status === 304) {
  // Body unchanged — reuse the cached copy
}
Tip: Combine UI caching + CDN edge caching for best results.
Prompt for ChatGPT:
"Show me how to implement multi-layer caching in a React or Next.js e-commerce app:
In-memory cache using React Query or SWR,
Persistent cache with IndexedDB/LocalStorage, and
API caching using ETags and Cache-Control headers."
3. Debouncing & Throttling
Scenario:
Your search input is flooding the server with API calls: "i" → "ip" → "iph" → "iphone".
Debounce: Wait until user stops typing
import { useMemo } from 'react';
import debounce from 'lodash.debounce';

// Create the debounced function once (e.g. with useMemo) so the timer survives re-renders
const debouncedSearch = useMemo(() => debounce((q) => fetchResults(q), 300), []);

<input onChange={(e) => debouncedSearch(e.target.value)} />
Throttle: Limit function calls (useful for scroll/resize events)
import throttle from 'lodash.throttle';

useEffect(() => {
  const handleScroll = throttle(() => {
    console.log('Do something on scroll...');
  }, 200);
  window.addEventListener('scroll', handleScroll);
  return () => {
    window.removeEventListener('scroll', handleScroll);
    handleScroll.cancel(); // lodash throttled functions expose cancel()
  };
}, []);
Real‑world: Amazon’s search box fires exactly one request per user pause, not on every keystroke.
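If you'd rather not pull in lodash, both helpers are only a few lines each. A simplified sketch of what the library does — the injectable `now` clock in `throttle` exists purely to make the behaviour easy to verify:

```javascript
// Debounce: reset a timer on every call; fire only after `wait` ms of silence
function debounce(fn, wait) {
  let timer = null;
  return (...args) => {
    clearTimeout(timer);
    timer = setTimeout(() => fn(...args), wait);
  };
}

// Throttle (leading edge): run at most once per `interval` ms
function throttle(fn, interval, now = Date.now) {
  let last = -Infinity;
  return (...args) => {
    if (now() - last >= interval) {
      last = now();
      fn(...args);
    }
  };
}
```

Note the difference: debounce fires once after the burst ends (search input), throttle fires at a steady rate during the burst (scroll events).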
Prompt for ChatGPT:
"Show me examples of debounce (for a search bar) and throttle (for scroll or resize events) in React using lodash, with real-world e-commerce scenarios."
4. Intersection Observer for Infinite Scroll
Scenario:
You can’t load all 1000 products at once.
useEffect(() => {
  const observer = new IntersectionObserver((entries) => {
    if (entries[0].isIntersecting) loadMoreProducts();
  });
  if (loadMoreRef.current) observer.observe(loadMoreRef.current);
  return () => observer.disconnect(); // stop observing on unmount
}, []);
Result: Products load only when the user scrolls down, reducing network load.
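The `loadMoreProducts` callback above usually boils down to appending the next slice of the catalogue. A pure helper sketch (the function name and batch size are illustrative):

```javascript
// Pure batching helper: each scroll trigger appends the next slice.
// Keeping it pure makes the pagination logic trivial to unit-test.
function nextBatch(allProducts, loadedCount, batchSize = 20) {
  return allProducts.slice(loadedCount, loadedCount + batchSize);
}
```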
Prompt for ChatGPT:
"Show me how to use Intersection Observer in React to implement infinite scroll for a product grid with 1000+ items, loading the next batch only when the user scrolls down."
5. Real‑Time Updates (SSE/WebSockets)
Scenario:
A seller dashboard should update order statuses without refresh.
useEffect(() => {
const sse = new EventSource('/orders/live');
sse.onmessage = (e) => updateOrders(JSON.parse(e.data));
return () => sse.close();
}, []);
Tips:
- Use Zustand/Redux so real‑time updates propagate across components.
- Fallback to polling if SSE/WebSockets aren’t supported.
Real‑world: Flipkart Seller Hub updates live when buyers place orders.
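The first tip — a shared store so one SSE message updates every component — is easy to picture with a hand-rolled, Zustand-style store. This is a sketch of the idea, not Zustand's actual API:

```javascript
// Minimal observable store: the SSE handler calls setState, and every
// subscribed component receives the new state.
function createStore(initialState) {
  let state = initialState;
  const listeners = new Set();
  return {
    getState: () => state,
    setState(partial) {
      state = { ...state, ...partial };
      listeners.forEach((listener) => listener(state));
    },
    subscribe(listener) {
      listeners.add(listener);
      return () => listeners.delete(listener); // returns an unsubscribe fn
    },
  };
}

// Wiring it to the SSE stream from the snippet above:
// sse.onmessage = (e) => ordersStore.setState({ orders: JSON.parse(e.data) });
```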
Prompt for ChatGPT:
"Show me how to use Server-Sent Events (SSE) or WebSockets in a React seller dashboard to update order statuses in real time without refreshing the page."
6. Component Composition Over Hardcoding
Scenario:
The same ProductCard is used in the buyer’s app (with “Add to Cart”) and in the seller’s app (with “Inventory”).
Composable Component:
<ProductCard product={p}>
  <AddToCartButton />  {/* buyer */}
</ProductCard>

<ProductCard product={p}>
  <InventoryBadge />   {/* seller */}
</ProductCard>
Why?
- Avoid role-based if conditions everywhere
- Components stay reusable across different apps
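The pattern itself is framework-agnostic: the card owns the layout and exposes a slot the caller fills. A plain-JavaScript sketch of the same idea (the `productCardHtml` helper is made up for illustration):

```javascript
// The card owns the markup; the caller supplies the action slot.
// No `if (role === 'seller')` branching inside the card itself.
function productCardHtml(product, actionsHtml) {
  return [
    '<div class="product-card">',
    `  <h3>${product.name}</h3>`,
    `  <p>₹${product.price}</p>`,
    `  ${actionsHtml}`, // buyer or seller content slots in here
    '</div>',
  ].join('\n');
}

const buyerCard = productCardHtml({ name: 'iPhone', price: 79999 }, '<button>Add to Cart</button>');
const sellerCard = productCardHtml({ name: 'iPhone', price: 79999 }, '<span>Inventory: 12</span>');
```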
Prompt for ChatGPT:
"Show me how to design a reusable and composable ProductCard component in React that works differently for buyers (Add to Cart) and sellers (Inventory badge) without using role-based if conditions."
7. Bundle Optimization & CDN Strategy (Real Example)
Scenario:
Your checkout page JS is 1.5MB → it loads in 5s on 4G.
a. Code Split by Route
Don’t ship checkout code on the home page:
import dynamic from 'next/dynamic';

const CheckoutPage = dynamic(() => import('./CheckoutPage'));
b. Use CDN with Device‑Specific Assets
- Serve WebP images for browsers that support it
- Send different image sizes for mobile vs desktop
<picture>
  <!-- smaller asset for phones; the -400/-800 file names are illustrative -->
  <source media="(max-width: 640px)" srcSet="/images/product-400.webp" type="image/webp" />
  <source srcSet="/images/product-800.webp" type="image/webp" />
  <img src="/images/product-400.jpg" alt="Product" />
</picture>
Result: 60–70% bandwidth saved → faster FCP.
Prompt for ChatGPT:
"Show me how to reduce a 1.5MB checkout bundle size in a Next.js e-commerce app using code splitting, dynamic imports, optimized image formats (WebP/AVIF), and CDN caching for faster load times."
8. Micro‑Frontends (Break UI by Domain)
Scenario:
Checkout has 4 big modules: Cart, Address, Payment, Offers. One bug in Offers shouldn’t crash Cart.
<Route path="/checkout">
  <CartModule />    {/* cart-mf.js */}
  <AddressModule /> {/* address-mf.js */}
  <PaymentModule /> {/* payment-mf.js */}
  <OffersModule />  {/* offers-mf.js */}
</Route>
Benefits:
- Teams deploy independently
- Smaller bundles → faster load
- Fewer merge conflicts
Real‑world: Amazon’s checkout flow is actually multiple micro‑apps stitched together.
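With webpack Module Federation, the stitching looks roughly like this host config — the remote names and CDN URLs are illustrative:

```javascript
// webpack.config.js sketch for the checkout host app.
// Each remote (cart-mf, payment-mf, ...) is built and deployed independently;
// the host fetches them at runtime from the CDN.
const { ModuleFederationPlugin } = require('webpack').container;

module.exports = {
  plugins: [
    new ModuleFederationPlugin({
      name: 'checkout_host',
      remotes: {
        cart: 'cart@https://cdn.example.com/cart-mf.js',
        payment: 'payment@https://cdn.example.com/payment-mf.js',
      },
      // Share one React instance across all micro-frontends
      shared: { react: { singleton: true }, 'react-dom': { singleton: true } },
    }),
  ],
};
```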
Prompt for ChatGPT:
"Show me how to split a checkout flow into micro-frontends (Cart, Address, Payment, Offers) in a Next.js or React app using Module Federation or similar architecture."
Wrap‑Up
Each of these patterns directly solves a performance/scalability problem:
- Caching (in‑memory + persistent) → Faster reloads and offline UX
- Debounce + Throttle + IntersectionObserver → Control expensive events and load less data
- Micro‑frontends + bundle splitting → Smaller bundles, independent deploys
- Real‑time updates → Fresh data without refresh