How We Scaled Quran.com to 50M Monthly Users: Architecture Lessons From the Inside
Scaling a web application to serve 50 million monthly users is not just a technical challenge — it is an architectural one. At Quran.com, one of the world's most visited Islamic resources, I led the frontend engineering effort that took the platform from struggling under load to reliably serving tens of millions of users globally.
Here's what I learned.
Start With the Right Foundation: Next.js and ISR
When I joined the team, Quran.com was running a single-page React app rendered entirely on the client. Every visitor downloaded a JavaScript bundle, waited for it to execute, then waited again for data to arrive before seeing any content. The fix was a full migration to Next.js with Incremental Static Regeneration (ISR).
ISR lets you pre-render pages at build time and regenerate them in the background after a configurable revalidation window. For content like Surah pages — which rarely changes — we set revalidation to 3600 seconds (1 hour). This meant most users received pre-rendered HTML instantly instead of waiting for server-side rendering on every request.
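In the App Router, that revalidation window is a one-line route segment config. A minimal sketch — the file path is illustrative, not our actual layout:

```typescript
// app/surah/[id]/page.tsx (illustrative path)
// Route segment config: pre-rendered HTML is served instantly, and
// the page is regenerated in the background at most once per hour.
export const revalidate = 3600;

// Pages Router equivalent: getStaticProps returning
// { props, revalidate: 3600 }.
```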
Result: Time to first byte dropped by over 70% for users in Asia and the Middle East, where our traffic is heaviest.
Edge Caching: The Second Layer
ISR alone was not enough. We layered Cloudflare in front of Vercel, configured to cache ISR-generated pages at the CDN edge. Users in Riyadh, Karachi, Jakarta — wherever they are — now get pages served from a node close to them.
The configuration that made this work:
```js
// next.config.js
module.exports = {
  async headers() {
    return [
      {
        source: '/surah/:id*',
        headers: [
          {
            key: 'Cache-Control',
            value: 'public, s-maxage=3600, stale-while-revalidate=86400',
          },
        ],
      },
    ];
  },
};
```
The stale-while-revalidate directive is the key. It tells the CDN: serve the cached copy immediately, and refresh it in the background. Users never wait on revalidation.
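The decision the CDN makes for each request can be sketched in a few lines of TypeScript (a simplified model of the semantics, not CDN internals):

```typescript
type Entry<T> = { value: T; storedAt: number };

// Simplified stale-while-revalidate decision:
// - within s-maxage: serve fresh from cache
// - within s-maxage + stale-while-revalidate: serve the stale copy
//   immediately and trigger a background refresh
// - older than both windows: treat as a cache miss
function decide<T>(
  entry: Entry<T> | undefined,
  now: number,
  maxAgeMs: number,      // s-maxage
  staleWindowMs: number, // stale-while-revalidate
): 'fresh' | 'stale-serve-and-revalidate' | 'miss' {
  if (!entry) return 'miss';
  const age = now - entry.storedAt;
  if (age <= maxAgeMs) return 'fresh';
  if (age <= maxAgeMs + staleWindowMs) return 'stale-serve-and-revalidate';
  return 'miss';
}
```

With `s-maxage=3600, stale-while-revalidate=86400`, a page can serve from cache for a full day after going stale, so origin latency only ever shows up in background refreshes.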
Result: CDN cache hit rate went from ~40% to 92%.
Code Splitting and Bundle Optimization
A 50M-user app cannot afford bloated JavaScript bundles. We audited every dependency and applied aggressive code splitting:
- Dynamic imports for heavy components (audio player, translation viewer)
- Route-based splitting via Next.js dynamic routing
- Moving user preferences (theme, language settings) to client-side state to avoid cache busting on every personalized request
Result: Initial JS bundle size reduced by ~45%.
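The dynamic-import pattern for heavy components looks like this with `next/dynamic` (component names here are illustrative, not our actual modules):

```typescript
import dynamic from 'next/dynamic';

// The audio player ships in its own chunk, fetched only when the
// component first renders; ssr: false also keeps it out of the
// server bundle entirely.
const AudioPlayer = dynamic(() => import('@/components/AudioPlayer'), {
  ssr: false,
  loading: () => <p>Loading player…</p>,
});
```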
What Scaling Actually Teaches You
The biggest lesson from working at this scale: the architecture decisions that seem like premature optimization are often the decisions that matter most.
ISR is not a performance trick — it is a scalability strategy. At 50 million users, every millisecond you shave off the critical path compounds. A 100ms improvement in TTFB means millions of users per day load faster, engage more, and return more often.
If you are building a content-heavy Next.js application, do not wait until you have scale problems to think about ISR and edge caching. Build it right the first time.
The Stack at a Glance
- Framework: Next.js 14 (App Router, ISR)
- CDN: Cloudflare
- Deployment: Vercel
- State: React Query + Zustand
- Languages: TypeScript throughout
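The "preferences in client-side state" point above is what the Zustand entry in this stack covers: a subscribable store that keeps personalization off the server so cached HTML stays identical for every user. A dependency-free sketch of the idea (store shape and names are hypothetical):

```typescript
type Prefs = { theme: 'light' | 'dark'; locale: string };

// Minimal client-side preference store. Zustand provides this
// pattern out of the box; this is just the core mechanism.
function createPrefsStore(initial: Prefs) {
  let state = initial;
  const listeners = new Set<(p: Prefs) => void>();
  return {
    get: () => state,
    // Partial updates merge into state and notify subscribers,
    // without ever touching the request/response path.
    set(partial: Partial<Prefs>) {
      state = { ...state, ...partial };
      listeners.forEach((fn) => fn(state));
    },
    subscribe(fn: (p: Prefs) => void) {
      listeners.add(fn);
      return () => listeners.delete(fn);
    },
  };
}
```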
If you have questions about scaling Next.js in production or want to talk architecture, reach out at zunain.com.