Quran.com is one of the most-visited Islamic websites in the world. During my time there as a Full Stack Engineer, the platform crossed 50 million monthly active users — a milestone that forced us to rethink nearly every assumption we'd made about architecture, data, and performance.
This post covers the technical decisions that got us there, the tradeoffs we made, and what I'd do differently.
The Scale Reality
50M MAU sounds like an abstract number. In practice it means:

- spikes of hundreds of thousands of concurrent users during Ramadan
- Quran audio files streamed from every continent simultaneously
- prayer time calculations for locations across the entire globe
- serving both high-bandwidth users in the West and users on 2G/3G connections in South Asia and Sub-Saharan Africa
That last point shaped almost every technical decision. You can't optimize purely for fast connections. You have to think about the user opening the site on a 2G connection in rural Pakistan at 3am for Fajr prayer.
Frontend: Next.js as the Foundation
We built on Next.js. Server-side rendering was critical for two reasons: SEO (the Quran text needs to be indexable) and first-load performance on slow connections. Each Surah page (114 in total) is statically generated at build time — meaning the first HTML response is near-instant, no waiting for JS to hydrate before the user sees text.
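The build-time generation step is simple to sketch. The function below is illustrative (the post doesn't show Quran.com's actual code); in the Next.js App Router, a function like this would back `generateStaticParams()` so every Surah route is rendered to HTML at build time:

```typescript
// Sketch of build-time page generation for all 114 Surahs.
// `TOTAL_SURAHS` and `surahStaticParams` are illustrative names.
const TOTAL_SURAHS = 114;

// In the Next.js App Router this shape of return value would feed
// generateStaticParams() for a /surah/[id] route, so each page is
// pre-rendered once at build time rather than on request.
function surahStaticParams(): { id: string }[] {
  return Array.from({ length: TOTAL_SURAHS }, (_, i) => ({
    id: String(i + 1),
  }));
}
```

With only 114 pages, a full static build stays cheap, which is what makes this approach viable here.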
We were aggressive about code splitting: audio player logic only loads when the user actually interacts with audio. Next.js Image with AVIF/WebP fallback cut image payloads significantly while staying compatible with older devices.
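The "load only on interaction" pattern boils down to a deferred, cached dynamic import. This helper is a hypothetical sketch of that idea, not Quran.com's code; in practice the loader argument would be something like `() => import('./AudioPlayer')` wired to the first play click:

```typescript
// Interaction-driven code splitting: the loader runs only on first
// call, and the resulting promise is cached so repeated interactions
// never re-fetch the chunk.
function lazyOnce<T>(loader: () => Promise<T>): () => Promise<T> {
  let cached: Promise<T> | undefined;
  return () => {
    if (!cached) cached = loader();
    return cached;
  };
}
```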
Offline Audio Streaming
One of the most technically interesting problems: how do you let a user listen to Quran recitations with no internet?
Quran audio is broken into individual ayah (verse) recordings — up to 6,236 of them per reciter. You can't cache all of that upfront. Our solution: when a user starts playing a Surah, we silently prefetch the next 10 ayahs in the background using Service Workers and the Cache API. If connectivity drops, playback continues seamlessly. We also gave users an explicit "Download for offline" option for Surahs they listen to regularly.
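The rolling prefetch window is a small pure function. The window size of 10 comes from the post; the function and names are illustrative. In a Service Worker, each returned ayah number would map to a `cache.add(...)` of that ayah's audio URL:

```typescript
// Given the ayah now playing and the Surah's total ayah count, return
// the next ayahs to fetch into the Cache API, clamped to the end of
// the Surah.
const PREFETCH_WINDOW = 10;

function ayahsToPrefetch(current: number, totalInSurah: number): number[] {
  const next: number[] = [];
  const last = Math.min(current + PREFETCH_WINDOW, totalInSurah);
  for (let ayah = current + 1; ayah <= last; ayah++) {
    next.push(ayah);
  }
  return next;
}
```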
Geospatial: Prayer Times at Scale
Prayer times are calculated from GPS coordinates — simple until you consider DST rules that differ by country, multiple calculation conventions (MWL, ISNA) and juristic methods for Asr (Hanafi, Shafi'i), and users who want accurate times regardless of whether they share their location.
We moved prayer time calculation to the client side. The astronomy math runs in under a millisecond on any modern device, requires no server round-trip, and stores no user location data.
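To give a feel for how cheap that math is, here is a deliberately simplified sketch of the astronomy behind Fajr: how many hours before local solar noon the sun sits 18° below the horizon (the twilight angle used by conventions such as MWL). Real calculators also apply the equation of time, refraction, and per-convention angles; this approximation is for illustration only:

```typescript
const DEG = Math.PI / 180;

// Approximate solar declination (degrees) for a day of the year (1-365).
function solarDeclination(dayOfYear: number): number {
  return -23.44 * Math.cos((360 / 365) * (dayOfYear + 10) * DEG);
}

// Hours before solar noon at which the sun's altitude equals `angle`
// (degrees; negative means below the horizon) at the given latitude.
function hoursBeforeNoon(
  latitude: number,
  dayOfYear: number,
  angle: number
): number {
  const decl = solarDeclination(dayOfYear);
  const cosH =
    (Math.sin(angle * DEG) - Math.sin(latitude * DEG) * Math.sin(decl * DEG)) /
    (Math.cos(latitude * DEG) * Math.cos(decl * DEG));
  return Math.acos(cosH) / DEG / 15; // hour angle in degrees → hours
}

// Fajr offset (18° twilight) for roughly Riyadh's latitude at the
// June solstice — a handful of trig calls, well under a millisecond.
const fajrOffset = hoursBeforeNoon(24.7, 172, -18);
```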
For map-based features — nearby mosques, Qibla direction — we used Mapbox GL JS. The vector tile approach is ideal: compact, cacheable, and sharp at any zoom level.
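Qibla direction is another computation that never needs a server: it is the great-circle initial bearing from the user's location to the Kaaba (21.4225° N, 39.8262° E). The function below uses the standard initial-bearing formula and is a sketch, not Quran.com's actual code:

```typescript
const KAABA = { lat: 21.4225, lng: 39.8262 };
const RAD = Math.PI / 180;

// Initial great-circle bearing from (lat, lng) to the Kaaba,
// normalized to a 0-360° compass bearing.
function qiblaBearing(lat: number, lng: number): number {
  const dLng = (KAABA.lng - lng) * RAD;
  const lat1 = lat * RAD;
  const lat2 = KAABA.lat * RAD;
  const y = Math.sin(dLng) * Math.cos(lat2);
  const x =
    Math.cos(lat1) * Math.sin(lat2) -
    Math.sin(lat1) * Math.cos(lat2) * Math.cos(dLng);
  return (Math.atan2(y, x) / RAD + 360) % 360;
}
```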
Database and Infrastructure
PostgreSQL was our primary data store. The Quran text is relational in nature — verses have translations, tafsirs, word-by-word breakdowns, audio timestamps, and cross-references. A relational model maps cleanly to this structure.
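The relational shape is easiest to see as a join from verses to their translations on a shared verse key. The sketch below uses in-memory arrays as stand-ins for tables, and the record shapes are hypothetical, not the real schema:

```typescript
// Illustrative stand-in for the relational model: a "verses" table and
// a "translations" table joined on verseKey, mirroring a SQL JOIN.
type Verse = { verseKey: string; textArabic: string };
type Translation = { verseKey: string; lang: string; text: string };

function verseWithTranslations(
  verses: Verse[],
  translations: Translation[],
  verseKey: string
) {
  const verse = verses.find((v) => v.verseKey === verseKey);
  if (!verse) return undefined;
  return {
    ...verse,
    translations: translations.filter((t) => t.verseKey === verseKey),
  };
}
```

Tafsirs, word-by-word breakdowns, and audio timestamps all hang off the same key, which is why a relational store fits so naturally.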
For our read-heavy endpoints (almost all of them), we leaned on read replicas to distribute query load, Redis for caching frequently-accessed data, and aggressive CDN caching at the edge for static API responses.
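The Redis layer follows the classic cache-aside pattern: check the cache, fall through to the database on a miss, then populate the cache with a TTL. In this sketch a `Map` stands in for Redis and `getFromDb` is a hypothetical loader:

```typescript
type Entry = { value: string; expiresAt: number };
const cache = new Map<string, Entry>();

// Cache-aside read: serve from cache while the entry is fresh,
// otherwise load from the database and repopulate with a TTL.
async function cachedGet(
  key: string,
  ttlMs: number,
  getFromDb: (key: string) => Promise<string>
): Promise<string> {
  const hit = cache.get(key);
  if (hit && hit.expiresAt > Date.now()) return hit.value;
  const value = await getFromDb(key);
  cache.set(key, { value, expiresAt: Date.now() + ttlMs });
  return value;
}
```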
Infrastructure ran on AWS: ECS for containerized services, RDS for PostgreSQL, ElastiCache for Redis, CloudFront as CDN.
What I'd Do Differently
Invest in observability earlier. We added distributed tracing later than we should have. When you're debugging a latency spike that only affects users in a specific region, you want granular trace data — not logs.
Be more aggressive with edge caching. Many API responses could have been fully cached at the CDN layer, eliminating server load for the vast majority of requests. We were conservative about this early and paid the price during Ramadan traffic spikes.
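Concretely, CDN-first caching mostly comes down to response headers: `s-maxage` lets the edge cache a response independently of browsers, and `stale-while-revalidate` (RFC 5861) lets the CDN absorb spikes by serving a stale copy while refreshing in the background. The helper and its values are illustrative, not production settings:

```typescript
// Build a Cache-Control header for an API response that CloudFront
// (or any shared cache) can hold at the edge.
function edgeCacheHeader(sMaxageSec: number, staleSec: number): string {
  return `public, s-maxage=${sMaxageSec}, stale-while-revalidate=${staleSec}`;
}
```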
Design for low-bandwidth from day one. We retrofitted a lot of optimizations later. Starting with that constraint leads to better decisions upfront — smaller bundles, progressive loading, offline-first thinking baked in from the beginning.
Closing Thoughts
Working on a platform where the content matters deeply to hundreds of millions of people sharpens your instincts. Performance isn't vanity. A 3-second improvement in load time is the difference between someone completing Fajr prayer with the app and giving up.
If you're working on similar challenges — high-scale web apps, geospatial features, or offline-capable PWAs — find me at zunain.com or on GitHub at mzulqarnain118.