A reproducible measurement of the New York Times front page, taken on an ordinary day in 2026, recorded the browser issuing more than four hundred distinct network requests and pulling down more than fifty megabytes of data over the course of roughly two minutes before the page came to rest. The measurement was done with the browser's own developer tools. Anyone with a laptop and a network tab can run it again, on any popular news homepage, and the number that comes back will not be small.
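The raw numbers do not even require the network tab. Pasting a few lines into the console once the page has settled gives a first approximation from the Resource Timing API, and only an approximation: cross-origin responses that withhold a Timing-Allow-Origin header report zero bytes, and the timing buffer truncates at 250 entries unless the page enlarges it.

```javascript
// Rough sketch: total up what the Resource Timing API can see after the page settles.
// Cross-origin responses without Timing-Allow-Origin report transferSize 0, and the
// entry buffer defaults to 250, so this is a lower bound; devtools will show more.
const entries = performance.getEntriesByType('resource');
const totalBytes = entries.reduce((sum, e) => sum + (e.transferSize || 0), 0);
console.log(entries.length + ' requests, ' +
  (totalBytes / 1048576).toFixed(1) + ' MB transferred (lower bound)');
```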
That is the version of the web we have ended up with. Fifty megabytes is more than twice the install media for the operating system that ran the desktop computers of the late 1990s. Windows 95 retail shipped on thirteen high-density "DMF" floppy disks of 1.68 MB each, about twenty-two megabytes in total, with perhaps thirty more fanned out on the hard drive after a typical install. One newspaper homepage now arrives carrying more than twice the bytes that used to ship the operating system and the bundled productivity tools and the games and a copy of WordPad.
I want to take that measurement seriously rather than dismiss it as a worst case. Worst cases tell you something. They tell you what an industry will tolerate carrying around when nobody is enforcing a budget.
Where fifty-plus megabytes sits in the distribution
The 50-plus MB number is not the median. The median is much smaller, and the gap between the two is part of the point.
HTTP Archive's Web Almanac for 2025 puts the median page at 2.56 MB on mobile and 2.86 MB on desktop, measured across millions of sites. That is itself a number worth pausing on. The same project pegged the 2015 mobile median at 845 KB. A decade of broadband, fibre, 5G, and HTTP/3 has bought the median page roughly a 200% increase in transfer size, which means the median page is now carrying about three times the weight it did when the industry started worrying out loud about page weight.
The 50-plus-MB measurement is more than an order of magnitude past that median, into the tier of the distribution where flagship publishers live and where the front page of a national daily competes against itself for screen real estate and CPU and battery before anyone has read a paragraph. The interesting question isn't why the median has crept up. The interesting question is why the worst case has run away from the median, and what specifically lives in those forty-eight extra megabytes that the median page can do without.
The decomposition is consistent across measurement runs. HTML is a sliver. The article text, the part the reader came for, is usually under fifty kilobytes — about the weight of a chapter of a paperback book. JavaScript dominates, and most of it is third-party. HTTP Archive's third-party report finds that around 93% of all sites load at least one tracker, and that scripts account for roughly a third of all third-party requests. On a flagship news page that "third of third-party requests" expands to several hundred individual fetches, all happening while the article is also trying to render.
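The same console technique makes the split visible. A sketch that groups request counts by host (counts rather than bytes, since byte totals are unreliable cross-origin, and the first-party check is a crude suffix match, good enough for eyeballing):

```javascript
// Sketch: count requests per host to see how much of the page is third-party.
const counts = {};
for (const e of performance.getEntriesByType('resource')) {
  const host = new URL(e.name).hostname;
  const key = host.endsWith(location.hostname) ? 'first-party' : host;
  counts[key] = (counts[key] || 0) + 1;
}
// Show the twenty busiest hosts, busiest first.
console.table(Object.entries(counts).sort((a, b) => b[1] - a[1]).slice(0, 20));
```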
The auction running in your browser
What those four hundred-odd network requests are doing, mostly, is running an advertising auction.
The technical name for the architecture is header bidding. The most common open-source implementation is Prebid.js, a JavaScript library that orchestrates a real-time ad auction inside the user's browser. When the page begins to load, Prebid (or a vendor wrapper around it) fires off parallel bid requests to a list of supply-side platforms — Magnite (formerly Rubicon), OpenX, Index Exchange, Amazon's ad system, and roughly twenty more depending on the publisher's setup. Each platform responds with a price. The wrapper picks a winner. The winning ad is then loaded, which often triggers its own cascade of pixel calls, viewability beacons, and identity-graph lookups. The auction overlaps with the article render. It is the article render's biggest competitor for CPU.
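The publisher-side code that sets all of this in motion is surprisingly small; the weight arrives in what each bidder call drags in behind it. A minimal sketch of a Prebid.js configuration, with placeholder IDs and a single banner slot standing in for a real setup:

```javascript
// Minimal Prebid.js sketch: one banner slot, two bidder adapters, a one-second auction.
// Assumes the Prebid.js and Google Publisher Tag libraries are already on the page;
// the IDs below are placeholders. A flagship page declares dozens of slots and
// twenty-plus bidders, each of which ships its own adapter code inside the bundle.
window.pbjs = window.pbjs || { que: [] };
window.googletag = window.googletag || { cmd: [] };

var adUnits = [{
  code: 'div-banner-top',                        // the div the winning creative renders into
  mediaTypes: { banner: { sizes: [[728, 90]] } },
  bids: [
    { bidder: 'rubicon',  params: { accountId: '0000', siteId: '0000', zoneId: '0000' } },
    { bidder: 'appnexus', params: { placementId: 12345 } }
  ]
}];

pbjs.que.push(function () {
  pbjs.addAdUnits(adUnits);
  pbjs.requestBids({
    timeout: 1000,                               // how long bidders get to answer
    bidsBackHandler: function () {
      // Hand the winning prices to the ad server, then let it fetch the creatives --
      // which is where the pixels, viewability beacons, and identity lookups begin.
      pbjs.setTargetingForGPTAsync();
      googletag.cmd.push(function () { googletag.pubads().refresh(); });
    }
  });
});
```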
There is also, separately, the consent-management layer. European publishers (and any global publisher with European traffic) have to integrate the IAB's Transparency and Consent Framework, a cross-industry standard whose v2.3 release in June 2025 set a February 2026 deadline for adoption. The TCF emits a long opaque consent string into every bid request. Without a valid consent string, premium bidders refuse to bid. With one, they bid. The consent banner the reader is asked to dismiss is the user-facing tip of an iceberg whose underwater bulk is technical infrastructure for a real-time auction.
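The string is not hidden, either. Any page running a TCF-compliant consent platform exposes it through the standard __tcfapi hook, and reading it from the console takes a few lines:

```javascript
// Sketch: read the consent string a TCF-compliant CMP will attach to bid requests.
// __tcfapi is the hook the framework requires consent platforms to expose on the page.
if (typeof window.__tcfapi === 'function') {
  window.__tcfapi('addEventListener', 2, function (tcData, success) {
    if (success && tcData.eventStatus === 'tcloaded') {
      console.log('GDPR applies:', tcData.gdprApplies);
      console.log('consent string:', tcData.tcString); // long, base64url-encoded, opaque by design
    }
  });
}
```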
Maciej Cegłowski gave a talk at Web Directions Sydney on 29 October 2015 called "The Website Obesity Crisis," in which he held up a Yorkshire Evening Post article whose JavaScript alone outweighed Remembrance of Things Past. The talk is now ten years old. Almost everything in it has gotten worse. The fifty-megabyte front page is the natural extrapolation of the curve Cegłowski plotted, drawn out another decade against the same incentive gradient.
The architecture is rational. The aggregate is not.
It would be a mistake to read a fifty-megabyte front page as the work of a careless engineering team. The publisher's engineers are not careless. Most of them know exactly what is going on. They are, in the local view, optimising rationally.
The metrics that pay are CPM (cost per thousand impressions) and viewability — the fraction of an ad slot that was actually seen, for the threshold time the buyer required, by an actual human. Both metrics increase with time-on-page. Time-on-page increases with friction: cookie banners that take a moment to dismiss, newsletter modals that ask for an email, sticky video players that refuse to leave when the reader scrolls past, "continue reading" buttons that segment the article into ad-loadable halves. Each of those decisions is locally optimal for a publisher trying to extract a fraction of a cent more from each visit. Aggregated across an industry, the same decisions produce a user experience that has been described in Nielsen Norman Group's design literature as a violation of basic interaction-cost principles, and specifically of Heuristic 8 of Jakob Nielsen's classic ten: "Interfaces should not contain information that is irrelevant or rarely needed".
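Viewability itself is measured in the reader's browser. The sketch below shows the principle rather than any vendor's actual code, assuming the common display threshold of half the slot's pixels on screen for one continuous second; the slot id and beacon URL are placeholders.

```javascript
// Sketch of a viewability check: fire a beacon once at least 50% of the ad slot
// has been continuously visible for one second. Vendors layer scroll, focus, and
// occlusion checks on top of this; the principle is the same.
function watchViewability(slot, beaconUrl) {
  let timer = null;
  const observer = new IntersectionObserver(function (entries) {
    for (const entry of entries) {
      if (entry.intersectionRatio >= 0.5 && timer === null) {
        timer = setTimeout(function () {
          new Image().src = beaconUrl;          // the "impression was seen" pixel
          observer.disconnect();
        }, 1000);
      } else if (entry.intersectionRatio < 0.5 && timer !== null) {
        clearTimeout(timer);                    // scrolled away too soon; reset
        timer = null;
      }
    }
  }, { threshold: [0, 0.5] });
  observer.observe(slot);
}

// Hypothetical usage, with a placeholder slot id and beacon endpoint:
// watchViewability(document.getElementById('div-banner-top'),
//                  'https://tracker.example/viewable?slot=banner-top');
```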
The reader is not the publisher's customer. The reader is the publisher's inventory. The customer is the advertiser. This isn't a moral judgement; it's a description of the cash flow. Once you internalise that, every individually puzzling design decision becomes legible. The "X" button that's hard to hit on a phone (a violation of Fitts's law, which since 1954 has predicted that small targets at long distances take disproportionately longer to acquire) is hard to hit because every accidental tap on the ad is more profitable than a closed banner. The text that jumps two hundred pixels down the page after eight seconds is jumping because an ad slot just resolved its bid and inserted itself above the article. This last behaviour shows up in Google's Core Web Vitals as Cumulative Layout Shift, a metric Google explicitly penalises in search rankings: a CLS score above 0.25 is rated "poor."
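The jump is directly observable, too. The Layout Instability API that Core Web Vitals is built on reports each shift as it lands. A simplified running total (the official metric groups shifts into session windows, so the real score is usually a bit lower) is enough to watch the ad slots arrive:

```javascript
// Sketch: watch layout shifts accumulate as ad slots resolve and insert themselves.
// This is a plain running sum; the official CLS score takes the worst session
// window of shifts, so it will usually come out somewhat lower than this total.
let clsTotal = 0;
new PerformanceObserver(function (list) {
  for (const entry of list.getEntries()) {
    if (!entry.hadRecentInput) {                // shifts caused by user input don't count
      clsTotal += entry.value;
      console.log('shift', entry.value.toFixed(4), 'running total', clsTotal.toFixed(4));
    }
  }
}).observe({ type: 'layout-shift', buffered: true });
```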
Google's awkward dual role
The cleanest illustration of the incentive trap is that Google is on both sides of it.
The search arm has spent nearly a decade pushing publishers toward a calmer page. In August 2016 Google announced a forthcoming penalty for "intrusive interstitials" on mobile, which rolled out on January 10, 2017, demoting pages whose content was obscured by pop-ups on first arrival. CLS was added as a Core Web Vitals signal a few years later, with explicit ranking implications. The Search Central documentation reads, even now, like a polite plea on the user's behalf.
The advertising arm sells the auction software, the ad exchange, the demand-side platform, the consent-management code path, and a substantial fraction of the pixels that fire when a page loads. The same company that penalises layout shift in search runs one of the largest demand-side platforms in the bidding market that produces the layout shift. This isn't hypocrisy in the cartoon sense. It's a structural feature of an industry where the largest player has a hand in every layer of the stack and lets each layer optimise against the others.
The lean web is still there. It just isn't the default.
A useful test of whether a constraint is technical or economic is to ask whether anyone has solved the technical version. In this case the answer is yes, and the solution has been quietly running for twenty years.
NPR's text-only site launched in June 2005, originally as thin.npr.org, built for low-bandwidth connections and the early handheld devices of the period. It is still running. CNN runs lite.cnn.com, which serves the most recent hundred stories as semantic HTML with no scripts and no third parties. The CBC runs cbc.ca/lite along similar lines. RSS readers — a category presumed dead by every "web is over" essay since 2010 — continue to ship updates daily and carry millions of readers past the auction layer.
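The gap is measurable in a few lines of script. A rough sketch (Node 18 or later, run as an ES module) that compares only the initial HTML document of each page, which for the lite page is the whole story:

```javascript
// Rough sketch: compare the over-the-wire size of the initial HTML document only.
// This deliberately ignores everything a browser would fetch afterwards -- scripts,
// images, ad calls -- which for the lite page is nothing at all. Numbers vary by day.
const pages = ['https://lite.cnn.com/', 'https://www.cnn.com/'];

for (const url of pages) {
  const res = await fetch(url);
  const bytes = (await res.arrayBuffer()).byteLength;
  console.log(url, (bytes / 1024).toFixed(0), 'KB of HTML');
}
```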
The lean versions exist because the same publishers can build them when they want to. The fifty-megabyte version is not a consequence of the technology. It is a consequence of the contract. Strip out the contract, and the lean version is what's left. The article fits in fifty kilobytes. It always did.
What the measurement actually shows
Fifty megabytes is not a verdict on web technology, which is, in raw terms, faster and more capable than it has ever been. It is a measurement of what an industry will tolerate carrying around for the sake of a fraction of a cent per impression, when nobody is enforcing a budget on its behalf.
The reader is, individually, almost powerless against this. The reader is also, collectively, the entire reason the system runs at all. The text-only sites get built when enough readers ask for them. The RSS reader market keeps existing because enough readers refuse to read inside the auction. Each tab closed in frustration registers somewhere as a lower time-on-page. Each subscription cancelled because the front page is unreadable on a phone registers as a higher churn rate.
The number to watch is not the megabytes. The number to watch is which version the publisher links from the social-share button. The day a flagship news organisation makes its lean URL the canonical one is the day the industry has decided the contract was always negotiable. That day is not here. The lean URL still exists, though, and reading on it is a vote.