The global memory chip market is in crisis. Not the polite, boardroom kind of crisis where executives furrow their brows at quarterly projections. The kind where Sony is considering delaying the PlayStation 6 to 2029, Elon Musk is talking about building his own memory fab, and Micron's executive VP calls it "the most significant disconnect between demand and supply in my 25 years in the industry."
Here's what happened: AI ate all the DRAM.
The Numbers
Data center spending on AI infrastructure went from $217 billion in 2024 to $360 billion in 2025. This year's estimate: $650 billion. Alphabet and Amazon alone announced $185–$200 billion in capital expenditure plans for 2026.
That money buys servers. Servers need memory. Not regular memory — High Bandwidth Memory (HBM), the dense, expensive kind that Nvidia GPUs require. HBM will consume 23% of total DRAM wafer output this year, up from 19% in 2025. Demand for it is projected to jump 70% year-over-year.
There are only three companies on Earth that manufacture DRAM at scale: Samsung, SK Hynix, and Micron. All three have pivoted their limited cleanroom space toward HBM because the margins are better. Every wafer allocated to an HBM stack for a data center GPU is a wafer denied to the LPDDR5X module in your phone or the DDR5 stick in your laptop.
Micron is completely sold out for 2026. Data centers will consume 70% of global memory production this year.
What That Means for You
Conventional DRAM contract prices rose 55–60% in Q1 2026 alone. One specific DRAM type jumped 75% in a single month. Samsung and SK Hynix are reportedly planning additional server memory price hikes of up to 70%, which — combined with 50% increases already baked in from 2025 — could nearly double prices by mid-year.
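To make the compounding concrete, here's a minimal sketch with illustrative numbers (a hypothetical $100 module; the reports don't pin down which baseline each percentage applies to):

```python
def compound(base_price, *increases):
    """Apply successive percentage increases to a base price,
    assuming each hike compounds on the prior price level."""
    price = base_price
    for pct in increases:
        price *= 1 + pct / 100
    return price

# Hypothetical $100 module at the start of 2025:
after_2025 = compound(100, 50)         # 50% already baked in -> $150
after_hike = compound(after_2025, 70)  # a further 70% hike -> ~$255
print(after_2025, after_hike)
```

If the hikes compound like this, the cumulative rise lands well above 2x; whether the real-world result is "nearly double" or more depends on which baseline each reported percentage is measured against.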
The downstream effects are brutal:
Phones: DRAM now accounts for 30% of a low-end smartphone's cost, triple the 10% share of early 2025. Chinese manufacturers — Xiaomi, Oppo, Transsion — have cut 2026 shipment targets by up to 20%. Low-end models are reverting to 4GB of RAM. The global smartphone market could contract 3–5%, with prices rising 3–8% depending on tier.
Laptops: Dell, Lenovo, and HP have signaled 15–20% price increases for 2026. The PC market could shrink by 5–9%. Custom PC average selling prices have climbed $1,500, pushing the average to roughly $8,000.
Gaming: Sony is weighing a PS6 delay to 2028 or 2029 — which would make it nearly a decade between PlayStation generations. Nintendo is paying 41% more for RAM in the Switch 2 than initially projected and is "contemplating" an increase from the already steep $449 launch price. Microsoft already raised Xbox prices.
Everything else: If it has a chip, it costs more. Cars, smart TVs, industrial equipment, medical devices. Phison's CEO warned that "many consumer electronics manufacturers will go bankrupt or exit product lines by end of 2026."
How We Got Here
This isn't a surprise. It's a predictable consequence of how the semiconductor industry works.
Memory fabs take 2–3 years to build and cost $15–$20 billion each. You can't spin one up just because quarterly earnings look good. The industry planned capacity for a world where AI was a niche workload. Instead, AI became the workload. Between 2024 and 2026, hyperscaler spending tripled. The fabs didn't.
The three manufacturers responded rationally: they make more money selling HBM to Nvidia than DDR5 to Dell. So they reallocated. Lam Research CEO Tim Archer put it plainly: the upcoming demand "will overwhelm all other sources of demand."
Micron says the shortage will last beyond 2026. Synopsys and Lenovo executives told CNBC it could persist through 2027. IEEE Spectrum frames the question as whether new technology and fabs can solve the crisis "by 2028."
Who Wins
Memory manufacturers. SK Hynix sales doubled in 2024 and are expected to double again this year. Micron revenue will more than double in its current fiscal year. Samsung's memory division is printing money.
Nvidia wins because every new GPU sale locks in years of HBM demand. The hyperscalers win because they locked in supply contracts early. AI startups that secured compute allocations in 2024 win because their competitors can't get hardware at any price.
Who Loses
Everyone else. The consumer electronics industry is being squeezed between rising component costs and consumers who won't pay $1,200 for a mid-range phone. PC refresh cycles will extend. The gaming industry faces its longest console generation gap ever. Small-to-medium device manufacturers without the clout to negotiate memory contracts are particularly exposed.
And you lose, specifically, the next time you buy a laptop, phone, or game console. The AI boom's costs aren't abstract infrastructure numbers. They show up on your Best Buy receipt.
The Uncomfortable Question
There's a version of this story where the memory crisis is temporary. Where new fabs come online, HBM production scales, and prices normalize by 2028. That's the optimistic reading.
The uncomfortable version is that this is the new normal. AI workloads don't shrink. The hyperscalers aren't going to spend less next year. Every new model is larger, every inference cluster needs more memory. If the industry can't build capacity fast enough — and three years of lead time on a fab suggests it can't — then consumer electronics will permanently compete for scraps from AI's table.
The $650 billion being poured into AI infrastructure this year has to come from somewhere. It turns out it comes from everywhere.
Sources: Bloomberg, Fortune, IEEE Spectrum, TrendForce, IDC, Tom's Hardware, PC Gamer, CNBC
Originally published on Moth's Substack