Niksa Kuzmanic

8GB Is Enough Again (Apparently)

For years, the answer to "how much RAM do I need?" was always "more than you have." 4GB became a joke. 8GB became "the bare minimum." 16GB became the new baseline. 32GB started feeling reasonable for developers and gamers. The ceiling kept moving, and the industry was happy to sell you more every time it did.

Now, Apple has released the MacBook Neo with 8GB as the base configuration.

I've been watching the reactions online and it's a familiar split: half the people are outraged, half are saying "actually it's fine." What's interesting is that the "actually it's fine" crowd seems to be... right this time?

Apple's unified memory shares one pool between CPU and GPU, with bandwidth high enough that the system rarely has to page out aggressively. In practice that changes what 8GB actually feels like. People running Xcode, Safari with tabs open, and a local model at the same time report it holds up. On a regular machine with 8GB, that combination would be painful. Part of why it's painful is that nobody optimized for it.

Software stopped caring about memory. RAM was cheap, users had plenty, and optimizing for a tight memory budget takes real effort for a payoff that didn’t seem worthwhile. Electron apps shipped at 400MB idle. Browsers turned into memory sinkholes. Games padded their requirements because nobody had a reason to push back. The path of least resistance was "just tell people to upgrade."
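If you want to see what "idle footprint" actually means on your own machine, you can measure it. Here's a minimal sketch using only the Python standard library (assumption: a POSIX system; the `resource` module isn't available on Windows, and `ru_maxrss` is reported in bytes on macOS but kilobytes on Linux):

```python
# Rough sketch: measure how much resident memory a workload adds.
# resource is POSIX-only; ru_maxrss units differ by platform.
import resource
import sys

def max_rss_kb():
    rss = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    # macOS reports bytes, Linux reports kilobytes; normalize to KB.
    return rss // 1024 if sys.platform == "darwin" else rss

before = max_rss_kb()
# ~100 MB of small allocations, roughly what a chatty app might hold idle
data = [bytes(1024) for _ in range(100_000)]
after = max_rss_kb()
print(f"workload added roughly {(after - before) / 1024:.0f} MB of RSS")
```

Run something like this against a real workload and the 400MB-idle figure stops being abstract. Note that `ru_maxrss` is a high-water mark, not current usage, so it only ever goes up.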

That worked fine until memory got expensive again.

The jump from 16GB to 32GB costs more than most people want to spend, especially outside the US. Phone manufacturers are in the same spot: adding RAM to a mid-range Android is a real cost that hits margins directly. Suddenly the question "does this actually need more memory?" is worth asking, and most companies are asking it.

The situation got strange enough that late last year, Samsung Semiconductor reportedly turned down a DRAM order from Samsung Electronics, its own phone-making sibling. The memory division had better offers from AI data centers willing to pay more, so the Galaxy team had to renegotiate quarterly instead of locking in annual pricing. One arm of the same conglomerate wouldn't sell to the other because the margins were better elsewhere. That's how inflated the AI memory bubble got.

It’s also what the MacBook Neo situation quietly points at. Apple's unified memory architecture is a real technical difference, sure, but the more interesting thing is that their software runs well within that budget because somebody optimized for it. Instruments, Final Cut, even Safari have gotten noticeably leaner over the past few hardware cycles. Not because Apple is noble, but because their base config forces the constraint and the constraint forces the work.

What's odd about 2026 is that the advice is changing, and not for the reasons anyone predicted. RAM got expensive, AI ate the supply chain, and suddenly optimization is back on the table. And all of this comes down to three companies: Samsung, SK Hynix, and Micron, which together control most of the world's memory supply. When their priorities shift, everyone feels it, whether you're a phone maker, a Raspberry Pi hobbyist, or apparently, another division of Samsung.

We're not there yet across the board. But now, for the first time in a while, "how much memory does this actually need" is a question with some economic weight behind it. Software got fat during the cheap RAM years. It's going to take more than one MacBook SKU to fix that, but it seems like optimization is back in the game.
