
Discussion on: 20 Fantastically Bad Predictions Made About Computing and Technology

Jason C. McDonald

Hilarious!

Just one thought...

"640k ought to be enough [memory] for anybody."

Maybe it ought to be? It's actually appalling how much memory (not to mention processing power) is wasted in modern programs, purely from lazy coding practices and unnecessary UI "chrome" and effects.

Andrew (he/him)

I need the shiny, Jason. I need it.

Jonas Gierer • Edited

We can afford that though, since memory has become so cheap that even a few MB more or less don't really matter to anyone anymore.

My prediction (which I know may end up on a similar list in 20 years): RAM prices will keep falling and SSD speeds will keep rising, to the point that we can just put a couple of TB of flash memory in our computers to serve as both persistent storage and working memory. This would open up some interesting new possibilities for software too, basically eliminating all storage-related loading times (software and OS startup, loading screens in games, etc.).
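
In a small way, memory-mapped files already blur that line today. Here's a minimal Java sketch (the file name is just a placeholder): mapping a file makes persistent storage directly addressable as working memory, with the OS paging data in and out behind the scenes.

```java
import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class MappedMemoryDemo {
    public static void main(String[] args) throws IOException {
        Path file = Path.of("scratch.bin"); // placeholder scratch file
        try (FileChannel ch = FileChannel.open(file,
                StandardOpenOption.CREATE,
                StandardOpenOption.READ,
                StandardOpenOption.WRITE)) {
            // Map 64 MB of the file directly into the address space.
            // Reads and writes behave like ordinary memory accesses,
            // and the OS persists them to disk behind the scenes.
            MappedByteBuffer buf =
                    ch.map(FileChannel.MapMode.READ_WRITE, 0, 64 * 1024 * 1024);
            buf.putInt(0, 42);                 // a "working memory" write...
            System.out.println(buf.getInt(0)); // ...that is also persistent storage
        }
    }
}
```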

Scott Simontis • Edited

Look up Intel Optane drives... basically enterprise-grade SSDs that are fast enough to work as a RAM cache (to my very limited understanding). I have one in my room that I've been meaning to play with, but I lack the proper adapter to connect it to one of my rack-mount servers.

EDIT: Fixed product name

Michiel Hendriks

> We can afford that though, since memory has become so cheap that even a few MB more or less don't really matter to anyone anymore.

You forget that managing memory costs CPU time.

High memory usage often goes hand in hand with a high rate of dynamic (de)allocation. That causes memory fragmentation, and de-fragmenting memory costs even more effort.

That said, a high rate of dynamic (de)allocation is not inherently bad. Or at least, not in the Java world, as long as you do not keep the data around for a long time.
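
Here's a rough Java sketch of that idea, assuming a typical generational collector like HotSpot's (sizes and counts are arbitrary): short-lived objects die in the young generation and are cheap to sweep away, while retained objects get promoted and make later collections, and fragmentation, far more expensive.

```java
import java.util.ArrayList;
import java.util.List;

public class AllocationChurn {
    public static void main(String[] args) {
        // Cheap: millions of short-lived allocations. Each buffer is
        // garbage by the next iteration, so a generational GC reclaims
        // the whole young generation in one quick sweep.
        for (int i = 0; i < 10_000_000; i++) {
            byte[] scratch = new byte[128];
            scratch[0] = (byte) i; // use it so it isn't optimized away
        }

        // Expensive: the same kind of allocations, but kept alive. These
        // survive young-gen collections, get promoted to the old
        // generation, and reclaiming them later means tracing (and
        // compacting) a much larger heap.
        List<byte[]> retained = new ArrayList<>();
        for (int i = 0; i < 100_000; i++) {
            retained.add(new byte[128]);
        }
        System.out.println("retained " + retained.size() + " buffers");
    }
}
```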

Andrew (he/him)

Unfortunately, we'll always be limited -- to some extent -- by memory hierarchies and physical distances in the machine:

electronics.stackexchange.com/a/82...
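
Here's a crude Java sketch of one of those hierarchy effects, assuming a typical 64-byte cache line (the timing is illustrative, not a rigorous benchmark):

```java
public class CacheEffects {
    public static void main(String[] args) {
        int[] data = new int[32 * 1024 * 1024]; // ~128 MB, far larger than any CPU cache
        long sum = 0;

        // Touch every element.
        long t0 = System.nanoTime();
        for (int i = 0; i < data.length; i++) sum += data[i];
        long every = System.nanoTime() - t0;

        // Touch every 16th element (one int per 64-byte cache line).
        // This does 1/16 of the additions, yet on a typical machine it
        // takes roughly comparable time: the dominant cost is hauling
        // cache lines in from RAM, not the arithmetic.
        t0 = System.nanoTime();
        for (int i = 0; i < data.length; i += 16) sum += data[i];
        long strided = System.nanoTime() - t0;

        System.out.printf("every element: %d ms, every 16th: %d ms (sum=%d)%n",
                every / 1_000_000, strided / 1_000_000, sum);
    }
}
```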

Jason C. McDonald

Indeed. There's a hard physical limit, at least until someone cracks the code for making a consumer-friendly system that stores at the atomic level...and even that has its limits.

It makes me value even more the memory we have. The average computer has more memory and CPU power than the supercomputers of the 80s. Wasting it on poor coding practice and unnecessary graphical fireworks is such a shame! We could be funneling all that wasted memory into more useful things, making our computers do far more than they do now, and far more efficiently.

Andrew (he/him)

Related: qr.ae/TWo1hM

Jason C. McDonald

That'd be nice, but I'll counter that it probably won't work out that way. As always, our lazy coding habits and unnecessary bells and whistles will expand to fill all the available memory, even with terabytes of RAM on hand. Consider that a single tab in a web browser now takes up more memory than was available for the entire Apollo mission.

Also, I never trust flash/SSD for primary persistent storage: you can't recover data from it when it fails. This is why, to this day, HDDs are still often used for persistent data where recovery must remain possible; the recoverability is a side effect of their physical characteristics.