
20 Fantastically Bad Predictions Made About Computing and Technology

Andrew (he/him) · 8 min read

Header photo by mali maeder from Pexels.


"I have but one lamp by which my feet are guided, and that is the lamp of experience. I know of no way of judging of the future but by the past."

-- Patrick Henry, Colonial American Orator and Politician


Frequentist inference uses the past as a simple predictor of the future. If the Sun has risen in the east for 1.6 trillion out of the past 1.6 trillion days, then it's very likely to do so again tomorrow. When statistical trends move slowly with little noise (say, the number of minutes of daylight throughout the year) or not at all (the area of the sky in which the Sun rises), it's easy to make an accurate prediction of where those trends will go. But when things change quickly, especially when new and unexpected events occur, it can be difficult -- if not impossible -- to make any decent prediction at all.
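As a minimal sketch (in Python, using the sunrise numbers above), the frequentist estimate is nothing more than the relative frequency of past outcomes:

```python
def frequentist_estimate(occurrences: int, trials: int) -> float:
    """Estimate the probability of an event as its past relative frequency."""
    return occurrences / trials

# The Sun has risen on 1.6 trillion out of the past 1.6 trillion days:
p_sunrise = frequentist_estimate(1_600_000_000_000, 1_600_000_000_000)
print(p_sunrise)  # 1.0 -- by this logic, tomorrow's sunrise is a certainty
```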

Frequentist statistics is often contrasted with Bayesian statistics, which offers a more nuanced prediction, based on prior knowledge about the conditions surrounding the event to be predicted. You can read a great introduction to Bayesian statistics here.
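For contrast, here is a toy Bayesian update (the numbers are invented purely for illustration): the posterior belief combines a prior with how likely the observed evidence is under each hypothesis.

```python
def bayes_posterior(prior: float, p_evidence_given_h: float,
                    p_evidence_given_not_h: float) -> float:
    """P(H|E) = P(E|H) * P(H) / P(E), expanding P(E) by total probability."""
    p_evidence = (p_evidence_given_h * prior
                  + p_evidence_given_not_h * (1 - prior))
    return p_evidence_given_h * prior / p_evidence

# A hypothesis we give a 10% prior, with evidence that is 90% likely if the
# hypothesis is true but only 5% likely if it is false:
print(round(bayes_posterior(0.1, 0.9, 0.05), 3))  # 0.667
```

Unlike the frequentist tally, the answer moves as the prior and the quality of the evidence change.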

This is why predictions about the future of humanity made over the past ~300 years have typically been very poor. The global population has been increasing exponentially since the turn of the 19th century; there are nearly 10 times as many people alive today as there were in 1800. The industrial and scientific revolutions of the latter half of the last millennium led to inventions that would have been unthinkable to the average person in the Middle Ages; telecommunications, air and space travel, modern medicine -- all of these would seem incredible, if not supernatural, to people less than 20 generations ago.

Computing has gone through a similar revolution over the past ~80 years. Inventions like the transistor, the integrated circuit, and the microprocessor have accelerated the advancement of computing hardware to a breakneck pace. Moore's Law observes that the number of transistors on a chip (a rough proxy for computing power) doubles roughly every two years. This trend has held since the 1960s, and may yet continue with advancements in quantum computing, highly multi-core processors, graphene batteries, and more. The average smartphone today can wield more computing power than all of NASA's combined computational resources in 1969. Imagine where we'll be in another 50 years.
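A doubling every two years compounds absurdly fast; a quick back-of-the-envelope sketch:

```python
def growth_factor(years: float, doubling_period: float = 2.0) -> float:
    """How much capability grows if it doubles every `doubling_period` years."""
    return 2 ** (years / doubling_period)

print(f"{growth_factor(10):,.0f}x")  # 32x in a decade
print(f"{growth_factor(50):,.0f}x")  # 33,554,432x in fifty years
```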

This is why predictions about computing are notoriously bad, and why they'll likely continue to be. We simply have no idea what will be possible in the future when, every decade, we have more than 10 times as much power at our disposal. Our brains are wired to expect that the future will be basically the same as the present, and reality often proves us wildly inept at forecasting.

"If I had asked people what they wanted, they would have said faster horses."

-- Henry Ford, famous horse hater (apocryphally)

Here are the 20 most impressively terrible predictions I could find about computing and technology in general (in no particular order, emphasis mine):


#1: 1864

trains have no future

"No one will pay good money to get from Berlin to Potsdam in one hour when he can ride his horse there in one day for free."

-- King William I of Prussia, famous train hater

#2: 1876

the telephone has no future

"This 'telephone' has too many shortcomings to be seriously considered as a means of communication."

-- Western Union internal memo

#3: 1898

x-rays are a hoax (?)

"Don't waste time on foolish ideas. Radio has no future, X-rays are clearly a hoax, and the aeroplane is scientifically impossible."

-- Lord Kelvin, President of the British Royal Society

#4: 1903

we'll have airplanes in... 10 million years

"Hence, if it requires, say, a thousand years to fit for easy flight a bird which started with rudimentary wings, or ten thousand for one which started with no wings at all and had to sprout them ab initio, it might be assumed that the flying machine which will really fly might be evolved by the combined and continuous efforts of mathematicians and mechanicians in from one million to ten million years..."

-- The New York Times, October 1903, just two months before the Wright brothers successfully flew their first plane at Kitty Hawk, North Carolina, USA, on 17 December 1903

#5: 1927

talkies are for nitwits, ya see?

"Who the hell wants to hear actors talk?"

-- Harry M. Warner, President of Warner Brothers Pictures

#6: 1946

we only need one computer per country

"...it is very possible that ... one machine would suffice to solve all the problems that are demanded of it from the whole country."

-- Charles Darwin, grandson of the famous naturalist, predicting a global market for, oh, about 100 computers

#7: 1949

computers are chonky

"Where a calculator like ENIAC today is equipped with 18,000 vacuum tubes and weighs 30 tons, computers in the future may have only 1000 vacuum tubes and perhaps weigh only 1½ tons."

-- Popular Mechanics, March 1949 issue

It's true. My laptop has almost no vacuum tubes at all.

#8: 1955

???

"Nuclear-powered vacuum cleaners will probably be a reality in 10 years."

-- Alex Lewyt, President of a (non-nuclear) vacuum manufacturer

#9: 1957

"I have traveled the length and breadth of this country and talked with the best people, and I can assure you that data processing is a fad that won't last out the year."

-- Editor, Prentice Hall

#10: 1961

"There is practically no chance communications space satellites will be used to provide better telephone, telegraph, television or radio service inside the United States."

-- T. A. M. Craven, Commissioner of the Federal Communications Commission (FCC)

#11: 1966

"Remote shopping, while entirely feasible, will flop."

-- TIME Magazine

TIME Magazine is now owned by billionaire Marc Benioff, whose books can be found for sale on the famously remote retailer, Amazon.com.

#12: 1968

predating Edwin Starr by 2 years

"But what... is it good for?"

-- An engineer at the Advanced Computing Systems Division of IBM, commenting on the microchip

#13: 1977

...absolutely nothing

"There is no reason for any individual to have a computer in his home."

-- Ken Olsen, founder of Digital Equipment Corporation (DEC), referring to home automation

#14: 1981

i.e. 0.05% of what Slack uses

"640k ought to be enough [memory] for anybody."

-- Bill Gates, Co-Founder of Microsoft (apocryphally)

#15: 1992

"...the idea of a wireless personal communicator in every pocket is 'a pipe dream driven by greed'."

-- Andrew Grove, CEO of Intel, quoted in The New York Times

#16: 1995

"The truth is no online database will replace your daily newspaper, no CD-ROM can take the place of a competent teacher and no computer network will change the way government works."

-- Clifford Stoll, in Newsweek

#17: 1996

aren't we all, though

"...Apple [is] a chaotic mess without a strategic vision and certainly no future."

-- TIME Magazine, February 1996 issue

#18: 1998

"The growth of the Internet will slow drastically, as the flaw in 'Metcalfe's law' -- which states that the number of potential connections in a network is proportional to the square of the number of participants -- becomes apparent: most people have nothing to say to each other! By 2005 or so, it will become clear that the Internet's impact on the economy has been no greater than the fax machine's."

-- Paul Krugman, Professor Emeritus at Princeton University
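The "Metcalfe's law" Krugman refers to counts potential pairwise connections: n participants can form n(n-1)/2 links, which grows with the square of n. A minimal sketch:

```python
def metcalfe_connections(participants: int) -> int:
    """Potential pairwise connections among n participants: n*(n-1)/2."""
    return participants * (participants - 1) // 2

print(metcalfe_connections(10))         # 45
print(metcalfe_connections(1_000_000))  # 499999500000 -- roughly n squared / 2
```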

#19: 2006

"Everyone's always asking me when Apple will come out with a cellphone. My answer is, 'Probably never.' ...It just ain't gonna happen."

-- David Pogue, tech columnist for The New York Times, writing less than 9 months before the first iPhone was released

#20: 2008

"Let's look at the facts. Nobody uses those things."

-- Steve Ballmer, CEO of Microsoft, on apps, predicting the failure of the iPhone

In 2007, Ballmer was also quoted as saying that the iPhone was unappealing because "it doesn't have a keyboard, which makes it not a very good email machine".



What predictions do you think we'll be laughing at in 10 or 20 years? Flying cars? Something about Bitcoin? Or the colonisation of Mars? Let me know in the comments!

If you enjoyed the above article, maybe you'd like to follow my work on Dev.To? Or look at the dog photos I retweet on Twitter? Or buy me a cup of coffee?

Whatever you do, thanks for stopping by!

Discussion

Jean-Michel Fayard 🇫🇷🇩🇪🇬🇧🇪🇸🇨🇴

All 20 quotes are good.

Though I would say they don't really give you the full picture of what happened.

In your storytelling, it seems that technology always wins! If that were the case, predicting the future would be really easy indeed.

But here is a techno-skeptical prediction that has held up really well over the last 30 years:

No Silver Bullet

There is no single development, in either technology or management technique, which by itself promises even one order-of-magnitude improvement within a decade in productivity, in reliability, in simplicity.

-- Essence and Accidents of Software Engineering

Frederick P. Brooks, Jr.
University of North Carolina at Chapel Hill

So what about another list of technological claims that flopped hard?

I think you would find a lot to say as well.

  • we will have flying cars in year 2000!
  • internet will bring democracy in China!
  • XML will solve all interoperability issues!
  • Rewriting Netscape from scratch will work out really well!
  • Theranos will be a new era of medicine!
  • Blockchain will disrupt entire industries and reinvent democracy!
  • WeWork will be a $100 billion tech company in real estate!
  • ...

Please complete the list 😀

Jason C. McDonald

Rewriting Netscape from scratch will work out really well!

It's called Firefox.

Jean-Michel Fayard 🇫🇷🇩🇪🇬🇧🇪🇸🇨🇴

Nope.

The rewrite of Netscape was called Mozilla, and it was a disaster, as explained in this classic article:

Things You Should Never Do: rewrite your software from scratch (Joel on Software)

Firefox came later and was the simplification of Mozilla.

Jason C. McDonald

So Firefox was the rewrite of Mozilla, the rewrite of Netscape. Meaning, by a separation of one iteration, Firefox is a rewrite-from-scratch of Netscape. And it turned out well. ;)

Thorsten Hirsch

Jean-Michel is right - it was a disaster for Netscape. It was the cause of the company's bankruptcy (that's what their ex-employees say themselves!). The Mozilla/Firefox source code was given away for free when the company was practically gone. And the software was way behind Internet Explorer in those days.

It took developers 2 or 3 years to make Mozilla/Firefox a decent product and it took several more years until one could say that it was a success. Netscape was long gone by then.

Jason C. McDonald

Hilarious!

Just one thought...

"640k ought to be enough [memory] for anybody."

Maybe it ought to be? It's actually appalling how much memory (not to mention processing power) is wasted in modern programs, purely from lazy coding practices and unnecessary UI "chrome" and effects.

Jonas Gierer

We can afford that though, since memory has become so cheap that even a few MB more or less don't really matter to anyone anymore.

My prediction (which I know may end up in a similar list in 20 years): RAM prices will fall and SSD speeds will rise so much that we can just put a couple of TB of flash memory in our computers to serve as both persistent storage and working memory. This would open up some interesting new possibilities for software too, basically eliminating all storage-related loading times (software and OS startup, loading screens in games, etc.)

Scott Simontis

Look up Intel Optane drives... basically enterprise-grade SSDs that are fast enough to work as a RAM cache (to my very limited understanding). I have one in my room I've been meaning to play with, but I lack the proper adapter to connect it to one of my rack-mount servers.

EDIT: Fixed product name

Michiel Hendriks

We can afford that though, since memory has become so cheap that even a few MB more or less don't really matter to anyone anymore.

You forget that managing memory costs CPU time.

High memory usage is often paired with a high number of dynamic (de)allocations. This causes memory fragmentation, and de-fragmenting memory costs even more effort.

That said, a high rate of dynamic (de)allocation is not necessarily bad. Or at least, not in the Java world. Just do not keep the data around for a long time.

Jason C. McDonald

That'd be nice, but I'll counter that it probably won't work out that way. As always, our lazy coding habits and unnecessary bells-and-whistles will take up all the available memory, even if there are terabytes of RAM available. Consider that a single tab in a web browser now takes up more memory than was available for the entire Apollo mission.

Also, I never trust flash/SSD for primary persistent storage. You can't recover data from it when it fails. This is why to this day, HDDs are still often used for persistent data where recovery is a necessary possibility; the recoverability is a side-effect of the physical characteristics.

Andrew (he/him) Author

Unfortunately, we'll always be limited -- to some extent -- by memory hierarchies and physical distances in the machine:

electronics.stackexchange.com/a/82...
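One way to see the physical limit (a rough sketch, assuming a ~3 GHz clock and signals travelling at light speed): a signal can cover only about 10 cm per clock cycle, so memory physically far from the CPU is necessarily slower to reach.

```python
SPEED_OF_LIGHT = 3.0e8  # metres per second, approximate

def cm_per_cycle(clock_hz: float) -> float:
    """Maximum distance a signal can travel in one clock cycle, in centimetres."""
    return SPEED_OF_LIGHT / clock_hz * 100

print(cm_per_cycle(3.0e9))  # 10.0 -- about 10 cm per cycle at 3 GHz
```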

Jason C. McDonald

Indeed. There's a hard physical limit, at least until someone cracks the code for making a consumer-friendly system that stores at the atomic level...and even that has its limits.

It makes me value even more the memory we have. The average computer has more memory and CPU power than the supercomputers of the 80s. Wasting it on poor coding practice and unnecessary graphical fireworks is such a shame! We could be funneling all that wasted memory into more useful things, making our computers do far more than they do now, and far more efficiently.

Andrew (he/him) Author

I need the shiny, Jason. I need it.

Ben Halpern

Very fun.

we only need one computer per country

Seems like the Cloud companies would all love to make this true in their favor 😬

Mauro Garcia

I was thinking something really similar 😄

Michiel Hendriks

#2 has come true. Who uses telephones anymore?

#17 was true. At that time, Apple really was worth very little as a company.

Thorsten Hirsch

Indeed - it would not be fair to laugh at #17, because Apple was a completely different company before the return of Jobs. You can hate him for many things, but you can't argue away that it was his sole achievement to turn around the fate of Apple when nobody believed in the company anymore. Who could have anticipated that?

Michiel Hendriks

To be fair, the Apple Jobs "saved" was the Apple Jobs helped create. Jobs needed this "time out" to get his priorities straight again. Although NeXT was not a success, Jobs did figure out again that you also need to ship products, and that you cannot wait until you have the perfect thing.
Apple failed more when axing Jobs, by also throwing out the culture that made Apple what it was back in the day. The board and CEOs only wanted to play it safe and not invest in new technology.
To me, it feels like Apple has been returning to the mid-90s version of itself. The main difference is that they are now huge.

Dan Silcox

Very entertaining 😂 just shows how little we actually know, even those who are very smart by educational standards... I for one can't wait to get a nuclear-powered hoover 😂