
Joshua Ballanco

Originally published at manhattanmetric.com

The Singularity is Coming

It is not uncommon for popular culture to pick up on a concept from science and twist it until it is nearly unrecognizable. The concept of the "technological singularity" is no exception. Still, when the singularity is talked about as "the machines taking over" or the point at which we all "upload our consciousness to the cloud", we have strayed so far from the term's original meaning that it is worth revisiting what the term was meant to convey.

Since humans first picked up two stones and struck them against each other to create a tool, we have worked to advance technology. An interesting thing about technology, though, is that each advance typically relies on those that came before it. A blacksmith can craft all manner of useful implements, but only because they have a hammer, anvil, forge, and bellows, all of which had to have been created by someone else using technology.

But it's not just the case that each technological advance relies on those that came before it. Rather, it seems that the time needed to arrive at each new technological advance is shortened by earlier advances and that, in turn, each new advance further shortens the time to the next big leap. Charles Babbage designed what is often considered the first computer, his Analytical Engine, in 1837. Unfortunately, the technology available to him at the time was not sufficient to build it. Following advances in generating and harnessing electricity, the first fully electronic computers were developed in the 1940s, but they remained bulky and slow.

Once these first computers were built, however, their promise sped up the advancement of transistor technology, which allowed computers to shrink to the size of a couple of refrigerators. Computers running on transistors then became vital in the development of the integrated circuit, which allowed computers to shrink further: to the size of a desktop appliance, something you could put in your briefcase, or, eventually, something you could wear on your wrist.

Of course, this trend of one technology shortening the time to the arrival of the next is not a smooth process without any bumps or detours. Rather famously, when Apple announced they had purchased a Cray supercomputer to aid in the design of the next Macintosh, Seymour Cray remarked that he thought it was odd that Apple bought a Cray to design Macs, because he was using Macs to design Crays. Still, the acceleration of technology is undeniable, and it was this observation that led John von Neumann to use the term "singularity" to describe where he believed this was all heading.

If you are not familiar, "singularity" is a term from mathematics that describes a place where "the math doesn't math". Take, for example, the function 1/x. If x is positive, then as it gets smaller 1/x gets larger. If x is negative, then as it gets larger 1/x gets smaller. These two trends collide at x = 0 where it is impossible to describe what 1/x means...because this is a singularity. So when von Neumann described the "technological singularity", he was pointing out that if each new technological advance shortens the time to the next technological advance, then at some point the "time to the next technological advance" would approach zero. He did not make any predictions about what this would mean, only that "the math doesn't math" when we reach that point.
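
To make the analogy concrete, here are the two one-sided limits of 1/x written out explicitly, showing how the function blows up in opposite directions at x = 0:

```latex
\lim_{x \to 0^{+}} \frac{1}{x} = +\infty,
\qquad
\lim_{x \to 0^{-}} \frac{1}{x} = -\infty
```

The accelerating-advances argument can be sketched the same way, as a toy model (mine, not anything von Neumann wrote down): suppose the first advance takes time T, and each advance after it takes a fraction r < 1 of the time of the one before. Then the total time for infinitely many advances is a convergent geometric series:

```latex
T + rT + r^{2}T + r^{3}T + \cdots = \frac{T}{1 - r}
```

Infinitely many advances, all packed into the finite span T/(1 − r), and the model has nothing to say about what happens afterward. That is precisely the sense in which "the math doesn't math".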


In my time at Apple, there was a Keynote slide that repeatedly made an appearance in internal presentations. It was the "Timeline of Apple" slide, which plotted major advances in Apple's history along a long horizontal line: it started with the Apple I, the Apple II, the Macintosh, and so on. When I first joined Apple, the far right side of the line ended at "Intel Transition". By the time I left Apple, that slot was occupied by "iPhone", and within a year of my leaving (though I was no longer privy to the slide) I'm sure it would have said "Apple Watch".

Before the Apple Watch was released, but after its arrival was widely expected, I found myself walking and thinking about the history of computers. From the ENIAC that occupied an entire room, to the IBM 360 that required a handful of cabinets, to the Macintosh that could sit on a desk, the MacBook that could be carried under your arm, and the iPhone that fit in your pocket, miniaturization seemed to be the rule. But much like von Neumann could not see past the point of "zero time to the next technological advance", I could not see past miniaturization to the level of a wristwatch. What would a smaller computer look like? Would it even be useful?

It was then, as I was walking, that I had something of an epiphany. I had been thinking about the advance of computers all wrong! It wasn't about size.

It was about availability.

In the days of ENIAC, if you wanted to use a computer you would have had to walk to the building housing it, and there were only a handful in the world. If you wanted to use your company's IBM 360, you would have had to walk down the hall to the room where it was set up, and each company might only own a small handful. By the time we reach the desktop computer, using one only requires sitting at your desk. A laptop you can carry in a backpack. Your phone you still have to remember to put in your pocket, but a wristwatch can be with you throughout the day without much thought at all.

With the problem framed this way, the path forward was clear to me. What if you didn't have to pull out your phone to get directions to the last location you looked up, but instead your car could show them to you as soon as you sat in the driver's seat? What if the shopping cart at your local grocery store could display the contents of the shopping list you wrote down at home? What if, instead of a long list of all the arrivals and departures, the screens in an airport terminal could recognize your face and show you your exact gate and departure time?

I cannot yet claim to know what will happen when we reach the technological singularity. I only know that all the science fiction tropes of the Skynet or Matrix variety are not likely to pan out. I suspect that the reality might look a lot closer to the world of Questionable Content, a long-running web comic wherein the human characters live and interact with AI-powered androids and...mostly deal with typical relationship drama and the minutiae of day-to-day life.

What I do know, and what von Neumann knew, is that there is no point in fighting the coming singularity. It is not here yet, but, as with all exponential curves, it is closer now than it has ever been.
