Connor Anastasio


A History of Computing: What led up to AI?

"I believe that at the end of the century the use of words and general educated opinion will have altered so much that one will be able to speak of machines thinking without expecting to be contradicted." - Alan Turing, 1947

The recent, rapid pace of technological advancement has been both unprecedented and seemingly endless. Cloud computing has forever changed the way we conduct business. Augmented reality has impacted the way we shop, learn, and interact with the world around us. Smaller, faster, and more efficient CPUs have made trivial many calculations that would’ve been deemed impossible to compute even 20 years ago. And yet, Artificial Intelligence (AI) has managed to distance itself from these seemingly similar breakthroughs; without question, it is the biggest technological advancement since the invention of the digital computer itself. It is the culmination of thousands of years of scientific discovery and innovation by some of the world’s most intelligent dreamers. It is worth taking a moment to reflect on how we got here.

Origins and the Birth of the Analog Computer

AI would obviously not be possible without the invention of computers. With a history dating back at least 5,000 years, the oldest computer we know of is the abacus, a simple tool used for counting and basic arithmetic. Many civilizations had their own version of the abacus, but the principle was generally the same: small stones or beads were threaded onto rods so they could be slid around freely, and several rods were stacked together to allow working with larger numbers. It sounds like little more than a counting tool, but ancient Mesopotamia and China are known to have used it for addition, subtraction, multiplication, and even division. While it was pivotal to the development of society, it was completely manually powered and only capable of storing the information that was given to it.

The first “true” technological advancement for computing was the invention of the analog computer.
An analog computer is a mechanical device built to perform a single, specified task using wheels, cogs, levers, or slides. Analog computers differ greatly from modern computers in that (as one could probably guess from the name) they are not digital. Because they do not represent quantities as 1s and 0s, they have no floating-point rounding errors to worry about. They also have no “computing time” for executing their task, as the calculation is physically part of the machine’s construction: it is “calculating” its answer as the user sets its configuration. Freed from the limitations of digital representation, their accuracy is in principle bounded only by how precisely they can be built. The engineering and ingenuity that went into the most sophisticated analog computers, such as the Antikythera Mechanism, is almost incomprehensible.
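As a quick aside, the rounding error mentioned above is easy to see for yourself. The snippet below is just a minimal Python illustration of binary floating point, not anything specific to the machines discussed here:

```python
# Digital computers store most fractions in binary floating point,
# so some decimal values can only be approximated.
a = 0.1 + 0.2
print(a)             # 0.30000000000000004
print(a == 0.3)      # False: the binary approximations don't add up exactly
print(abs(a - 0.3))  # the error is tiny (~5.5e-17) but real
```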

The Antikythera Mechanism shown next to its fully functional reproduction

The Antikythera Mechanism is the oldest surviving example of an analog computer, and one of the most remarkable. Constructed by the Greeks roughly 2,150 years ago for the purpose of learning more about the heavens, it is a welcome reminder of just how clever our ancestors were. A complex configuration of at least 30 interlocking cogs and gears (with reconstructions suggesting the complete device had more) was meticulously laid out to achieve something both extremely helpful and extremely easy to use. All the user had to do was set it to a date in the future, and they would instantly be provided with information on astronomical phenomena such as the positions of the sun, moon, and planets. It was so sophisticated that it could even predict solar and lunar eclipses. Its predictions, however, were only as good as its builders’ understanding of the motions of those celestial bodies, so it did not give accurate data when set too far into the future. Research into its design has shown that, had it been constructed with correct information about planetary orbits and movement in space, it would have been accurate to within about 1 degree every 500 years.

Analog computers dominated society for the better part of the next two millennia. As useful as they were, they still had several fundamental problems: they could not be repurposed for other tasks, could not store information from previous uses, and were reliant on the accuracy of their construction to provide correct results. Inevitably, a solution to these issues was developed in the 1800s.

The Father of Computers

Charles Babbage was a prominent 19th-century mathematician, inventor, and visionary whose contributions laid the groundwork for all of modern computing. His inventions and ideas revolutionized the concept of computation, paving the way for the development of modern-day computers. Born on December 26, 1791 in London, Babbage displayed an early aptitude for mathematics and mechanics. He attended Cambridge University, where he earned a reputation for his love of building machines. Despite his many contributions to cryptography and number theory, he is best known for two of his inventions: the Difference Engine and the Analytical Engine.

Designed in the early 19th century, the Difference Engine was built to compute mathematical tables automatically, eliminating the human errors that plagued hand-calculated tables. It used a series of gears and wheels to tabulate polynomial functions by the method of finite differences, which reduces the entire calculation to repeated addition. Babbage spent the majority of the 1820s and 1830s developing and improving the Difference Engine, even receiving funding from the British government to continue his work.
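To make the idea concrete: the trick the Difference Engine mechanized is that the n-th differences of a degree-n polynomial are constant, so a whole table of values can be cranked out using nothing but repeated addition. Here is a rough Python sketch of that arithmetic (the method itself, not a model of Babbage’s machinery):

```python
# Tabulate p(x) = 2x^2 + 3x + 5 for x = 0, 1, 2, ... using only addition,
# the way a difference engine does.

def tabulate(initial_values, steps):
    """initial_values = [p(0), first difference, second difference, ...]"""
    values = list(initial_values)
    table = []
    for _ in range(steps):
        table.append(values[0])
        # Update each column by adding in the column to its right.
        for i in range(len(values) - 1):
            values[i] += values[i + 1]
    return table

# For p(x) = 2x^2 + 3x + 5:
#   p(0) = 5, first difference p(1) - p(0) = 5, second difference = 4 (constant).
print(tabulate([5, 5, 4], 6))  # [5, 10, 19, 32, 49, 70]
```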

Ever ambitious, Babbage dreamed of more than simply performing addition and subtraction quickly. He envisioned a much more sophisticated machine, one capable of making his Difference Engine completely obsolete. His new project, the Analytical Engine, was unfathomably ambitious for its time. A steam-powered, reprogrammable computer, it had many of the same features we’d expect to find in a computer today: an arithmetic logic unit, control flow in the form of conditional branching and loops, and integrated memory. As described by Babbage himself, it was a “machine capable of eating its own tail”: a device able to modify its given instructions even in the middle of performing a task. Its scale and complexity were far ahead of the rest of the industry, and the actual construction proved too big a hurdle.
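To see why conditional branching and loops were such a leap, here is a toy stored-program machine sketched in Python. The instruction set is purely illustrative (it is not Babbage’s actual design), but it can express a loop, something a fixed-function machine like the Difference Engine never could:

```python
# A toy stored-program machine: a few named registers, a program counter,
# and an instruction that can jump backwards -- i.e. loops and branches.
# (Illustrative only; the Analytical Engine's real instruction format differed.)

def run(program, registers):
    pc = 0  # program counter
    while pc < len(program):
        op, *args = program[pc]
        if op == "add":                     # registers[a] += registers[b]
            registers[args[0]] += registers[args[1]]
        elif op == "dec":                   # registers[a] -= 1
            registers[args[0]] -= 1
        elif op == "jump_if_nonzero":       # conditional branch
            if registers[args[0]] != 0:
                pc = args[1]
                continue
        pc += 1
    return registers

# Multiply r0 by r1 using repeated addition; the result ends up in r2.
program = [
    ("add", "r2", "r0"),           # r2 += r0
    ("dec", "r1"),                 # r1 -= 1
    ("jump_if_nonzero", "r1", 0),  # loop back to the start while r1 != 0
]
print(run(program, {"r0": 6, "r1": 7, "r2": 0}))  # r2 == 42
```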

Babbage's 'Trial Model' of the Analytical Engine

Believing the British government wasn’t providing adequate support, he gave lectures on the device throughout Europe in the hopes of finding better funding. Ultimately, nothing came to fruition and the Analytical Engine was never built. Thankfully, Babbage had recorded the entire construction process, build specifications, operational capabilities, and even a user manual in painstaking detail. This has allowed later scientists to confirm that yes, unbelievably, the Analytical Engine would have worked as intended. He had somehow designed the world’s first general-purpose computer without ever building it. Many future scientists would look back on Babbage’s overambitious and seemingly impossible goals with fondness; most notably, Alan Turing.

Innovation to End All Innovation

Alan Turing is a man whose name should be as synonymous with modern computing as Bill Gates’. To put it bluntly, the foundations Turing laid helped put a man on the moon before 1970, bringing the final chapter before the Modern Era of Computing to a close.

An immensely talented mathematician, he is arguably the single person responsible for saving the most lives during WWII. During the war he designed the “Bombe”, an electromechanical machine used to crack the German Enigma code, potentially saving millions of lives by shortening the war by an estimated two to four years. After the war ended, his appetite for computing was insatiable. Turing’s seminal work had introduced the concept of a “universal machine”, a single device capable of carrying out any computation that could be written down as a set of instructions, and his later writing asked openly whether machines could think. John McCarthy coined the term “artificial intelligence” and organized the Dartmouth Conference in 1956, a significant event that marked the formal birth of AI as an academic discipline.
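For readers who have never seen one, the machine Turing described is startlingly simple: a tape, a read/write head, a current state, and a table of rules. The Python sketch below simulates one tiny, specific machine (not the universal machine itself, whose rule table would encode other machines), just to show how little machinery the concept requires:

```python
# A minimal Turing machine: tape, head, state, and a rule table.
# This particular machine inverts a binary string (0 <-> 1) and then halts.

def turing_machine(tape, rules, state="start", blank="_"):
    tape = list(tape)
    head = 0
    while state != "halt":
        symbol = tape[head] if 0 <= head < len(tape) else blank
        new_symbol, move, state = rules[(state, symbol)]
        if 0 <= head < len(tape):
            tape[head] = new_symbol
        head += 1 if move == "R" else -1
    return "".join(tape)

# Rule table: flip each bit and move right; halt when a blank is reached.
rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(turing_machine("110100", rules))  # -> 001011
```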

By the 1950s Turing, McCarthy, and several others had improved computing technology to the point where they found themselves able to ask:
“Is it possible to create machines that work through problems like humans do?”

The question vexed many of our smartest minds. Nearly 70 years later, we can definitively say that yes, it is possible. We may even have underestimated just how possible it is. AI is only in its infancy, and it seems as if the sky is truly the limit. Computing may have had an interesting past, but its future will be very enjoyable to experience.

Sources:

https://blogs.bodleian.ox.ac.uk/adalovelace/2018/07/26/ada-lovelace-and-the-analytical-engine/

https://www.kythera-family.net/en/history/artefacts/the-antikythera-mechanism-on-display-at-the-nicholson-museum

https://en.wikipedia.org/wiki/Alan_Turing
