Originally published on Medium - 06-20-2018
In 1837, the first computer was theorized. Yes, nearly two centuries ago the Analytical Engine, conceived by inventor and scientist Charles Babbage, laid out the instructions for the first physical model of a computer. But Babbage was only scratching the surface. He had intended for the device to be able to handle calculations. His earlier Difference Engine, a device meant to calculate polynomials and print the results, had taken too long to build: by 1832 the project had been shelved after a dispute with the project’s engineer, and the government pulled its funding in 1834. Having lost interest in the old design, and fueled to build a better machine, Babbage began designing the Analytical Engine.
The Analytical Engine
This device was a major step forward. The ability to make arithmetic calculations and implement logic and control flow with onboard memory was revolutionary. The design had a place for numbers and results to be stored and a separate place for arithmetic processing. It was also the first example of punch cards being used to program a “computer”. But it had one major design flaw: it was massive. The Analytical Engine was so far ahead of its time that it never came to fruition during Babbage’s lifetime.
While the Analytical Engine was never completed, Babbage is widely credited as one of the first computer scientists. Ada Lovelace, a writer, mathematician, and friend of Babbage’s, even wrote the first algorithm intended to run on the machine, and she is considered the first computer programmer. Unlike Babbage, she saw the potential for devices like his to do more than just calculate, and focused on examining the relationship between technology and humans. Can you imagine what she might think of Uber?
These questions lay mostly dormant until the 20th century when Alan Turing theorized what came to be known as a Turing machine.
A Turing machine is a hypothetical machine that can simulate any algorithm. It consists of a read/write head positioned over an infinitely long tape. The tape is divided into cells, each containing a “0”, a “1”, or a blank. The machine operates by reading the cell under the head, writing one of these three symbols, and then moving along the tape. By following a given set of instructions, it can execute any computer algorithm. Turing, building on the work of his doctoral advisor Alonzo Church, conjectured that anything that can be computed at all can be computed by a Turing machine, a claim now known as the Church–Turing thesis.
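To make the tape-and-head idea concrete, here is a minimal sketch of a Turing machine simulator in Python. All the names here (`run_turing_machine`, the rule format, the example machine) are my own illustrative choices, not anything from Turing’s paper:

```python
def run_turing_machine(rules, start_state, halt_state, tape=None, max_steps=1000):
    """Simulate a Turing machine.

    rules maps (state, symbol) -> (symbol_to_write, move, next_state),
    where move is -1 (step left) or +1 (step right).
    The tape is a dict from position to symbol; unwritten cells are blank.
    """
    tape = dict(tape or {})
    state, pos = start_state, 0
    for _ in range(max_steps):
        if state == halt_state:
            break
        symbol = tape.get(pos, " ")          # read the cell under the head
        write, move, state = rules[(state, symbol)]
        tape[pos] = write                    # write one of the three symbols
        pos += move                          # move the head along the tape
    return tape

# Example: a tiny machine that writes "1", "0", "1" and halts.
rules = {
    ("A", " "): ("1", +1, "B"),
    ("B", " "): ("0", +1, "C"),
    ("C", " "): ("1", +1, "HALT"),
}
tape = run_turing_machine(rules, "A", "HALT")
print("".join(tape[i] for i in sorted(tape)))  # prints "101"
```

The table of rules is the “set of given instructions”: the machine’s entire behavior is determined by what it reads in the current cell and what state it is in, which is exactly why the model is so easy to reason about.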
Now, given the wealth of technology available to us, this doesn’t seem like that big of a deal. Yet in the 1930s, this idea was groundbreaking. It literally set up the framework for you to have apps on your phone, music on your iPod, and downloadable memes. Storing programs on a computer became a reality in part because of the Turing machine. In fact, machines that meet the requirements of a Turing machine are said to be Turing complete.
This model, besides proposing that the limits of computation are exactly the Turing machine’s abilities, forms the basis of all widely used programming languages in existence. Sure, some computers are more elegant and perform faster, but the underlying property of Turing completeness is what defines the computational model.
Turing did not stop there. He developed the Turing test, which asks whether a computer can be indistinguishable from a human being in conversation. This, in part, launched the field we’ve come to know as Artificial Intelligence.
CAPTCHA, the online test that prevents bots from accessing sites, runs on the idea of a reverse Turing test: instead of a human judging a machine, the computer judges a human who is attempting to prove they are human. It’s an acronym for “Completely Automated Public Turing test to tell Computers and Humans Apart”.
John von Neumann, a mathematician and computer scientist, described the internals of a computer in which data and programs are stored in the same memory, an arrangement now known as the von Neumann architecture. Here, the idea of the modern computer had come full circle.
These four are known as some of the first to design and implement the computational model that forms the basis of the systems we use today. Not long after, FORTRAN, the first high-level language to gain widespread use, became popular, and the foundation paved by these pioneers became a reality. From Plankalkül to Malbolge, from ALGOL to Ruby, from Autocode to R, these concepts defined the ability to program.
[Image: “HELLO WORLD” in Malbolge (named after the 8th circle of Hell in Dante’s Inferno)]