DEV Community

NotMe

The Deep Magic of Computers Pt.1 - Introduction

"The Deep Magic of Computers" is a 10-part series covering computing from its earliest forms to what we currently recognize as a computer. This introduction starts with computational history, and by the end of the series we'll be talking about how applications are made. The parts build on one another, but each is self-contained. No prior understanding is necessary.

"Any sufficiently advanced technology is indistinguishable from magic." – Arthur C. Clarke

The Great Magic

Computers have this magical quality to them. Video games are a great example. You get to play as a character inside a world that doesn't exist, yet you become attached to the characters, story, and artwork as if the world were your own.

What about the internet? Information travels at the speed of light across the globe, connecting the entire world within milliseconds.

Or how about the routers the internet runs on? They have to shuttle all of that data to the right destinations.

The fact is, computers seem like some kind of magic trick. We expect them to just work, and when they don't, it frustrates us.

Some are content with simply using the magic that has been bestowed upon us by the great wizards of time, but others of us want to become great wizards ourselves.

I know I do.

The idea of bending the magic of computers at will enthralls me. The crazy part is that anyone can do it, because while it may seem like magic, it's really just one amazing feat of engineering and science after another.

I want to pull back the layers of each amazing feat to understand how the magic works. I want to understand how electrical signals can give us entire worlds we can build as game developers, or how those electrical signals become information we consume as YouTube videos.

To do this, I'm putting together a quest to find out how the magic actually works. Once we know, we will be on our way to bending the magic ourselves, onto the path of becoming a great wizard!

From my research, I've been able to pull the magic apart into ten layers.

  1. Physical
  2. Transistors
  3. Digital Logic
  4. Micro-architecture
  5. Machine Language
  6. Assembly Language
  7. System Software
  8. Middleware
  9. Programming Languages
  10. Applications

Each level has its own magic that builds on the last, yet each exists on its own.

It's when we layer them in a particular way that we get the magical machine itself.

Understanding each layer will allow us to freely choose what kind of wizard we want to become, and how to bend that magic to our will.

Origins of Computation

We can trace the magic back to the origins of computation, when we used the abacus1 to calculate all sorts of things.

We also came up with the Slide Rule for multiplication and division.

The idea with these and many other inventions was to create a tool that would make calculating efficient.

Generally, the people who did these calculations were called computers; "computer" was originally a job title.

The title of computer shifted from people to machines with the advent of devices such as Gottfried Leibniz's2 "Step Reckoner", a machine that mechanically added, subtracted, multiplied, and divided numbers.

Leibniz's design went on to be the predominant design for calculators for a few centuries.

For those who could not afford anything akin to a Step Reckoner, pre-computed tables were the way to go.

Pre-computed tables were sheets of paper that had a table of pre-computed results for a given problem set.

Instead of working out 11 * 88 on your own, a pre-computed table could tell you the answer at a glance.

These were generated by human computers (usually mathematicians) and were also known as "look-up" tables.
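Look-up tables are still everywhere in programming today. A minimal Python sketch of the same idea (the table's range here is just illustrative):

```python
# A pre-computed multiplication table, built once up front
# and then consulted instantly, just like the paper versions.
table = {(a, b): a * b for a in range(1, 101) for b in range(1, 101)}

# Looking up 11 * 88 is now a single dictionary read, no arithmetic needed.
print(table[(11, 88)])  # 968
```

The trade-off is the same one the mathematicians faced: the work of computing every entry is paid once, so that each later query costs almost nothing.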

The First Wizard

Charles Babbage was an English polymath, well-versed in math, philosophy, inventing, and mechanical engineering. He is credited as the "father of computers" for inventing the first mechanical computer, the Difference Engine.

The Difference Engine was capable of tabulating polynomial functions, which allowed for the automatic calculation of the logarithmic and trigonometric functions that were a pain to calculate by hand.3
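The trick behind the Difference Engine is the "method of finite differences": given a polynomial's starting value and its successive differences, every later value falls out of pure addition, with no multiplication required. A rough Python sketch of the idea (a simplification, not Babbage's actual mechanism):

```python
def difference_engine(initial, steps):
    """Tabulate a polynomial using only addition.

    `initial` holds the polynomial's value at x=0 followed by its
    finite differences; for f(x) = x^2 these are [0, 1, 2], because
    f(0)=0, f(1)-f(0)=1, and the second difference is a constant 2.
    """
    regs = list(initial)
    values = []
    for _ in range(steps):
        values.append(regs[0])
        # Each register absorbs the one below it -- additions only.
        for i in range(len(regs) - 1):
            regs[i] += regs[i + 1]
    return values

print(difference_engine([0, 1, 2], 6))  # squares: [0, 1, 4, 9, 16, 25]
```

Addition is easy to do with gears and levers, which is exactly why this method suited a mechanical machine.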

Inspired by his work on the Difference Engine, Babbage came up with the Analytical Engine. Unlike every computational device before it, the Analytical Engine was a "general-purpose computer."

It could be used for many different types of computation instead of specific ones. It could be fed data and run operations on that data, and it had memory and even a primitive printer.

The idea that an automatic computer could guide itself through a series of operations was far ahead of its time, foreshadowing modern computers.

Ada Lovelace, an English mathematician credited as the "first computer programmer", wrote hypothetical programs for the Analytical Engine.

"A new, a vast, and a powerful language is developed for the future use of analysis." – Ada Lovelace

Arguably, the Analytical Engine would go on to inspire the first generation of computer scientists.

From Mechanical to Electrical

The Analytical Engine was a marvel, and had it been built to completion, it would have been considered digital, programmable, and Turing-complete4. However, it would have been extremely slow.

"Mr. Babbage believes he can, by his engine, form the product of two numbers, each containing twenty figures, in three minutes." – Luigi Federico Menabrea

By comparison, the Harvard Mark I could perform the same task in six seconds, and modern computers can do it within a billionth of a second.

The Harvard Mark I was an electro-mechanical computer proposed by Howard Aiken in 1937 and put to work during the war effort in World War II5.

At its core, the Harvard Mark I was composed of relays (electrically controlled mechanical switches) that would open or close a circuit to stop or allow the flow of electricity.

A good way to think of it is like a water faucet: open the handle to let water flow, close it to shut it off. Relays do the same with electrical current instead of water.
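In code, a relay is just a controlled switch: whether current gets through depends on whether the control line is energized. A toy Python model (the names here are mine, not Aiken's):

```python
def relay(control, power=True):
    """A normally-open relay: current flows only while the coil is energized."""
    return power and control

# Two relays wired in series behave like a logical AND:
# electricity reaches the end only if both switches are closed.
def series(a, b):
    return relay(b, power=relay(a))

print(series(True, True))   # True  -- current flows
print(series(True, False))  # False -- circuit broken
```

Wiring switches together like this is the seed of digital logic, which we'll get to in part 3.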

These types of computers were a great leap forward from the Analytical Engine, but they had their own problems.

Firstly, the mechanical arm of a relay has mass and can't move instantly; a relay could only switch about 50 times per second. So while the Mark I far outpaced the Analytical Engine, it still could not be used to solve large, complex problems.

Secondly, there was a lot of wear and tear on these kinds of machines. The Harvard Mark I had roughly 3,500 relays, and the more moving pieces, the more likely something was to break down.

Eventually, relays were replaced by vacuum tubes, which operated like relays but had no moving parts. In addition to solving the wear-and-tear problem, they also switched much faster: roughly 1,000 times per second.

Vacuum tubes would go on to power telephones, radios, and many other electronic devices. They were used for more than half a century, but they did have their faults.

Firstly, they were incredibly fragile; it didn't take much to break them.

Secondly, like light bulbs, they could burn out and often did.

Thirdly, they were incredibly expensive.

One of the first computers to use vacuum tubes was the Colossus Mark I, developed by Tommy Flowers, which went on to help decrypt Nazi communications during World War II6.

It was composed of 1,600 vacuum tubes and is regarded as the first programmable electronic computer. Programming was done by plugging hundreds of wires into a switchboard, similar to old-school telephone switchboards, which allowed its operators to set the computer up to perform the correct operations.

The downside of Colossus was that, once programmed, it would only perform that specific calculation and had to be reprogrammed to do anything else.

Enter the Electronic Numerical Integrator and Computer (ENIAC), the first truly general-purpose, electronic, programmable computer. It could perform about 5,000 operations per second and was operational for ten years.

Towards the Future

By the late 1950s, vacuum-tube computing was reaching its end. Tubes burned out or broke so often that ENIAC was generally only usable for about half of every day, while the calculations it ran could take weeks to complete.

In 1947, Bell Labs scientists John Bardeen, Walter Brattain, and William Shockley took the idea of the vacuum tube to a whole new level. They sought to reduce its cost and size while increasing its reliability and speed. Hence the transistor was born: a whole new type of switch that relies on quantum mechanics.

In the next part of the series, we'll explore how transistors came to be and the materials used to create them!



  1. An abacus is a hand-operated calculating tool that uses beads to represent numbers for basic arithmetic. 

  2. Fun fact: Leibniz was a mathematician-philosopher who strongly advocated for the binary system, since it was ideal for machines. Even so, the Step Reckoner used base 10 instead of base 2. 

  3. Babbage never saw the Difference Engine built as he drew it; it required too many parts and, as a result, was entirely too expensive to produce. In 1985, a museum curator found the drawings and decided it had become possible to build one. Using Babbage's original drawings and years of work, scientists completed a working Difference Engine in 1991. The machine contains 8,000 parts and weighs about 5 tons. 

  4. In computer science, the term "Turing-complete" refers to machines that can simulate any Turing machine. A Turing machine is an abstract device with an endless piece of tape and a read/write head that reads and writes the marks on the tape, usually zeroes and ones, following a set of rules. Anything that can do this (programming languages, for example) is considered Turing-complete. 

  5. One of the first programs to run on the Mark I was initiated by John von Neumann. While working on the Manhattan Project, von Neumann needed to calculate whether implosion was a viable way to detonate the atomic bomb. 

  6. Didn't Alan Turing create the first computer? Technically, no. Turing did design a codebreaking device two years prior, but it was an electro-mechanical machine that didn't quite qualify as a computer. 
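The Turing machine described in footnote 4 can be made concrete in a few lines. A minimal Python model (the rule format is my own simplification of the textbook definition):

```python
def run_turing_machine(rules, tape, state="start", max_steps=1000):
    """Run a tiny Turing machine.

    `rules` maps (state, symbol) to (symbol_to_write, head_move, new_state).
    Cells the head has never visited read as None (a blank).
    The machine stops when it reaches the state "halt".
    """
    cells = dict(enumerate(tape))
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, None)
        write, move, state = rules[(state, symbol)]
        if write is not None:
            cells[head] = write
        head += move
    return [cells[i] for i in sorted(cells)]

# A machine that flips every bit, then halts at the first blank cell.
flip = {
    ("start", 0): (1, 1, "start"),
    ("start", 1): (0, 1, "start"),
    ("start", None): (None, 0, "halt"),
}
print(run_turing_machine(flip, [1, 0, 1, 1]))  # [0, 1, 0, 0]
```

Despite how simple this looks, the model captures everything any computer in this series can do, which is exactly why "can it simulate a Turing machine?" became the benchmark for computational power.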
