João Eurico "Bokomoko" Lima

A really very much indeed stupid machine (but a fast one)

Have you ever considered that, for most practical purposes, most human beings on the face of the Earth today have almost instantaneous access to a computer? Whether it's a proper standard computer with VKM (video/keyboard/mouse) or a smartphone, tablet, or touchscreen-equipped notebook, computers are ubiquitous. Have you ever wondered how these computers work?

Here's the thing: as "smart" as a computing device looks, it is actually really dumb. But a damned fast one.

Let's imagine a very simple machine that only does very simple things, following a very small set of instructions. Very basic things, like:

0 - following instructions one at a time, sequentially
1 - adding two integer numbers
2 - inverting the sign of an integer number
3 - comparing a number to zero
4 - storing the result of an operation in a safe place (in memory)
5 - reading previously stored data (from memory)
6 - jumping to a certain instruction
7 - jumping to a certain instruction only if the last comparison to zero was true; continuing normally if it wasn't
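
If it helps to see these primitives spelled out, here is a minimal sketch in Python (the language and every name in it are my choices; the toy machine itself ships with no language at all):

```python
# A toy rendition of the instruction set. Memory is just a dictionary
# of integers; each primitive is a tiny function.
memory = {}

def add(a, b):              # 1 - add two integers
    return a + b

def negate(a):              # 2 - invert the sign of an integer
    return -a

def is_zero(a):             # 3 - compare a number to zero
    return a == 0

def store(address, value):  # 4 - store a result in memory
    memory[address] = value

def load(address):          # 5 - read previously stored data
    return memory[address]

# Instructions 0, 6 and 7 (sequential execution and the two kinds of
# jump) are control flow; Python's statements and loops will stand in
# for them in the sketches below.

store("x", 7)               # put 7 in memory slot "x"
print(load("x"))            # 7 - read it back (instruction 5)
```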

These instructions don't look very hard to follow, do they? But that's all one needs to do everything a modern computer does. Actually, there are even smaller instruction sets that are just as capable. But how? How can these basic instructions play videos, process payments, book flights, connect people all over the world, even reason much like a physician and give the correct diagnosis 99 times out of 100?

Just by doing these things extremely fast. Real fracking awesome stupid unthinkable fast.

All you have to do is provide the computer with the correct sequence of instructions. Or ... as we nerds like to say ... program the computer.

Hmmm, not convinced yet? Let´s take a small inductive pass and then you can extrapolate.

First, let's think of a simple yet extremely useful task: subtracting two integer numbers! Hey! You didn't even notice that our computer is so dumb it doesn't know how to subtract! It knows how to add, and how to change the sign of a number, turning a positive number into a negative one and vice-versa. Aha! Now we can subtract! How? Let's program the computer to do so, using only the list of instructions above:

```
retrieve from memory the number we want to subtract from   (inst 5)
retrieve the number we want to subtract                    (inst 5)
invert the sign of the second number, making it negative   (inst 2)
add the two numbers                                        (inst 1)
store the result of the operation in memory                (inst 4)
```

That's it ... basically. I've left out the "input" (how did the numbers get into memory in the first place?) and the "output" (how will the humans read the result?) for didactic purposes.

What just happened is that we wrote a small program for our very dumb theoretical computer, and now we can add subtraction to its skills. Let's call it, for simplicity's sake, instruction 8: subtract.
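
Here is the same program as a Python sketch (the function names are mine): subtraction is just a sign flip followed by an add.

```python
def add(a, b):         # instruction 1
    return a + b

def negate(a):         # instruction 2
    return -a

def subtract(a, b):    # our brand-new instruction 8
    return add(a, negate(b))

print(subtract(9, 4))  # 5
```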

Now the computer not only knows how to add and subtract, it also knows how to count. How can it do that? Let's follow the "program":

```
      retrieve the number to count down from memory   (inst 5)
1st:  subtract 1 from it                              (inst 8)
      compare the result to zero                      (inst 3)
      if it's zero, jump to the instruction X:        (inst 7)
      jump back to the instruction 1st:               (inst 6)
X:    we've finished counting
```

What we've got is a "loop": a way to repeat instructions many times, just by counting down how many times it still has to be done.
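
Here is the countdown as a Python sketch (the name count_down is mine); Python's while and if stand in for the compare-and-jump instructions:

```python
def count_down(n):
    while True:
        n = n - 1    # instruction 8: subtract 1
        if n == 0:   # instructions 3 and 7: compare to zero, jump if true
            break    # jump to "X:" - we've finished counting
        # falling to the bottom of the loop is instruction 6:
        # jump back to the instruction labeled "1st:"

count_down(5)  # decrements 5 -> 4 -> 3 -> 2 -> 1 -> 0, then stops
```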

So let's add this new thing we've programmed as another instruction (inst 9): repeat something N times, where N is the number of repetitions we want. It looks silly, repeating something almost the same way over and over, but it is very useful. Let me show you.

So far, our computer can add and subtract. But can it multiply?

Remember: a multiplication is just a repetition of many adds.

Suppose we want to multiply a number by another. How would we program our computer? Let's do it:

```
        retrieve one of the numbers to multiply       (inst 5)
        retrieve the other number, our counter        (inst 5)
        start a running total at zero                 (inst 4)
again:  compare the counter with 0                    (inst 3)
        if it's zero, jump to the instruction done:   (inst 7)
        add the first number to the running total     (inst 1)
        subtract 1 from the counter                   (inst 8)
        jump to the instruction again:                (inst 6)
done:   store the running total into memory           (inst 4)
```
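
As a Python sketch (names mine, as before), the program reads almost line for line:

```python
def multiply(a, b):
    total = 0             # start a running total at zero
    while b != 0:         # compare the counter with 0; jump to done: if true
        total = total + a # instruction 1: add a to the running total
        b = b - 1         # instruction 8: subtract 1 from the counter
    return total          # instruction 4: store the result

print(multiply(6, 7))     # 42
```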

By applying the same approach, we can program the computer to calculate divisions. If we can add, subtract, multiply and divide, we can surely calculate exponentiation, and by combining these simple operations we can start doing some complex calculations. Combining them even further, we can do really complex stuff.
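
For instance, here is division as repeated subtraction, sketched in Python for non-negative integers (Python's >= stands in for a comparison our toy machine would itself have to build from subtraction and instruction 3):

```python
def divide(a, b):
    quotient = 0
    while a >= b:       # keep going while we can still subtract b
        a = a - b       # repeated subtraction (instruction 8, generalized)
        quotient = quotient + 1
    return quotient, a  # the quotient and the remainder

print(divide(17, 5))    # (3, 2)
```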

The only difference between our theoretical machine and any other computational device on the face of the Earth (besides the latter actually existing in real life) is the instruction set, the speed, and the peripherals attached to the real computer. But they all work just the same way our theoretical computer does.

For the last 70 years, computer scientists and hardware engineers have equipped computers with ever more complex sets of instructions, either embedded in the circuitry or added via programming (software).

Today, computers can be programmed in ways so complex that they can mimic the human thinking process. That's Artificial Intelligence.

And it is all based on that very simple set of instructions our dumb computer understands.
