
George

Thinking like a computer, before programming a computer

Let's take a massive step back and think about one thing: in one way or another, every line of code that we write, compile and ship somewhere at the end of its chain is eventually turned into machine code, aka binary. So let's ask ourselves why.
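To make that concrete, here's a minimal sketch (I'm using Python purely as a convenient example language; the values shown are just illustrations) that peeks at the bits sitting underneath ordinary values:

```python
# Everything we write is eventually reduced to bits.
# A quick peek at the binary behind everyday values:

n = 42
print(bin(n))  # 0b101010 -- the integer 42 in binary

text = "Hi"
for ch in text:
    # ord() gives the character's code point; format it as 8 bits
    print(ch, format(ord(ch), "08b"))
# H 01001000
# i 01101001
```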

How a computer works

First of all, computers don't (well, can't) think; they execute. In short, it all comes down to the machine executing a series of instructions: if these are in the correct order, then we get the results we're looking for. If they're not, we go through debugging phases until it's fixed or we go nuts. Any given task has to be translated into a sequence of steps before the machine can carry it out.
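As a toy illustration of why the order of instructions matters, here's a small hedged sketch (the numbers are made up for the example): the same two instructions, applied in a different order, give a different answer.

```python
balance = 100

# Order A: add the 10-unit bonus first, then double the balance
a = balance
a += 10
a *= 2
print(a)  # 220

# Order B: double the balance first, then add the bonus
b = balance
b *= 2
b += 10
print(b)  # 210
```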

Computers are dumb and smart.

They're designed in a way that forces them to generate an output for a specific input. Let's think: what really is a computer or a server? It all starts from materials found in the ground. They're dumb because all they can really do is basic math on binary (aka 0s and 1s). However, we can easily argue that they're smart, because we're taking hunks of metal found in the ground and making them into machines that pretty much every day-to-day job will use in one way or another.
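To see "basic math on 0s and 1s" in action, here's a hedged sketch of addition built from nothing but bitwise operations, mirroring what a hardware adder circuit does (XOR produces the sum bits, AND finds the carries). This sketch assumes non-negative integers:

```python
def add(x, y):
    # XOR adds bits without carrying; AND marks where a carry is needed.
    # Repeat until no carries remain -- exactly how an adder circuit works.
    while y:
        carry = (x & y) << 1
        x = x ^ y
        y = carry
    return x

print(add(19, 23))  # 42, computed purely with 0s and 1s
```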

Why do we need to think like a computer?

Programming is a very small part of computer science. It is built on the principles of computing, an essential part of any computer-science-related major, and applying these methods of thinking can help us become better programmers.

Behind every line of code, there are layers of information that make it possible for things to work. I've found that taking in this information has helped me improve aspects of complicated systems, even without an understanding of each part.

I've come to realise that a lot of the time it's simply not enough to just know how to program in a language. While working on embedded systems, back-end networking and more, it suddenly hit me that I needed to know what was going on behind the scenes, and that the reason some of my code was failing could be a logical error that other programmers could spot because they have this knowledge.

Principles of computing: what is it?

In short, it's the mathematical side of computers. For a couple of years I've been learning the principles behind computers, and I originally thought I'd never end up using this knowledge. But as my course developed and became more advanced, I found myself thinking: if I used that equation or that strategy from that module, perhaps my code could work better? Perhaps it could fix a bug? And perhaps, if I knew the logic behind a computer's calculations, I could use that to my advantage.

The principles do include a lot to learn, though. This semester we're taking on the following (a small code sketch follows the list):

  1. Propositional logic
  2. Predicate logic
  3. Sets, functions and relations
  4. Counting (Not pre-school level btw)
  5. Graph Theory
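These topics sound abstract, but they map straight onto code. As a hedged sketch (the proposition is just one I picked for illustration), here's item 1, propositional logic, rendered as a Python truth table for (p AND q) → r:

```python
from itertools import product

# Truth table for the proposition (p and q) -> r.
# Material implication: "if A then B" is false only when A is true and B is false.
for p, q, r in product([True, False], repeat=3):
    result = (not (p and q)) or r
    print(f"p={p!s:5} q={q!s:5} r={r!s:5} | (p and q) -> r = {result}")
```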

How should I learn this?

If you're in school, college, university, etc. and you're studying (or going to study) computer science or a related course, then this will more than likely be covered in your course, although the curriculum will probably differ from the one listed above.

If this is not the case, here are some resources you can work through online:

Coursera: Principles of computing
Code.org Computer science principles
Stanford online Principles of computing

Final advice

This advice can relate to every part of the development process, to be honest. However, there are four elements of development that I personally follow.

1. Strap in and don't stop until the job is done to your satisfaction

You can spend hours or even days fixing one error, only to have it turn out to be a missing semicolon. The bad news is that if it's wrong, the computer won't run it. After many years of development I've learnt that giving up is NOT worth it. The development industry is one where you're constantly learning. Code that isn't working is a great thing, because you're teaching yourself what's wrong and how you can fix it.

2. Love the pain and don't fight it.

I think we can all agree that life in general, and writing code at times, sucks. But we all have sucky things that we deal with and work through. The best bit? We do it for ourselves; we do it because we want to learn more as we progress and because we love to program (also, the paycheck is pretty nice).

The best we can do is think about how we can reframe the way we're taking on an issue instead of wanting to throw it out the window. The feeling of having something finally work after a long struggle makes it all worthwhile.

3. Remember why we write code

For everyone here, there's a different reason why we write code: for work, for fun, for education and much more. But there is always a personal reason behind why we decided to become developers, and often people struggle to find motivation for a certain project or to get their work done. Reminding yourself of WHY you wanted to become a dev, and what originally got you started, can give you the boost you need to continue with the project.

4. Asking for help is good

You will not find a single dev who hasn't asked for help in their career. There will always be someone who is better than us and knows more than us, yet they will still ask for help, because we may know something that they don't. It's more common than people think. Got a bug that you can't fix? Ask another dev; they may have had something similar before. Everyone has different experiences within the dev culture, and those experiences can be used to everyone's advantage.

Latest comments (2)

tux0r

They're dumb because they can really do basic math and binary (aka 0 and 1).

So they can already do more than some of their users. And the ternary ones excel!

Basti Ortiz • Edited

Remember why we write code

I sure will. I write code because I want to write software that will help others. I have to constantly remind myself of that when coding becomes too much of a chore.