Ardi

Don't learn C

There is a murmur on tech Twitter lately that you should just learn C in order to learn how the computer really works. Don't do that. Read a book.

It is a well-known fact that most programmers have a brittle understanding of their computer. They sit nicely in their ivory tower of abstraction, building very complex systems while completely disregarding the tech stack they're sitting on. Thanks to the absolute dominance of programming languages with automatic memory management, many of these programmers have never had to think about memory, and then they are surprised when their pretty little Node.js process consumes gigabytes of memory.

Recently there has been discussion that to solve this you should just learn C. C is supposedly low level: you manage your memory yourself, you code for the hardware, you tell C what to do and it translates nicely into machine code that you can understand, and you get to be happy in a rainbow world full of computer candy :)

WRONG! You don't code for the hardware - you code for the Abstract Machine, the idealized computer described by the C standard. What your program does is most definitely not what the hardware does!

What are you talking about?

What do you think happens when you have a signed char - an 8-bit signed integer - with the value 127 and you add one to it? If you paid attention in class you might know that with 2's complement it should wrap around to -128. And you would be right! That is, if C actually cared about the hardware. It turns out that for historical reasons signed integer overflow is undefined behaviour (UB), which means the result could wrap around, the compiler could "optimize" your code into something else entirely, or demons could come out of your nose.
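To make that concrete, here's a minimal sketch with a plain int (the 8-bit case has an extra wrinkle: C promotes small integer types to int before doing arithmetic, so the overflow actually happens on the conversion back):

```c
#include <limits.h>
#include <stdio.h>

int main(void) {
    int i = INT_MAX;
    /* On two's-complement hardware, the ADD instruction would simply
       wrap to INT_MIN. In C, overflowing a signed int is undefined
       behaviour: the compiler is allowed to assume it never happens. */
    i = i + 1;
    printf("%d\n", i); /* often prints INT_MIN, but nothing is guaranteed */
    return 0;
}
```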

Or if you have an array and you mistakenly access it out of bounds, that's also UB. In fact, there are A LOT of behaviours considered undefined in C that are perfectly expressible in assembly (though you'd probably only write them by mistake).
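For example (a toy sketch; nothing here is guaranteed to happen):

```c
#include <stdio.h>

int main(void) {
    int a[4] = {1, 2, 3, 4};
    /* In assembly this would just load whatever bytes happen to sit
       past the array. In C, reading a[4] is undefined behaviour: it
       might return garbage, crash, or let the optimizer do anything. */
    printf("%d\n", a[4]);
    return 0;
}
```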

It turns out that what you tell C to do is not what the CPU will end up doing, because C has an optimizing compiler. Most of the time it will just tweak the code it outputs while keeping the observable behavior the same, but there are times when it takes real liberties. While you are learning C you have to learn about all of these quirks and footguns, and you're not really learning that much about the computer itself.
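The classic illustration is a hand-rolled overflow check. Because signed overflow is UB, a compiler like GCC or Clang at -O2 is entitled to reason that x + 1 < x can never be true and fold the whole function to return 0 (a sketch; the exact output depends on your compiler and flags):

```c
#include <stdio.h>

/* Looks like a perfectly reasonable overflow check if you're thinking
   in terms of two's-complement hardware. */
int will_overflow(int x) {
    return x + 1 < x;
}

int main(void) {
    /* With optimizations on, this often prints 0, not 1. */
    printf("%d\n", will_overflow(2147483647));
    return 0;
}
```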

There are a lot of web developers who don't understand how memory works, and that should be addressed, but telling them to just learn C is not enough. A basic understanding of how the computer works can definitely be achieved by some basic reading and, yes, maybe by writing some C - implement memcpy and a toy malloc - but C shouldn't be the goal, only the medium.
Read a book that teaches you these concepts; don't just learn C.
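If you do take the C route, the memcpy exercise can be as small as this (a naive sketch, named my_memcpy here to avoid clashing with the standard library):

```c
#include <stddef.h>

/* A byte-by-byte copy: the classic first exercise. The real memcpy
   does word-sized copies, alignment tricks and SIMD, which is itself
   a lesson in how much machinery hides behind one library call.
   Like the real thing, this is undefined if the regions overlap. */
void *my_memcpy(void *dst, const void *src, size_t n) {
    unsigned char *d = dst;
    const unsigned char *s = src;
    for (size_t i = 0; i < n; i++)
        d[i] = s[i];
    return dst;
}
```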

The same can be said for operating systems, but honestly even Node.js lets you work with raw file descriptors and buffers. There is real value in working with low-level APIs; I think every programmer should, at some point in their life, manually read(2) from a file, at least to get an idea of how it works.
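If you've never done it, it looks something like this on a POSIX system (a minimal sketch; the file path is just a placeholder, and real code would loop because read(2) may return fewer bytes than asked for):

```c
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>

int main(void) {
    /* open(2) hands back a file descriptor: a small integer the
       kernel uses to identify this open file. */
    int fd = open("/etc/hostname", O_RDONLY);
    if (fd < 0) { perror("open"); return 1; }

    char buf[256];
    ssize_t n = read(fd, buf, sizeof buf);
    if (n < 0) { perror("read"); close(fd); return 1; }

    fwrite(buf, 1, (size_t)n, stdout);
    close(fd);
    return 0;
}
```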

But what do I know? Most programmers don't know and don't care about what goes on when they fetch(), and they seem to be doing fine. I always feel like an old man screaming at the sky, but I think it's sad that people have such a shallow and brittle understanding of the systems they use. When they do venture out into deep waters, toward the forbidden knowledge that lies below, they should probably be hand-held by a good systems book.
