Anshuman Khanna
Your software is lying to you

A very nice thing about modern languages is that they abstract a lot of the stuff that used to be done manually. However, they didn't automate it, they abstracted it, and that's a difference most people don't realize.

First, let's make the difference clear and see why it matters.

Let's say I have to allocate memory in a program. This is how C does it:

    #include <errno.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    int *arr = malloc(size * sizeof(int));

    if (arr == NULL) {
        // At this point errno is populated, and we can inspect it like this:
        printf("Error occurred during allocation: %s\n", strerror(errno));

        // The error is most probably ENOMEM, which means there is no memory left to allocate.
    }

Now, if I study C, I will learn that memory allocation can fail and that this is the error it returns. Moreover, I also learn why this error happens. It is because your system has two kinds of limits. The first is a soft limit that your kernel enforces, and then there is a hard limit that acts as the ceiling for the soft limit. So what I learn is that the kernel enforces the soft limit, and the soft limit itself can be set anywhere in [0, hard limit].
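You can actually peek at these limits yourself. Here is a minimal sketch using Python's resource module (Unix-only), which wraps the same getrlimit(2) call a C program would use; RLIMIT_AS is the address-space limit, though the exact limit an allocation hits can vary by system:

    import resource

    # RLIMIT_AS is the limit on the process's total address space.
    # getrlimit returns a (soft, hard) pair; RLIM_INFINITY means "no limit set".
    soft, hard = resource.getrlimit(resource.RLIMIT_AS)
    print(f"soft limit: {soft}, hard limit: {hard}")

    # The process may raise its soft limit, but only up to the hard limit;
    # the kernel rejects anything beyond that.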

Now let's see what Python does:

    a = []

This is what you would do because this is what the Python docs tell you to do. Now let's see what you actually need to do.

    try:
        my_list = [0] * size
    except MemoryError:
        print(f"Error: Could not allocate a list of size {size}. Ran out of memory!")
    else:
        # Only reached if the allocation succeeded.
        print(my_list)

But will Python tell you this? NO. Why not? Because it hides all this from you to give an illusion of simplicity, until one day you get a crash and have to go down the rabbit hole manually, trashing years of knowledge.

The same is true for any language that lets you do dynamic memory allocation without making you realize that it's dynamic memory allocation.

malloc() in C is an automation of memory allocation, which is harder to do in assembly. It still tells you the risks and doesn't hide them. Python's dynamic allocation is an abstraction that doesn't tell you anything.

Now it's not just about dynamic memory allocation, it's about any dynamic behavior you can think of. Dynamic behavior cannot be optimized easily, because the machine has very little information to optimize it with.

The only way to optimize dynamic behavior is to provide information, as much of it as possible. That's why languages have static types. However, someone, seeing all this dynamic behavior, thought: what if we could have a language with types at compile time but not at runtime, which makes having types largely useless at runtime?

Type-annotated languages like Python 3, or languages that compile to dynamically typed targets like TypeScript compiling to JavaScript, are even bigger liars. You are writing types, but at runtime there's no real point to them. In the compiled output, your types don't exist.

So you added types to make it easier for you to code, but they changed nothing about the performance of the code (unless you change the code itself), because the machine still receives the same dynamic code at runtime and runs it with the same level of optimization as before.
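A minimal sketch of what this means in Python: the annotations are kept around as metadata, but nothing checks them when the code runs.

    def add(a: int, b: int) -> int:
        return a + b

    # The annotations exist as metadata on the function...
    print(add.__annotations__)  # {'a': <class 'int'>, 'b': <class 'int'>, 'return': <class 'int'>}

    # ...but nothing enforces them at runtime: this call runs happily and prints "12".
    print(add("1", "2"))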

Last point: fancy APIs. Many modern languages have fancy APIs that make it easy to do stuff such as creating a web server, because they take care of a lot of things that you previously had to do manually.

As good as that may feel, if you made a web server in JS, you still don't know anything about an HTTP server. You don't know everything it had to do under the hood so that you could write your code. You don't know what the machine actually did, because all you see is your fancy API.
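To get a feel for how much a fancy API hides, here is a minimal sketch of an HTTP server written directly on top of sockets in Python (the port and response are made up for illustration). It handles one request at a time and parses almost nothing; a real server also deals with keep-alive, chunked bodies, timeouts, concurrency, and much more:

    import socket

    # Listen on localhost:8080 (a hypothetical port chosen for this sketch).
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind(("127.0.0.1", 8080))
    server.listen()

    while True:
        conn, addr = server.accept()
        request = conn.recv(4096)          # read the raw HTTP request bytes
        print(request.split(b"\r\n")[0])   # e.g. b'GET / HTTP/1.1'

        # Build the HTTP response by hand: status line, headers, blank line, body.
        body = b"Hello from a hand-rolled server!"
        headers = (
            b"HTTP/1.1 200 OK\r\n"
            b"Content-Type: text/plain\r\n"
            + b"Content-Length: " + str(len(body)).encode() + b"\r\n\r\n"
        )
        conn.sendall(headers + body)
        conn.close()

Everything in this loop is roughly what Node's http.createServer or Python's http.server does for you before your handler ever runs.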

I am not hating on these features. They are good features, and their existence matters when you just want to get the work done. My point is that you must understand what your software is doing under the hood for you, instead of just thinking the way your software wants you to. That way, you'll be independent of the software.

Note: This is a raw article; I am not going to reread or edit it, so if you find any faults in it, just create a PR.
