Jacques Williams

The Idea Behind Time Complexity

What is time? And how is it "complex"? Well, the answer to that question is a long one. To put it simply, time is relative: it's measured differently depending on the circumstances. For example, the time in New Orleans may be 7 p.m. while, in most of Florida, it's 8 p.m. In this case, time is measured based on the time zone you're in. Another simple example would be a race between two people, with a referee on the side recording their times with a stopwatch. In most cases, one person will turn out to be faster than the other, and the stopwatch will show a shorter time for that runner. Looking at these two examples, we can see that time is measured differently depending on the circumstances or variables at play.

Now let's dive back into our home world, Computer Science! Computers, like the stopwatch, have their own "time complexity". If you're a JavaScript master, you know full well that there's more than one way to skin a cat; that is, there are tons and tons of ways to find solutions when building algorithms! And the time it takes for the interpreter to run those solutions varies. Since we're dealing with computers, we refer to this as computational time complexity.

To represent computational time complexity, we use a specific notation known as "Big O". Just like anything else, there are always factors to consider when measuring time. With functions, one of those factors is the size of the input. Another is the number of operations performed within the function. Going further out, the amount of storage and the type of processor your computer has matter as well. With that in mind, let's take a look at two example functions.

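First, a minimal JavaScript sketch of a constant-time lookup (the array contents and function name here are placeholders):

```javascript
// A constant-time lookup: grab one element directly by its index.
const names = ["Alex", "Dana", "Jacques", "Morgan"];

function getNameAt(arr, index) {
  // One operation per call, no matter how large the array is
  return arr[index];
}

console.log(getNameAt(names, 2)); // "Jacques"
```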

Here, we have an example of what would be considered a "constant" time lookup, because the function performs essentially one operation per call, no matter how large the input array gets. In Big O notation, this is written O(1).

On the other hand, what if we didn't know the index position that holds my name? Let's say we had to traverse the array to find it.

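A sketch of what that traversal might look like (again, the names and identifiers are placeholders):

```javascript
// A linear-time lookup: scan the array element by element until we find a match.
function findIndexOfName(arr, target) {
  for (let i = 0; i < arr.length; i++) {
    if (arr[i] === target) {
      return i; // found it: return the index
    }
  }
  return -1; // not in the array
}

console.log(findIndexOfName(["Alex", "Dana", "Jacques", "Morgan"], "Jacques")); // 2
```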

Here, we have what's considered a linear-time lookup. To run the function, the interpreter has to step through the array element by element in the for loop. With each additional element in the input, the number of steps grows, so the time complexity increases "linearly". For this example, the notation is O(n).

There are a couple of other notations for the time complexities of more complex code. Remember, the size of the input itself has to be considered! If I used an entire "phone book" object as an input, for example, you can imagine how long the computer would take just to process it (not to mention run the function itself). A logarithmic time complexity, O(log n), for instance, keeps growing as the input grows, but at an ever-decreasing rate. For the sake of this article, though, we won't dive too deep into that.
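Still, as a quick taste, binary search on a sorted array is the classic O(log n) example: each comparison throws away half of the remaining candidates. A minimal sketch:

```javascript
// O(log n): binary search halves the search space with every comparison.
function binarySearch(sortedArr, target) {
  let low = 0;
  let high = sortedArr.length - 1;

  while (low <= high) {
    const mid = Math.floor((low + high) / 2);
    if (sortedArr[mid] === target) return mid; // found it
    if (sortedArr[mid] < target) low = mid + 1; // search the right half
    else high = mid - 1; // search the left half
  }
  return -1; // not found
}

console.log(binarySearch([2, 5, 8, 13, 21, 34], 13)); // 3
```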

In short, time is complex due to the factors at play. For computers, and Computer Science more specifically, time complexity is represented with a notation known as Big O. Just like anything else, functions have different qualities based on how they're constructed; therefore, each function has its own time complexity.
