In coding, as in life, there's always more than one way to approach problems as long as you're creative enough. But how would you know which one of the bunch is the best?
Note: keep in mind that "best" is a relative term. You define it according to your needs. Faster? Less memory use? More readable?
That's when "the Big O notation talk" comes into the picture.
Basically, it’s a way of generalizing our code and comparing its performance to that of other code. With big O we can talk formally about how an algorithm's runtime grows as its inputs grow.
It's important to note that we care about trends, not details.
Knowing the big-O of our code allows us to:
- Talk about performance more precisely
- Identify parts of a solution that might be inefficient
- Discuss the trade-offs of several approaches
We count the number of simple operations our functions have to execute.
More formally, we can say an algorithm is O(f(n)) if the number of simple operations it performs is eventually bounded by a constant times f(n) as n increases.
f(n) = n means it could be linear, f(n) = n^2 means it could be quadratic, and f(n) = 1 means it is constant.
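To make this concrete, here is a small sketch (the function names are illustrative, not from any library) contrasting a linear and a constant way of summing the numbers 1 through n:

```python
def sum_loop(n):
    # O(n): one addition per element, so the number of
    # operations grows linearly with n
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_formula(n):
    # O(1): a fixed handful of operations (multiply, add, divide)
    # no matter how large n is
    return n * (n + 1) // 2
```

Both return the same result, but the second one does the same amount of work for n = 10 as for n = 10,000,000.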
An O(1) operation means that as n grows, the runtime stays the same.
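As a minimal sketch of constant time (the `get_score` helper is hypothetical), a dictionary lookup behaves this way:

```python
def get_score(scores, name):
    # O(1): a dict lookup takes roughly the same time
    # whether `scores` holds ten entries or ten million
    return scores[name]
```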
An O(n) operation inside of an O(n) operation equals O(n^2). (Nested loops are a great example of this.)
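The nested-loop case can be sketched like this (a toy example, not a recommended way to build pairs in practice):

```python
def all_pairs(items):
    # O(n^2): the inner O(n) loop runs once for each of the
    # n outer iterations, so the work grows with n * n
    pairs = []
    for a in items:
        for b in items:
            pairs.append((a, b))
    return pairs
```

For a list of 3 items this builds 9 pairs; for 10 items, 100 pairs.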
In an O(n) algorithm, regardless of the exact number of operations, they tend to grow roughly proportionally with n. So if n doubles, the number of operations will also roughly double.
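You can verify the doubling claim by instrumenting a simple loop with a counter (a throwaway sketch for illustration):

```python
def count_operations(n):
    # Count the additions performed by a plain linear loop.
    ops = 0
    total = 0
    for i in range(n):
        total += i
        ops += 1
    return ops
```

Calling `count_operations(20)` reports exactly twice as many operations as `count_operations(10)`.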
There's something important to take into consideration when simplifying a big O. At the end of the day, as we said before, we're looking to spot trends, not details. This means we're interested in the big picture, which translates to:
- We can take out the constants. We’re looking at the big picture, so O(2n) can be translated into O(n) and O(500) can be O(1).
- Smaller terms are not important either. Again, think of the big picture. If we have O(n + 10) we can forget about the 10 and translate it into O(n).
- Arithmetic operations are constant
- Variable assignment is constant
- Accessing elements in an array by index or object is also constant
- In a loop, complexity is the length of the loop times the complexity of what’s inside of it
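The rules above combine naturally. In this sketch (an intentionally roundabout way to double and sum a list), two sequential O(n) loops give roughly 2n operations, which simplifies to O(n) by dropping the constant:

```python
def double_then_sum(nums):
    # First loop: ~n operations; second loop: ~n operations.
    # Total is ~2n, which simplifies to O(n).
    doubled = []
    for x in nums:          # O(n)
        doubled.append(x * 2)
    total = 0
    for x in doubled:       # O(n)
        total += x
    return total
```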
O(log n) describes logarithmic runtime, which grows very slowly as n increases; binary search is the classic example. Space complexity will be addressed in the second part of this article series.
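A sketch of a logarithmic-time algorithm, a standard binary search (assuming the input list is sorted):

```python
def binary_search(sorted_items, target):
    # O(log n): each comparison halves the remaining search range,
    # so doubling n adds only one extra step
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1  # not found
```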
Got something to add? Please feel free to reach out with any question, comment, or meme.