Huy Do

Big O Notation

Time-Complexity

Big-O notation is a common means of describing the performance or complexity of an algorithm in computer science. Time complexity will become an important part of your work as a developer as you continue through your studies. You will analyze your programs and determine which time complexity each one falls into (constant, linear, quadratic, exponential, or logarithmic).

O(1)- Constant Time:

Constant Time Complexity describes an algorithm that will always execute in the same time (or space) regardless of the size of the input data set. In JavaScript, this can be as simple as accessing a specific index within an array:

let array = [1, 2, 3, 4, 5];
array[0]; // This is a constant-time look-up

const removeLastElement = function(array) {
  let numberOfElementsToRemove = 2;
  while (numberOfElementsToRemove > 0) {
    array.pop();
    numberOfElementsToRemove--; // decrement so the loop actually terminates
  }
};
// This also runs in constant time: the function always removes
// exactly two elements, no matter how large the array is.

The size of the array doesn't matter: looking up a specific location in the array takes the same amount of time either way. Therefore the function has a constant time look-up.
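Key-based look-ups behave the same way. As an added illustration (not from the original post, and assuming the usual average-case behavior of a JavaScript Map, where get does not scan every entry), the sketch below returns in the same time whether the map holds two entries or two million:

const ages = new Map([["alice", 30], ["bob", 25]]);

const getAge = function(name) {
  return ages.get(name); // one look-up, regardless of how many entries exist
};

console.log(getAge("bob")); // 25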

O(N)-Linear Time:

Linear Time Complexity describes an algorithm or program whose complexity grows in direct proportion to the size of the input data.

const numberRange = function(array) {
  for (let i = 0; i < array.length; i++) {
    console.log(array[i]);
  }
};
// This has a linear time look-up since the function
// visits every index in the array exactly once.

In this example, the look-up time is directly related to the size of our input because we go through each item in the array. In other words, the larger the input, the more time it takes to run the function. Of course, if the array had only one element (i.e. array.length === 1), the function would effectively have a constant time look-up.
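Another everyday linear-time pattern is scanning for the largest value. The findMax helper below is a name introduced here for illustration; it must visit every element once before it can be sure it has found the maximum:

const findMax = function(array) {
  let max = -Infinity;
  for (let i = 0; i < array.length; i++) {
    if (array[i] > max) { // compare each element exactly once
      max = array[i];
    }
  }
  return max;
};

console.log(findMax([3, 7, 2, 9, 4])); // 9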

O(log(n)) — Logarithmic Time:

Logarithmic Time Complexity refers to an algorithm whose running time grows in proportion to the logarithm of the input size.

const partialRange = function(array) {
  for (let i = 1; i < array.length; i = i * 2) {
    console.log(array[i]);
  }
};
// This has a logarithmic time look-up: the index doubles on every
// pass, so the function only visits a small fraction of the array.

We log elements until the index passes the array length, but the index doubles on every pass, so only elements at certain intervals are logged. Because we repeat a constant time look-up on multiple parts of the array, the function does not have a constant time look-up. It also does not pass through every index in the array, so it does not have a linear time look-up either; the number of passes grows with the logarithm of the array's length.
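The classic logarithmic-time algorithm is binary search on a sorted array: each comparison discards half of the remaining elements. Here is a minimal sketch (assuming the input array is already sorted; binarySearch is a helper name introduced here for illustration):

const binarySearch = function(sortedArray, target) {
  let low = 0;
  let high = sortedArray.length - 1;
  while (low <= high) {
    const mid = Math.floor((low + high) / 2);
    if (sortedArray[mid] === target) return mid;
    if (sortedArray[mid] < target) {
      low = mid + 1;  // discard the lower half
    } else {
      high = mid - 1; // discard the upper half
    }
  }
  return -1; // not found
};

console.log(binarySearch([1, 3, 5, 7, 9, 11], 7)); // 3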

O(N²) — Quadratic Time:

Quadratic Time Complexity represents an algorithm whose performance is directly proportional to the square of the size of the input data set (think of linear, but squared). This time complexity typically appears when we nest one iteration over the data set inside another.

const hasDuplicates = function(array) {
  for (let i = 0; i < array.length; i++) {
    let item = array[i];
    if (array.slice(i + 1).indexOf(item) !== -1) {
      return true;
    }
  }
  return false;
};
// This has a quadratic time look-up: for every index, the function
// scans the rest of the array again with slice and indexOf.

console.log(hasDuplicates([2, 4, 2, 3])); // true

It's clear from the first part that our function searches through the array at least once, but the difference-maker lies in the if statement. It is here that we perform another look-up over the remaining elements with the native indexOf method. In doing so, we pass over the array again for each element we visit, producing a Quadratic Time Complexity.
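To make that nested pass explicit, here is an equivalent sketch written with two visible loops (hasDuplicatesNested is a name introduced here for illustration); the inner loop is what pushes the work toward roughly n² comparisons:

const hasDuplicatesNested = function(array) {
  for (let i = 0; i < array.length; i++) {
    for (let j = i + 1; j < array.length; j++) { // compare against every later element
      if (array[i] === array[j]) return true;
    }
  }
  return false;
};

console.log(hasDuplicatesNested([2, 4, 2, 3])); // true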

O(2^N) — Exponential Time

Exponential Time Complexity denotes an algorithm whose running time doubles with each addition to the input data set. If you know of other exponential growth patterns, this works in much the same way: the curve starts off very shallow and then rises at an ever-increasing rate as the input grows.

const fibonacci = function(number) {
  if (number <= 1) return number;
  return fibonacci(number - 2) + fibonacci(number - 1);
};
// This has an exponential running time: each call spawns two more
// recursive calls, so the number of calls roughly doubles as the input grows.

console.log(fibonacci(4)); // 3

Fibonacci numbers are a great way to practice your understanding of recursion. Although there is a way to give the fibonacci function a lower time complexity, for this case we will use the doubly recursive method to show how it has an Exponential Time Complexity. In a few words, the recursion walks every number from the input down to zero, and it does so twice (once for each recursive call) until it reaches the base-case if statement, so the number of calls roughly doubles with each increase in the input.
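For contrast, the lower-complexity version alluded to above can be sketched with memoization: each Fibonacci number is computed once and then reused, so the number of calls grows roughly linearly instead of doubling. The fibonacciMemo name and cache parameter are introduced here for illustration:

const fibonacciMemo = function(number, cache = {}) {
  if (number <= 1) return number;
  if (cache[number] !== undefined) return cache[number]; // reuse a stored result
  cache[number] = fibonacciMemo(number - 2, cache) + fibonacciMemo(number - 1, cache);
  return cache[number];
};

console.log(fibonacciMemo(4));  // 3
console.log(fibonacciMemo(40)); // 102334155, returns almost instantly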

