Kartik Sharma

Day 2: Common Time Complexities

Deep Dive into Common Time Complexities

Time complexity describes how an algorithm's runtime grows as the input size increases. Let’s break down the most common time complexities and analyze an example of each.


1. O(1) – Constant Time Complexity

An algorithm with O(1) time complexity takes a constant amount of time to execute, regardless of the input size.

Example:

Accessing an element in an array by its index:

let arr = [5, 3, 8, 6];
console.log(arr[2]); // O(1)

Explanation:

  • Accessing any element in an array by index is constant-time because the operation doesn't depend on the number of elements in the array.

2. O(log n) – Logarithmic Time Complexity

Logarithmic time complexity often arises in algorithms that divide the problem into smaller sub-problems at each step. One common example is binary search.

Example:

Binary search in a sorted array:

function binarySearch(arr, target) {
  let left = 0;
  let right = arr.length - 1;

  while (left <= right) {
    let mid = Math.floor((left + right) / 2);

    if (arr[mid] === target) {
      return mid;
    } else if (arr[mid] < target) {
      left = mid + 1;
    } else {
      right = mid - 1;
    }
  }
  return -1;
}

Explanation:

  • Each iteration halves the remaining search range, so the number of steps grows logarithmically with the size of the input array (see the quick check below).
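
A quick check of the function above (binary search assumes the array is already sorted; the sample values are just for illustration):

let sorted = [1, 3, 5, 8, 13, 21];
console.log(binarySearch(sorted, 8)); // 3  (index of the target)
console.log(binarySearch(sorted, 4)); // -1 (not present)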

3. O(n) – Linear Time Complexity

In linear time, the runtime grows in direct proportion to the input size. If the input size doubles, the runtime doubles.

Example:

Looping through all elements in an array:

function printAll(arr) {
  for (let i = 0; i < arr.length; i++) {
    console.log(arr[i]);
  }
}

Explanation:

  • The algorithm goes through each element once, so the runtime increases linearly with the number of elements (n).

4. O(n log n) – Linearithmic Time Complexity

O(n log n) combines linear and logarithmic growth. This complexity commonly appears in efficient sorting algorithms such as Merge Sort and Quick Sort (the latter in the average case).

Example:

Merge Sort (Divide and Conquer):

function mergeSort(arr) {
  if (arr.length <= 1) return arr;

  const mid = Math.floor(arr.length / 2);
  const left = mergeSort(arr.slice(0, mid));
  const right = mergeSort(arr.slice(mid));

  return merge(left, right);
}

function merge(left, right) {
  let result = [];
  let leftIndex = 0;
  let rightIndex = 0;

  while (leftIndex < left.length && rightIndex < right.length) {
    if (left[leftIndex] < right[rightIndex]) {
      result.push(left[leftIndex]);
      leftIndex++;
    } else {
      result.push(right[rightIndex]);
      rightIndex++;
    }
  }
  return result.concat(left.slice(leftIndex)).concat(right.slice(rightIndex));
}

Explanation:

  • The array is halved at each level of recursion (about log n levels), and merging all the elements at each level takes linear time, giving O(n log n) overall; a small usage example follows.
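
A quick usage example (the input values are arbitrary):

console.log(mergeSort([6, 2, 9, 1, 5])); // [1, 2, 5, 6, 9]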

5. O(n²) – Quadratic Time Complexity

Quadratic time complexity occurs when nested loops each iterate over roughly n elements. As the input size grows, the runtime grows quadratically.

Example:

Bubble Sort (Brute force sorting algorithm):

function bubbleSort(arr) {
  let n = arr.length;
  for (let i = 0; i < n - 1; i++) {
    for (let j = 0; j < n - i - 1; j++) {
      if (arr[j] > arr[j + 1]) {
        // Swap elements
        let temp = arr[j];
        arr[j] = arr[j + 1];
        arr[j + 1] = temp;
      }
    }
  }
  return arr;
}

Explanation:

  • The nested loops perform roughly n²/2 comparisons in total, so the time complexity is O(n²); for larger inputs this quickly becomes inefficient, as the comparison count below makes concrete.
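
To make that growth concrete, here is a small counter (a hypothetical helper written for this post, not part of the sorting code) that tallies how many comparisons the nested loops perform:

function countBubbleComparisons(n) {
  let comparisons = 0;
  for (let i = 0; i < n - 1; i++) {
    for (let j = 0; j < n - i - 1; j++) {
      comparisons++; // one comparison of arr[j] and arr[j + 1]
    }
  }
  return comparisons; // always n * (n - 1) / 2
}

console.log(countBubbleComparisons(10));  // 45
console.log(countBubbleComparisons(100)); // 4950 — roughly 100x the work for 10x the input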

6. O(2ⁿ) – Exponential Time Complexity

Exponential time complexity occurs when the amount of work roughly doubles with each additional element in the input. Algorithms with this time complexity quickly become impractical as the input grows.

Example:

Solving the Tower of Hanoi problem (recursive solution):

function towerOfHanoi(n, from_rod, to_rod, aux_rod) {
  if (n === 1) {
    console.log(`Move disk 1 from rod ${from_rod} to rod ${to_rod}`);
    return;
  }
  towerOfHanoi(n - 1, from_rod, aux_rod, to_rod);
  console.log(`Move disk ${n} from rod ${from_rod} to rod ${to_rod}`);
  towerOfHanoi(n - 1, aux_rod, to_rod, from_rod);
}

Explanation:

  • Each call for n disks makes two recursive calls for n − 1 disks, so the work roughly doubles with every extra disk; the total number of moves is 2ⁿ − 1, giving an exponential growth rate of O(2ⁿ).
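
You can verify this by counting moves instead of printing them; the sketch below (a separate helper, not part of the original function) shows that n disks require 2ⁿ − 1 moves:

function countHanoiMoves(n) {
  if (n === 1) return 1;                  // base case: a single move
  return 2 * countHanoiMoves(n - 1) + 1;  // two recursive calls plus one move
}

console.log(countHanoiMoves(3));  // 7
console.log(countHanoiMoves(10)); // 1023
console.log(countHanoiMoves(20)); // 1048575 — each extra disk roughly doubles the total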

Introduction to Space Complexity

Space complexity refers to the amount of memory an algorithm uses as it runs. Like time complexity, it helps us understand how efficiently an algorithm scales with input size.

Key Points:

  1. Auxiliary space: Memory an algorithm uses beyond the input itself, such as temporary variables and the recursive call stack (see the sketch below).
  2. Input space: Memory used to store the input data.
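
A minimal sketch of the auxiliary-space idea, using two illustrative ways to sum an array (these helpers are examples written for this post, not standard functions):

// O(1) auxiliary space: a fixed number of variables, regardless of input size
function sumIterative(arr) {
  let total = 0;
  for (let i = 0; i < arr.length; i++) {
    total += arr[i];
  }
  return total;
}

// O(n) auxiliary space: the recursion builds a call stack roughly n frames deep
function sumRecursive(arr, i = 0) {
  if (i === arr.length) return 0;
  return arr[i] + sumRecursive(arr, i + 1);
}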

Differences Between Time and Space Complexity

| Aspect | Time Complexity | Space Complexity |
| --- | --- | --- |
| Definition | Amount of time an algorithm takes to execute | Amount of memory an algorithm uses |
| Purpose | Focuses on runtime performance | Focuses on memory usage |
| Measured by | Number of operations or steps | Amount of memory used (bytes, variables) |
| Example (Merge Sort) | O(n log n) | O(n), due to auxiliary arrays in the merge step |
| Primary use case | Evaluating performance for large inputs | Checking memory efficiency for large datasets |

Example: Time vs Space Complexity in Merge Sort

  • Time Complexity: O(n log n) due to recursive splitting and merging.
  • Space Complexity: O(n) because it needs extra memory to hold the left and right subarrays during the merge phase.

3. Sorting Algorithms: Implementation and Analysis

1. Bubble Sort (O(n²))

Bubble Sort repeatedly swaps adjacent elements if they are in the wrong order. It has a time complexity of O(n²) because of its nested loops.

function bubbleSort(arr) {
  let n = arr.length;
  for (let i = 0; i < n - 1; i++) {
    for (let j = 0; j < n - i - 1; j++) {
      if (arr[j] > arr[j + 1]) {
        let temp = arr[j];
        arr[j] = arr[j + 1];
        arr[j + 1] = temp;
      }
    }
  }
  return arr;
}
  • Time Complexity: O(n²)
  • Space Complexity: O(1) (no extra space is needed beyond the input array)

2. Selection Sort (O(n²))

Selection Sort repeatedly finds the minimum element from the unsorted part and puts it at the beginning.

function selectionSort(arr) {
  let n = arr.length;
  for (let i = 0; i < n - 1; i++) {
    let minIdx = i;
    for (let j = i + 1; j < n; j++) {
      if (arr[j] < arr[minIdx]) {
        minIdx = j;
      }
    }
    let temp = arr[i];
    arr[i] = arr[minIdx];
    arr[minIdx] = temp;
  }
  return arr;
}
  • Time Complexity: O(n²)
  • Space Complexity: O(1) (in-place sorting)

3. Insertion Sort (O(n²))

Insertion Sort builds the sorted array one element at a time by comparing the current element with the already sorted part of the array.

function insertionSort(arr) {
  let n = arr.length;
  for (let i = 1; i < n; i++) {
    let key = arr[i];
    let j = i - 1;
    while (j >= 0 && arr[j] > key) {
      arr[j + 1] = arr[j];
      j = j - 1;
    }
    arr[j + 1] = key;
  }
  return arr;
}
  • Time Complexity: O(n²) in the worst case (close to O(n) when the array is already nearly sorted)
  • Space Complexity: O(1) (in-place sorting)
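
All three sorts modify the array in place, so the quick check below hands each one its own copy of the same sample input (values chosen arbitrarily):

const sample = [29, 10, 14, 37, 13];
console.log(bubbleSort([...sample]));    // [10, 13, 14, 29, 37]
console.log(selectionSort([...sample])); // [10, 13, 14, 29, 37]
console.log(insertionSort([...sample])); // [10, 13, 14, 29, 37]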

4. Complete Online Quiz on Time Complexities

To reinforce your learning, try taking online quizzes to identify time complexities for various algorithms. You can look for quizzes on:

  • HackerRank
  • LeetCode
  • GeeksforGeeks

These platforms provide real coding problems and quizzes that help you practice and solidify your understanding of time complexities.


Summary:

  • Time complexities range from O(1) (constant) to O(2ⁿ) (exponential), with linear and quadratic complexities being the most commonly encountered.
  • Space complexity measures the memory requirements of algorithms and is crucial in applications with limited memory.
  • Sorting algorithms like bubble, selection, and insertion sort have a quadratic time complexity (O(n²)), making them inefficient for large inputs.
