DEV Community

Muyiwa Johnson

Estimating the Running Times of Algorithms (Part 2)

1.2 Estimating the Running Times of Algorithms

To estimate the Big O running time of an algorithm, here are a couple of rules of thumb:

  1. Constant Time – O(1):

    If an algorithm runs in the same amount of time regardless of how large ( n ) is, then it is O(1).

  2. Linear Time – O(n):

    A loop that runs ( n ) times contributes a factor of ( n ) to the Big O running time.

  3. Multiplicative Effect of Nested Loops:

    If loops are nested, multiply their running times.

  4. Additive Effect of Sequential Loops:

    If one loop follows another, add their running times.

  5. Logarithmic Time – O(log n):

    If the loop variable changes multiplicatively, such as ( i = i \times 2 ) or ( i = i / 3 ), instead of by a constant amount (like ( i++ ) or ( i -= 2 )), then that loop contributes a factor of log ( n ) to the Big O running time.

Examples of Computing Big O Running Times

Below are several examples demonstrating how to compute the Big O running time of different code segments. All examples involve working with an array, and we will assume ( n ) is the length of the array.


1. Summing the Entries in an Array

let total = 0;
for (let i = 0; i < a.length; i++) {
    total += a[i];
}

Running Time:

This code runs in O(n) time. It is a straightforward loop that iterates through each element of the array once.


2. Nested For Loops

for (let i = 0; i < a.length; i++) {
    for (let j = 0; j < a.length; j++) {
        console.log(a[i] + a[j]);
    }
}

Running Time:

These are two ordinary loops, each running for ( n = a.length ) steps. Since they are nested, their running times multiply, resulting in an overall running time of O(n²). For each of the ( n ) iterations of the outer loop, the inner loop runs ( n ) times.
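As a quick sanity check, we can count how many times the inner body actually executes (a small sketch; the sample array here is just a hypothetical stand-in):

```javascript
// Count how many times the innermost statement runs for nested loops.
function nestedIterations(a) {
    let iterations = 0;
    for (let i = 0; i < a.length; i++) {
        for (let j = 0; j < a.length; j++) {
            iterations++;
        }
    }
    return iterations;
}

const sample = [1, 2, 3, 4, 5];
console.log(nestedIterations(sample)); // 5 * 5 = 25
```

For an array of length 5, the inner body runs 5 × 5 = 25 times, exactly the n² the multiplication rule predicts.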


3. Three Nested Loops

for (let i = 0; i < a.length; i++) {
    for (let j = 0; j < a.length - 1; j++) {
        for (let k = 0; k < a.length - 2; k++) {
            console.log(a[i] + a[j] * a[k]);
        }
    }
}

Running Time:

This code runs in O(n³) time. Technically, since the second and third loops don’t run the full length of the array, we could write it as O(n(n - 1)(n - 2)). However, we focus on the most significant term, which is O(n³), as the lower-order terms become insignificant as ( n ) grows.
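We can see why the lower-order terms drop out by comparing the exact count with ( n³ ) directly (a small sketch, not from the original post):

```javascript
// Exact number of inner-body executions for the three nested loops above.
const exactCount = n => n * (n - 1) * (n - 2);

// The ratio to n^3 approaches 1 as n grows, so the -1 and -2
// corrections stop mattering.
console.log(exactCount(10) / 10 ** 3);     // 0.72
console.log(exactCount(1000) / 1000 ** 3); // ~0.997
```

Already at n = 1000 the exact count is within about 0.3% of n³, which is why Big O keeps only the dominant term.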


4. Sequential Loops

let count = 0, count2 = 0;
for (let i = 0; i < a.length; i++) {
    count++;
}
for (let i = 0; i < a.length; i++) {
    count2 += 2;
}

Running Time:

The running time here is O(n + n) = O(2n), which simplifies to O(n) because we ignore constants in Big O notation.


5. Constant Time Operations

let w = a[0];
let z = a[a.length - 1];
console.log(w + z);

Running Time:

The running time here is O(1). The key idea is that the running time has no dependence on ( n ). No matter how large the array is, this code will always take the same amount of time to run.


6. Loop with Constant Iterations

let c = 0;
let stop = 100;
for (let i = 0; i < stop; i++) {
    c += a[i];
}

Running Time:

Despite the loop, this code runs in O(1) time. The loop always runs exactly 100 times, no matter how large the array is (assuming the array has at least 100 entries, so a[i] is defined). Notice that a.length never appears in the code, which is what makes the running time constant.


7. Logarithmic Loop

let sum = 0;
for (let i = 1; i < a.length; i *= 2) {
    sum += a[i];
}

Running Time:

This code runs in O(log n) time. The loop variable increases exponentially: 1, 2, 4, 8, 16, 32, 64, 128, … It takes only 10 steps to reach 1000, 20 steps to reach 1 million, and 30 steps to reach 1 billion.

The number of steps to reach ( n ) is found by solving ( 2^x = n ), whose solution is log₂(n). Since changing the base of a logarithm only multiplies it by a constant factor, Big O notation drops the base and we simply write log(n).
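We can verify the step counts claimed above by counting the doubling loop's iterations and comparing with Math.log2 (a quick sketch; doublingSteps is a hypothetical helper, not part of the original code):

```javascript
// Count the iterations of a loop whose variable doubles each step.
function doublingSteps(n) {
    let steps = 0;
    for (let i = 1; i < n; i *= 2) {
        steps++;
    }
    return steps;
}

console.log(doublingSteps(1000));        // 10
console.log(doublingSteps(1000000));     // 20
console.log(Math.ceil(Math.log2(1000))); // also 10
```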


8. Nested Loops with Logarithmic Inner Loop

let n = a.length;
for (let i = 0; i < n; i++) {
    let sum = 0;
    let m = n;
    while (m > 1) {
        sum += m;
        m /= 2;
    }
}

Running Time:

This code has a running time of O(n log n). The outer loop is a standard loop contributing a factor of ( n ), while the inner while loop is logarithmic since the loop variable is halved each time. Multiplying these factors gives O(n log n).
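Counting the total work confirms the multiplication (a small sketch; nLogNWork is a hypothetical helper mirroring the loop structure above):

```javascript
// Total inner-loop iterations for the O(n log n) pattern above.
function nLogNWork(n) {
    let work = 0;
    for (let i = 0; i < n; i++) {
        let m = n;
        while (m > 1) {
            work++;
            m /= 2;
        }
    }
    return work;
}

console.log(nLogNWork(1024)); // 1024 * log2(1024) = 10240
```

For n = 1024, the inner loop runs log₂(1024) = 10 times per outer iteration, giving 1024 × 10 = 10240 total steps.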


9. Combination of Loops

let total = 0;
let i = 0;
while (i < a.length) {
    total += a[i];
    i++; // increment after reading a[i], so the first element isn't skipped
}

for (let i = 0; i < a.length / 2; i++) {
    for (let j = a.length - 1; j >= 0; j--) {
        total += a[i] * a[j];
    }
}

Running Time:

The running time is O(n²). Here's how it's calculated:

  • The while loop runs in O(n) time.
  • The nested for loops run in O(n * (n/2)) = O(n²) time.
  • Adding these together gives O(n + n²). However, we focus on the dominant term and drop the constants, resulting in O(n²).
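Tallying the two pieces separately makes the arithmetic concrete (a quick sketch for a hypothetical n; combinationSteps is not part of the original code):

```javascript
// Count the steps of the while loop and the nested loops separately.
function combinationSteps(n) {
    let whileSteps = 0;
    let i = 0;
    while (i < n) {
        i++;
        whileSteps++;
    }

    let nestedSteps = 0;
    for (let p = 0; p < n / 2; p++) {
        for (let q = n - 1; q >= 0; q--) {
            nestedSteps++;
        }
    }
    return { whileSteps, nestedSteps };
}

console.log(combinationSteps(100)); // { whileSteps: 100, nestedSteps: 5000 }
```

For n = 100, the quadratic part contributes 5000 steps against the linear part's 100, so the n² term dominates as n grows.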

Summary of Rules for Estimating Big O Running Times

  1. Constant Time – O(1): Operations that do not depend on ( n ).
  2. Linear Time – O(n): Single loops that iterate ( n ) times.
  3. Logarithmic Time – O(log n): Loops that divide or multiply the loop variable by a constant factor each iteration.
  4. Quadratic Time – O(n²): Nested loops where each loop runs ( n ) times.
  5. Cubic Time – O(n³): Three nested loops each running ( n ) times.
  6. Additive and Multiplicative Effects: When loops are sequential, their Big O times add; when nested, they multiply.

I believe that by applying these rules of thumb and analyzing code segments as shown in the examples above, you can effectively estimate the Big O running time of various algorithms.

Feel free to comment on any issues. Thank you for reading my two cents!
