What is Big O notation?
Big O notation describes how the running time of an algorithm grows as the size of its input grows. It is typically used to measure an algorithm's efficiency. And what do we mean by efficient? How quickly the algorithm can process an input of size n.
The Big-O complexity chart above is sourced from https://www.bigocheatsheet.com/.
It shows the different kinds of growth behaviour we can expect from different algorithms. Below is a brief breakdown.
Let's take n to be the size of the input (a couple of Python sketches illustrating these classes follow the list):
O(1) - Constant time: the algorithm takes the same amount of time to complete regardless of n.
O(log n) - Logarithmic time: completion time grows with the logarithm of n, so doubling the input adds only a constant amount of extra work.
O(n) - Linear time: completion time grows in direct proportion to n.
O(n log n) - Loglinear time: log n operations occur n times.
O(n^2) - Quadratic time: completion time grows with the square of n.
O(2^n) - Exponential time: completion time doubles with each additional element of input.
O(n!) - Factorial time: the number of steps grows with the factorial of n, i.e. n * (n-1) * ... * 1.
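To make these classes concrete, here is a minimal Python sketch with one representative function per class. The function names and the list-based inputs are illustrative assumptions, not taken from the chart:

```python
from itertools import permutations


def get_first(items):
    # O(1): one step no matter how long the list is.
    return items[0]


def binary_search(sorted_items, target):
    # O(log n): each comparison halves the remaining search space,
    # so doubling the input adds only one extra step.
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1


def contains(items, target):
    # O(n): in the worst case we look at every element once.
    for item in items:
        if item == target:
            return True
    return False


def has_duplicates(items):
    # O(n^2): a nested loop compares every pair of elements.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False


def fib(n):
    # O(2^n): naive recursion branches twice at every level.
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)


def all_orderings(items):
    # O(n!): generates every possible ordering of the input.
    return list(permutations(items))
```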
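For O(n log n), merge sort is the classic illustration: the list is halved log n times (the recursion depth), and each level does O(n) work to merge the halves back together. This is a generic sketch assuming a plain Python list as input, not an algorithm referenced by the chart:

```python
def merge_sort(items):
    # O(n log n): split in half log n times, merge n elements per level.
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])

    # Merge the two sorted halves: O(n) work per level.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged


print(merge_sort([5, 2, 9, 1, 7]))  # [1, 2, 5, 7, 9]
```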