Danielle Ellis

Big O Basics

The code we choose can impact the speed and performance of our program. So how do we know which algorithm is most efficient? Big O Notation, used throughout Computer Science, measures how quickly the runtime of an algorithm grows based on the size of the input to a function.

Big O describes the worst-case scenario: the maximum number of steps an algorithm might take on a given input. Big Omega, on the other hand, describes the best-case scenario: the minimum number of steps it might take.
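Linear search is a simple way to see both bounds on one algorithm. The sketch below (Python used for illustration; the function name is my own) is O(n) in the worst case, because it may scan every item, but Omega(1) in the best case, because the target might be first:

```python
def linear_search(items, target):
    """Return the index of target in items, or -1 if it is absent."""
    for i, item in enumerate(items):
        if item == target:  # best case: target is the first item -> 1 step
            return i
    return -1               # worst case: target is absent -> n steps
```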

Common runtimes from least to most efficient:

  • O(n^2): Quadratic time - as (n) grows, runtime grows with the square of (n).
  • O(n): Linear time - as (n) grows, runtime grows at the same rate.
  • O(log n): Logarithmic time - the dataset is halved at each step until (n) is found.
  • O(1): Constant time - as (n) grows, runtime is unaffected.
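One small function per runtime makes the list concrete. These are minimal Python sketches (function names are my own, chosen for illustration):

```python
def has_duplicates(items):
    """O(n^2): compares every pair of items with two nested loops."""
    return any(items[i] == items[j]
               for i in range(len(items))
               for j in range(i + 1, len(items)))

def total(items):
    """O(n): touches each of the n items exactly once."""
    result = 0
    for x in items:
        result += x
    return result

def binary_search(sorted_items, target):
    """O(log n): halves the search range on every iteration."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        if sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

def first_item(items):
    """O(1): one step no matter how large the list is."""
    return items[0]
```

Doubling the input roughly quadruples the work in `has_duplicates`, doubles it in `total`, adds a single extra step in `binary_search`, and changes nothing in `first_item`.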

Big O Complexity chart

The chart plots each runtime, with the green shaded area marking the most efficient complexities and the red shaded areas the least efficient.
