When it comes to writing efficient code, understanding how your algorithms perform as the input size grows is crucial. This is where time complexity and Big-O notation come in.
In this blog, we’ll break down the most common time complexities, from constant time to factorial time, and explain why they matter for every programmer aiming to write fast, efficient code.
What is “O” in Big-O notation?
The “O” stands for “Order of”, as in the order of growth of an algorithm. It’s part of Big-O notation, which is used to describe the upper bound (worst-case) growth rate of an algorithm’s time or space requirements as input size increases.
Computing the time complexity of code involves analyzing how the running time grows with the input size (typically denoted as n).
Here’s a structured way to approach it:
Step 01:
Identify the input. Figure out what constitutes the input and denote its size with a variable, commonly n.
function sumArray(arr) {
  let sum = 0;
  for (let i = 0; i < arr.length; i++) {
    sum += arr[i];
  }
  return sum;
}
Here, n = arr.length.
Step 02:
Count the operations inside loops. Focus on the most repeated operations:
Simple for loop:
for (let i = 0; i < n; i++) { ... }
→ O(n)
Nested loops:
for (let i = 0; i < n; i++) {
  for (let j = 0; j < n; j++) { ... }
}
→ O(n²)
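To make the nested-loop case concrete, here’s a small hypothetical example (the function name and task are my own, not from the original): counting all ordered pairs of distinct indices in an array. The inner loop runs n times for each of the outer loop’s n iterations, giving n × n iterations.

```javascript
// Hypothetical example: count ordered pairs (i, j), i ≠ j.
// Two nested loops over n elements → n * n iterations → O(n²).
function countPairs(arr) {
  let count = 0;
  for (let i = 0; i < arr.length; i++) {
    for (let j = 0; j < arr.length; j++) {
      if (i !== j) count++;
    }
  }
  return count;
}

console.log(countPairs([1, 2, 3])); // 6 ordered pairs
```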
Loop with input halving:
while (low <= high) {
  mid = Math.floor((low + high) / 2);
  ...
}
→ O(log n)
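The halving loop above is the heart of binary search. Here’s a minimal sketch (the function name and sample array are assumptions for illustration): each iteration discards half of the remaining range, so on a sorted array of n elements the loop runs at most about log₂(n) + 1 times.

```javascript
// Binary search on a sorted array: every iteration halves the
// search range, so the loop runs O(log n) times.
function binarySearch(sorted, target) {
  let low = 0;
  let high = sorted.length - 1;
  while (low <= high) {
    const mid = Math.floor((low + high) / 2); // integer midpoint
    if (sorted[mid] === target) return mid;
    if (sorted[mid] < target) low = mid + 1;  // discard left half
    else high = mid - 1;                      // discard right half
  }
  return -1; // not found
}

console.log(binarySearch([1, 3, 5, 7, 9], 7)); // 3
```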
Recursive calls:
function factorial(n) {
  if (n === 0) return 1;
  return n * factorial(n - 1);
}
→ O(n) (one recursive call per decrement of n)
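For a recursive algorithm that genuinely runs in O(n log n), consider merge sort: the input is halved at each level (about log₂(n) levels), and merging does O(n) work per level. A minimal sketch, written for illustration rather than as a production sort:

```javascript
// Merge sort: ~log₂(n) levels of recursion, O(n) merge work per
// level → O(n log n) overall.
function mergeSort(arr) {
  if (arr.length <= 1) return arr;
  const mid = Math.floor(arr.length / 2);
  const left = mergeSort(arr.slice(0, mid));
  const right = mergeSort(arr.slice(mid));
  // Merge the two sorted halves in linear time.
  const merged = [];
  let i = 0, j = 0;
  while (i < left.length && j < right.length) {
    merged.push(left[i] <= right[j] ? left[i++] : right[j++]);
  }
  return merged.concat(left.slice(i), right.slice(j));
}

console.log(mergeSort([5, 2, 4, 1, 3])); // [1, 2, 3, 4, 5]
```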
Step 03: Ignore constants and lower-order terms
- If something is O(3n + 10) → it’s O(n)
- If it’s O(n² + n) → it’s O(n²)
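As a quick illustration of dropping constants (the function below is a made-up example, not from the original): it does a constant amount of setup plus two separate, non-nested passes over the input, roughly 2n + c operations, yet its complexity is still O(n).

```javascript
// Two sequential (not nested) loops: ~2n + constant operations.
// Coefficients and constants drop out, so this is O(n), not O(2n).
function minAndMax(arr) {
  let min = Infinity;   // constant-time setup
  let max = -Infinity;
  for (const x of arr) if (x < min) min = x; // first pass: n steps
  for (const x of arr) if (x > max) max = x; // second pass: n steps
  return [min, max];
}

console.log(minAndMax([4, 1, 9, 2])); // [1, 9]
```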
Here’s the Big-O time complexities ordered from fastest to slowest (i.e., least to most complex) as input size n increases:
1️⃣ O(1) → Constant time: the fastest; doesn’t depend on input size
2️⃣ O(log n) → Logarithmic time: very efficient
3️⃣ O(n) → Linear time: grows directly with input size
4️⃣ O(n log n) → Linearithmic time: slightly slower than linear
5️⃣ O(n²) → Quadratic time: slows down fast
6️⃣ O(n³) → Cubic time: very slow
7️⃣ O(2ⁿ) → Exponential time: grows extremely fast, so not practical for large n
8️⃣ O(n!) → Factorial time: the worst; blows up even for small n
Final Ordered List: O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(n³) < O(2ⁿ) < O(n!)
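To make that ordering concrete, the sketch below prints the approximate operation counts for each complexity at a small n = 10 (these are illustrative counts, not benchmarks): notice how the exponential and factorial rows already dwarf the rest.

```javascript
// Approximate operation counts at n = 10, one row per complexity.
const n = 10;
const growth = {
  "O(1)": 1,
  "O(log n)": Math.log2(n),
  "O(n)": n,
  "O(n log n)": n * Math.log2(n),
  "O(n²)": n ** 2,
  "O(n³)": n ** 3,
  "O(2ⁿ)": 2 ** n,
  // n! computed as 1 * 2 * ... * n
  "O(n!)": [...Array(n).keys()].reduce((acc, k) => acc * (k + 1), 1),
};
for (const [name, value] of Object.entries(growth)) {
  console.log(`${name}: ${Math.round(value)}`);
}
```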
Understanding time complexity is essential for writing efficient and scalable code. By using Big-O notation, we can analyze how an algorithm’s performance changes with input size, allowing us to make informed decisions about which approach to use.
As you continue to code, keep these time complexities in mind; they’re not just theoretical concepts but practical tools for better problem-solving.
Stay updated with the latest insights and tutorials by following me on Medium, dev.io and LinkedIn. For any inquiries or questions, feel free to reach out to me via email. I’m here to assist you with any queries you may have!
Don’t miss out on future articles and development tips!