How Fast Algorithms Grow
Day 111 of 149
The Cooking Analogy
Imagine cooking dinner:
- Cook for 1 person: takes a certain amount of time
- Cook for 2 people: still about the same (just make a bigger portion)
- Cook for 100 people: now you need hours!
Some tasks barely change with more data. Others explode.
Time Complexity measures how your code slows down as data grows.
Why It Matters
Two programs can solve the same problem, but:
- Program A: about the same time when you double the work
- Program B: noticeably slower when you double the work
Small difference now, huge difference at scale!
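Here's a hypothetical Python comparison to make that concrete: both functions solve the same problem (checking a list for duplicates), but one slows down far faster as the list grows. The function names are made up for illustration.

```python
def has_duplicates_slow(items):
    # Compare every item to every other item: roughly n * n checks (O(n^2))
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicates_fast(items):
    # A set remembers what we've already seen: one pass, O(n) on average
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False
```

Both return the same answers; on a list of a million items, only the second one finishes quickly.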
The Big-O Notation
We write time complexity with "Big-O":
- O(1) - Constant: time doesn't grow with input size
  - Like getting the first item from a list
- O(n) - Linear: time grows in proportion to the data size
  - Like reading every item in a list
- O(n²) - Quadratic: time grows with the square of the data size
  - Like comparing every item to every other item
- O(log n) - Logarithmic: time grows slowly even for huge data
  - Like binary search in a sorted list (split the search space in half each time)
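A tiny Python sketch of each class, one function apiece (illustrative examples, not the only ways these complexities show up):

```python
import bisect

def first_item(items):
    # O(1): one step, no matter how long the list is
    return items[0]

def total(items):
    # O(n): touches every item exactly once
    s = 0
    for x in items:
        s += x
    return s

def all_pairs(items):
    # O(n^2): pairs every item with every item
    return [(a, b) for a in items for b in items]

def binary_search(sorted_items, target):
    # O(log n): bisect halves the search space each step
    i = bisect.bisect_left(sorted_items, target)
    return i < len(sorted_items) and sorted_items[i] == target
```

Notice that `binary_search` only works because the list is already sorted; that's what lets it discard half the remaining items at every step.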
A Simple Visual
100 items:
O(1): ~1 operation
O(log n): ~7 operations
O(n): 100 operations
O(n²): 10,000 operations
(These numbers are just to build intuition — real runtimes depend on constants, hardware, and the exact algorithm.)
See how O(n²) explodes? That's why it matters!
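You can reproduce those rough counts yourself with a few lines of Python (a sketch for intuition only; real runtimes depend on constant factors):

```python
import math

n = 100
print("O(1):     ", 1)                          # constant
print("O(log n): ", math.ceil(math.log2(n)))    # ~7 for n = 100
print("O(n):     ", n)                          # 100
print("O(n^2):   ", n * n)                      # 10,000
```

Try bumping `n` to 1,000,000: the O(log n) row barely moves, while the O(n²) row becomes a trillion.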
The Rule of Thumb
- O(1) → Excellent
- O(log n) → Great
- O(n) → Good for most cases
- O(n²) → Slow for large data
- O(2ⁿ) → Usually too slow
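To see why O(2ⁿ) lands at "usually too slow", a classic toy example is naive recursive Fibonacci, which recomputes the same values over and over (both functions here are illustrative sketches):

```python
def fib_slow(n):
    # Each call branches into two more calls: roughly O(2^n)
    if n < 2:
        return n
    return fib_slow(n - 1) + fib_slow(n - 2)

def fib_fast(n):
    # Iterative version: a single loop, O(n)
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a
```

`fib_slow(40)` takes noticeable seconds; `fib_fast(40)` is instant, and the answers are identical.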
In One Sentence
Time Complexity tells you how much slower your code gets as the amount of data increases.
🔗 Enjoying these? Follow for daily ELI5 explanations!
Making complex tech concepts simple, one day at a time.