Hi again!
Below are clear, beginner-friendly, step-by-step examples showing how to work out common Big-O classes, using Go code snippets.
O(n) - Linear time
Example: linear search (find a value in an array).
// Linear search: O(n)
func linearSearch(nums []int, target int) int {
    for i := 0; i < len(nums); i++ { // runs up to n times
        if nums[i] == target { // constant-time comparison each iteration
            return i
        }
    }
    return -1
}
Step-by-step reasoning
- n is the input size: the length of nums.
- In the worst case (target not present, or it sits in the last position), the for loop executes n iterations.
- Each iteration does O(1) work (one comparison, possibly a return).
- Total work: n iterations × O(1) per iteration = O(1·n) = O(n). A quick usage sketch follows below.
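To make the best and worst cases concrete, here is a minimal usage sketch (the values and the demo function are hypothetical):

// Hypothetical demo of best vs. worst case for linearSearch
func demoLinearSearch() {
    nums := []int{4, 8, 15, 16, 23, 42}
    _ = linearSearch(nums, 4)  // best case: match at index 0, 1 iteration
    _ = linearSearch(nums, 42) // worst case: match at the last index, n iterations
    _ = linearSearch(nums, 7)  // worst case: no match, n iterations, returns -1
}

Big-O describes the worst case here, so linear search is O(n) even though lucky inputs finish early.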
O(log n) - Logarithmic time
Example: binary search in a sorted slice.
// Binary search: O(log n)
func binarySearch(nums []int, target int) int {
    lo, hi := 0, len(nums)-1
    for lo <= hi {
        mid := lo + (hi-lo)/2 // cut range in half
        if nums[mid] == target {
            return mid
        } else if nums[mid] < target {
            lo = mid + 1
        } else {
            hi = mid - 1
        }
    }
    return -1
}
Step-by-step reasoning
- Each loop iteration halves the search range.
- After k iterations the remaining range size is about n / 2^k.
- It stops when n / 2^k ≤ 1 → 2^k ≥ n → k ≥ log₂(n).
- So number of iterations ~ ⌈log₂ n⌉ → O(log n).
- Note: base of the logarithm doesn't matter for Big-O (log₂, log₁₀, ln are all Θ(log n)).
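You can verify the iteration count directly; a minimal sketch (halvings is a hypothetical helper, separate from the search itself):

// Hypothetical helper: how many times can n be halved before the
// range shrinks to a single element? This equals the worst-case
// loop count of binary search.
func halvings(n int) int {
    steps := 0
    for n > 1 {
        n /= 2
        steps++
    }
    return steps
}

For example, halvings(16) returns 4 and halvings(1000000) returns 19, matching log₂(n) up to rounding.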
O(n log n) - Typical of efficient sorts
Example: merge sort (divide & conquer).
// Merge sort: O(n log n)
func mergeSort(a []int) []int {
    if len(a) <= 1 {
        return a
    }
    mid := len(a) / 2
    left := mergeSort(a[:mid])
    right := mergeSort(a[mid:])
    return merge(left, right)
}
func merge(left, right []int) []int {
    res := make([]int, 0, len(left)+len(right))
    i, j := 0, 0
    for i < len(left) && j < len(right) {
        if left[i] <= right[j] {
            res = append(res, left[i])
            i++
        } else {
            res = append(res, right[j])
            j++
        }
    }
    // append remaining elements (at most one of these is non-empty)
    res = append(res, left[i:]...)
    res = append(res, right[j:]...)
    return res
}
Step-by-step reasoning
- Recurrence: T(n) = 2·T(n/2) + O(n) (because we sort two halves then merge in O(n)).
- Using the Master Theorem or summing the cost level by level:
- Level 0 (root): cost O(n) to merge at that level,
- Level 1: two merges of size n/2 → total O(n),
- Level 2: four merges of size n/4 → total O(n),
- ... there are about log₂(n) levels.
Total ≈ n * (log₂(n) + 1) → O(n log n).
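You can check that total numerically; a minimal sketch (mergeCost is a hypothetical helper that mirrors the recurrence without sorting anything):

// Hypothetical cost model: follows T(n) = T(n/2) + T(n - n/2) + n,
// i.e. sort two halves, then pay n to merge them.
func mergeCost(n int) int {
    if n <= 1 {
        return 0 // a single element needs no merging
    }
    mid := n / 2
    return mergeCost(mid) + mergeCost(n-mid) + n
}

mergeCost(8) returns 24 = 8·3 = n·log₂(n), and mergeCost(1024) returns 10240 = 1024·10.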
O(n²) - Quadratic time
Example: printing every pair (nested loops), or naive pairwise compare.
// Print every ordered pair: O(n^2)
func printPairs(nums []int) {
    n := len(nums)
    for i := 0; i < n; i++ {
        for j := 0; j < n; j++ {
            // constant-time work inside
            _ = nums[i] + nums[j] // pretend we do O(1) work
        }
    }
}
Step-by-step reasoning
- Outer loop runs n times.
- For every outer iteration, inner loop runs n times.
- Total iterations = n * n = n².
- Drop constants and lower-order terms → O(n²).
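A common variation starts the inner loop at i+1 so each unordered pair is visited once; that is n(n−1)/2 iterations, which is still O(n²) once constants and lower-order terms are dropped. A minimal sketch:

// Each unordered pair once: n(n-1)/2 iterations, still O(n^2)
func printUniquePairs(nums []int) {
    n := len(nums)
    for i := 0; i < n; i++ {
        for j := i + 1; j < n; j++ { // inner loop shrinks by one each pass
            _ = nums[i] + nums[j] // pretend we do O(1) work
        }
    }
}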
O(2ⁿ) - Exponential time (base 2)
Example: generate all subsets (the power set). Each element is either included or excluded → 2 choices per element.
// Generate all subsets: O(2^n)
func subsets(nums []int) [][]int {
    res := [][]int{}
    var helper func(int, []int)
    helper = func(i int, cur []int) {
        if i == len(nums) {
            tmp := append([]int(nil), cur...)
            res = append(res, tmp)
            return
        }
        // exclude current
        helper(i+1, cur)
        // include current
        helper(i+1, append(cur, nums[i]))
    }
    helper(0, []int{})
    return res
}
Step-by-step reasoning
- At each index i you branch into two calls (include / exclude).
- Tree depth = n, branching factor = 2 → number of leaves = 2ⁿ.
- You do at least a constant amount of work per leaf (plus the cost of building each subset).
- Total work ≈ c · 2ⁿ → O(2ⁿ).
- Note: Some recursive exponential algorithms (like naive Fibonacci) produce growth ~φⁿ where φ ≈ 1.618; often people use O(2ⁿ) to describe general exponential growth class.
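For reference, here is that naive Fibonacci, a minimal sketch:

// Naive Fibonacci: two recursive calls per step.
// The call tree grows like φⁿ (φ ≈ 1.618), i.e. exponentially.
func fib(n int) int {
    if n < 2 {
        return n
    }
    return fib(n-1) + fib(n-2)
}

Each call spawns two more, so the number of calls roughly doubles as n grows, which is why it is lumped into the exponential class.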
O(n!) - Factorial time
Example: generate all permutations of n distinct elements.
// Generate all permutations: O(n!)
func permutations(nums []int) [][]int {
    a := make([]int, len(nums))
    copy(a, nums)
    res := [][]int{}
    var helper func(int)
    helper = func(k int) {
        if k == len(a)-1 {
            tmp := append([]int(nil), a...)
            res = append(res, tmp)
            return
        }
        for i := k; i < len(a); i++ {
            a[k], a[i] = a[i], a[k] // swap
            helper(k + 1)
            a[k], a[i] = a[i], a[k] // swap back
        }
    }
    helper(0)
    return res
}
Step-by-step reasoning
- Number of permutations of n distinct items is n! = n × (n−1) × (n−2) × ... × 1.
- The recursion produces all n! permutations.
- If you also consider the cost to copy/print each permutation (of length n) then total time is O(n · n!) because for each of the n! permutations we may spend O(n) to copy or output it.
- Even before counting the copy cost, the combinatorial explosion alone makes factorial-time algorithms impractical beyond very small n (n = 10 already gives 3,628,800 permutations).
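To get a feel for the growth, a minimal sketch (factorial is a hypothetical helper, just for intuition):

// Hypothetical helper: compute n! iteratively.
func factorial(n int) int {
    res := 1
    for i := 2; i <= n; i++ {
        res *= i
    }
    return res
}

factorial(10) returns 3628800 (the count above), and factorial(20) is already about 2.4×10¹⁸; n = 21 overflows a 64-bit int.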
O(1) - Constant time
Example: Get the first element of a slice
// O(1) - accessing the first element
func getFirst(nums []int) int {
    if len(nums) == 0 {
        return -1
    }
    return nums[0] // single operation
}
Step-by-step reasoning
- Let n = len(nums).
- The function performs:
  - one check: len(nums) == 0 → O(1)
  - one index access: nums[0] → O(1)
  - one return → O(1)
- Total operations = 3 (or some small constant c), and that count does NOT depend on n.
- Therefore the runtime is O(1): it stays the same whether nums has 1 element or 1 million elements.
Key takeaway: constant-time algorithms do a fixed number of steps regardless of input size.
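As a quick illustration (a hypothetical demo function):

// Hypothetical demo: getFirst does the same fixed work regardless of size.
func demoConstantTime() {
    small := []int{1, 2, 3}
    big := make([]int, 3000000)
    _ = getFirst(small) // one check + one access
    _ = getFirst(big)   // exactly the same amount of work
}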
Quick rules / takeaways (for beginners)
- Drop constants: c·n → O(n). 5n² → O(n²).
- Drop lower-order terms: (n² + n)/2 → O(n²).
- Log base doesn't matter: log₂ n, log₁₀ n, ln n are all Θ(log n).
- Multiplicative rule: nested loops (n × n) → O(n²).
- Additive rule: if you do O(n) work and then O(n²) work, the total is O(n + n²) → O(n²); keep the dominant term (see the sketch after this list).
- Recursive divide & conquer often gives O(n log n) or O(log n) depending on splitting and merging (use Master Theorem or count cost per level).
- Output-size matters: generating all combinations/subsets/permutations means runtime grows at least as fast as the number of outputs (2ⁿ, n!, ...).
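Here is a minimal sketch of the additive rule in practice (sumThenPairs is a hypothetical function):

// O(n) pass followed by O(n^2) nested loops → O(n + n^2) = O(n^2)
func sumThenPairs(nums []int) int {
    total := 0
    for _, v := range nums { // O(n) pass
        total += v
    }
    count := 0
    for i := range nums { // O(n^2): n × n iterations
        for j := range nums {
            if nums[i]+nums[j] == total {
                count++
            }
        }
    }
    return count
}

The n² term dominates, so the linear pass disappears from the final Big-O.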