Note: Frequency map of characters is O(1) space — alphabet size is fixed (26 letters max).
## Linked Lists

| Operation | Time | Space |
| --- | --- | --- |
| Access by index | O(n) | O(1) |
| Search | O(n) | O(1) |
| Insert at head | O(1) | O(1) |
| Insert at tail (no tail ptr) | O(n) | O(1) |
| Insert at tail (with tail ptr) | O(1) | O(1) |
| Delete at head | O(1) | O(1) |
| Delete at tail | O(n) | O(1) |
| Reverse | O(n) | O(1) |
| Detect cycle | O(n) | O(1) |
| Find middle | O(n) | O(1) |
| Merge two sorted lists | O(n+m) | O(1) |
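As a quick sketch of the "Find middle" row above: the fast/slow pointer technique runs in O(n) time with O(1) extra space. The `node` helper below is illustrative, not part of the article.

```javascript
// Illustrative singly linked list node (plain object, not a library type).
const node = (val, next = null) => ({ val, next });

// Fast/slow pointers: slow moves 1 step, fast moves 2, so slow sits on
// the middle node when fast reaches the end. O(n) time, O(1) space.
function findMiddle(head) {
  let slow = head, fast = head;
  while (fast !== null && fast.next !== null) {
    slow = slow.next;
    fast = fast.next.next;
  }
  return slow;
}

// 1 → 2 → 3 → 4 → 5
const head = node(1, node(2, node(3, node(4, node(5)))));
```

The same two-pointer skeleton, with an equality check instead of a null check, is the basis of the "Detect cycle" row (Floyd's algorithm).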
## Recursion

- Time = depth × work per level
- Space = depth (call stack frames only)
- Branches affect TIME, not space.
- Space = depth regardless of branches.

| Pattern | Time | Space |
| --- | --- | --- |
| Linear — subtract 1 each call | O(n) | O(n) |
| Halving — divide by 2 each call | O(log n) | O(log n) |
| 2 branches, depth n | O(2ⁿ) | O(n) |
| 3 branches, depth n | O(3ⁿ) | O(n) |
| 2 branches, halving input | O(n) | O(log n) |
| Merge sort (2 branches, halving, O(n) work/level) | O(n log n) | O(n) |
| Fast power (halving exponent) | O(log n) | O(log n) |
Key Rule: Subtracting 1 → O(n) depth. Dividing by k → O(log n) depth.
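A minimal sketch of the "Fast power" row: halving the exponent gives O(log n) depth, and one recursive call per level keeps the stack at O(log n) too.

```javascript
// Fast power: halve the exponent each call → O(log n) depth.
// One recursive call per level → O(log n) call-stack space.
function fastPow(base, exp) {
  if (exp === 0) return 1;
  const half = fastPow(base, Math.floor(exp / 2)); // divide by 2 → log n
  return exp % 2 === 0 ? half * half : half * half * base;
}
```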
## Sorting Algorithms

| Algorithm | Best | Average | Worst | Space |
| --- | --- | --- | --- | --- |
| Bubble Sort | O(n) | O(n²) | O(n²) | O(1) |
| Selection Sort | O(n²) | O(n²) | O(n²) | O(1) |
| Insertion Sort | O(n) | O(n²) | O(n²) | O(1) |
| Merge Sort | O(n log n) | O(n log n) | O(n log n) | O(n) |
| Quick Sort | O(n log n) | O(n log n) | O(n²) | O(log n) |
| Heap Sort | O(n log n) | O(n log n) | O(n log n) | O(1) |
| Tim Sort (JS default) | O(n) | O(n log n) | O(n log n) | O(n) |
## Trees

- h = height
- Balanced tree → h = log n
- Skewed tree → h = n
- Always state both cases in interviews.

| Operation | Time | Space |
| --- | --- | --- |
| DFS traversal (any) | O(n) | O(h) |
| BFS level order | O(n) | O(n) |
| BST search (balanced) | O(log n) | O(log n) |
| BST search (skewed) | O(n) | O(n) |
| BST insert (balanced) | O(log n) | O(log n) |
| BST insert (skewed) | O(n) | O(n) |
| Max depth | O(n) | O(h) |
| Lowest common ancestor | O(n) | O(h) |

BFS space = O(n) because the last level holds ~n/2 nodes (the widest point). DFS space = O(h) because only one root→leaf path is on the stack at once.
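The "Max depth" row is a one-liner worth memorizing: every node is visited once (O(n) time) while the call stack only ever holds one root→leaf path (O(h) space). The `t` node helper is illustrative.

```javascript
// Max depth via recursive DFS: O(n) time, O(h) call-stack space.
function maxDepth(root) {
  if (root === null) return 0;
  return 1 + Math.max(maxDepth(root.left), maxDepth(root.right));
}

// Illustrative tree-node helper (plain objects).
const t = (val, left = null, right = null) => ({ val, left, right });
```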
## Heap (Priority Queue)

- Always balanced → height = log n
- Insert → add at bottom → bubble UP → O(log n)
- Remove → swap root with last → bubble DOWN → O(log n)

| Operation | Time | Space |
| --- | --- | --- |
| Insert | O(log n) | O(1) |
| Remove top (min/max) | O(log n) | O(1) |
| Peek top | O(1) | O(1) |
| Search | O(n) | O(1) |
| Build heap from array | O(n) | O(1) |
| Top K elements | O(n log k) | O(k) |
Top K pattern: Keep heap of size k → each insert/remove is O(log k) not O(log n).
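JavaScript has no built-in priority queue, so this sketch assumes a hand-rolled binary min-heap (the `MinHeap` class and `topK` function are illustrations, not standard APIs). It shows the Top K pattern: cap the heap at size k so every push/pop costs O(log k), not O(log n).

```javascript
// Minimal binary min-heap over an array (JS has no built-in heap).
class MinHeap {
  constructor() { this.a = []; }
  get size() { return this.a.length; }
  peek() { return this.a[0]; }
  push(x) {                            // add at bottom, bubble UP: O(log n)
    const a = this.a;
    a.push(x);
    let i = a.length - 1;
    while (i > 0) {
      const p = (i - 1) >> 1;          // parent index
      if (a[p] <= a[i]) break;
      [a[p], a[i]] = [a[i], a[p]];
      i = p;
    }
  }
  pop() {                              // swap root with last, bubble DOWN: O(log n)
    const a = this.a, top = a[0], last = a.pop();
    if (a.length > 0) {
      a[0] = last;
      let i = 0;
      for (;;) {
        const l = 2 * i + 1, r = l + 1;
        let s = i;                     // smallest of node and children
        if (l < a.length && a[l] < a[s]) s = l;
        if (r < a.length && a[r] < a[s]) s = r;
        if (s === i) break;
        [a[s], a[i]] = [a[i], a[s]];
        i = s;
      }
    }
    return top;
  }
}

// Top K largest: keep a min-heap of size k → O(n log k) time, O(k) space.
function topK(nums, k) {
  const h = new MinHeap();
  for (const x of nums) {
    h.push(x);
    if (h.size > k) h.pop();           // evict the smallest of the k kept
  }
  return h.a.slice().sort((x, y) => y - x);
}
```

The key design point: the heap root is always the *smallest* of the k elements kept, so any incoming value larger than it earns a spot and the root is evicted.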
## Graphs

- V = number of vertices (nodes)
- E = number of edges

### Storage

| Structure | Space | Check edge | Get neighbors |
| --- | --- | --- | --- |
| Adjacency List | O(V+E) | O(degree) | O(degree) |
| Adjacency Matrix | O(V²) | O(1) | O(V) |
### Traversal & Algorithms

| Algorithm | Time | Space |
| --- | --- | --- |
| DFS | O(V+E) | O(V) |
| BFS | O(V+E) | O(V) |
| Grid DFS/BFS (m×n grid) | O(m×n) | O(m×n) |
| Cycle detection (Union Find) | O(E) | O(V) |
| Connected components (Union Find) | O(E) | O(V) |
| Dijkstra (min heap) | O(E log V) | O(E) |
### Union Find (Disjoint Set)

| Operation | Time | Space |
| --- | --- | --- |
| find (with path compression) | O(α(n)) ≈ O(1) | O(V) |
| union (with rank) | O(α(n)) ≈ O(1) | O(V) |
α(n) = inverse Ackermann — practically constant for all real inputs.
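The two rows above can be sketched as follows (class name is illustrative). Path compression flattens the tree on every `find`; union by rank keeps it shallow on every merge.

```javascript
// Union-Find with path compression + union by rank:
// find/union amortize to O(α(n)), effectively constant.
class UnionFind {
  constructor(n) {
    this.parent = Array.from({ length: n }, (_, i) => i);
    this.rank = new Array(n).fill(0);
  }
  find(x) {
    if (this.parent[x] !== x) {
      this.parent[x] = this.find(this.parent[x]); // path compression
    }
    return this.parent[x];
  }
  union(a, b) {
    const ra = this.find(a), rb = this.find(b);
    if (ra === rb) return false;       // already connected → adding edge = cycle
    if (this.rank[ra] < this.rank[rb]) this.parent[ra] = rb;
    else if (this.rank[ra] > this.rank[rb]) this.parent[rb] = ra;
    else { this.parent[rb] = ra; this.rank[ra]++; }
    return true;
  }
}
```

The `false` return from `union` is exactly the cycle-detection signal from the traversal table: the edge's endpoints were already in the same set.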
Graph traversal rule: Time = O(V+E) because every vertex is visited once (V) and every edge is processed once (E).
## Data Structures — Quick Reference

| Structure | Access | Search | Insert | Delete | Space |
| --- | --- | --- | --- | --- | --- |
| Array | O(1) | O(n) | O(n) | O(n) | O(n) |
| Linked List | O(n) | O(n) | O(1) head | O(1) head | O(n) |
| Hash Map | O(1) avg | O(1) avg | O(1) avg | O(1) avg | O(n) |
| BST (balanced) | O(log n) | O(log n) | O(log n) | O(log n) | O(n) |
| Heap | — | O(n) | O(log n) | O(log n) | O(n) |
| Stack / Queue | O(n) | O(n) | O(1) | O(1) | O(n) |
## Dynamic Programming

- DP = Recursion + Memoization (top-down) OR Iteration + Tabulation (bottom-up)
- Time = number of unique subproblems × work per subproblem
- Space = number of unique subproblems (memo/dp table)

| Pattern | Time | Space | Space (optimized) |
| --- | --- | --- | --- |
| 1D DP (fibonacci, climbing stairs) | O(n) | O(n) | O(1) |
| 2D DP (grid, strings) | O(n×m) | O(n×m) | O(n) or O(m) |
| Knapsack (0/1) | O(n×W) | O(n×W) | O(W) |
| Longest Common Subsequence | O(n×m) | O(n×m) | O(min(n,m)) |
| Longest Increasing Subsequence | O(n²) | O(n) | — |
| LIS (binary search) | O(n log n) | O(n) | — |
| Matrix chain / interval DP | O(n³) | O(n²) | — |
| Coin change | O(n×amount) | O(amount) | — |
Key Rule: Without memoization, exponential. With memoization, each subproblem solved once → polynomial. Space optimization: 2D DP can often use a single row/column if only previous row is needed.
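A sketch of the first table row, with the space optimization applied: Fibonacci needs only the previous two states, so the O(n) dp array collapses to two variables.

```javascript
// 1D DP, space-optimized: O(n) time, O(1) space.
// Without memoization the naive recursion is O(2ⁿ); here each
// subproblem is computed exactly once.
function fib(n) {
  if (n < 2) return n;
  let prev = 0, curr = 1;
  for (let i = 2; i <= n; i++) {
    [prev, curr] = [curr, prev + curr]; // keep only the last two states
  }
  return curr;
}
```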
## Backtracking

- Backtracking = DFS on decision tree + pruning
- Time = O(branches^depth) — worst case explores all paths
- Space = O(depth) — call stack + current path

| Problem | Time | Space |
| --- | --- | --- |
| Subsets (power set) | O(2ⁿ) | O(n) |
| Permutations | O(n!) | O(n) |
| Combinations (nCk) | O(2ⁿ) | O(n) |
| N-Queens | O(n!) | O(n) |
| Sudoku solver | O(9^(n²)) | O(n²) |
| Word search in grid | O(m×n×4^L) | O(L) |
L = word length, n = board size. Pruning reduces the actual runtime significantly, but the worst case stays the same.
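The "Subsets" row as code, showing the classic choose/explore/un-choose skeleton: two choices per element give 2ⁿ paths, while the recursion depth plus the current path keep extra space at O(n).

```javascript
// Subsets (power set) via backtracking.
function subsets(nums) {
  const result = [], path = [];
  function backtrack(start) {
    result.push([...path]);            // record the current subset
    for (let i = start; i < nums.length; i++) {
      path.push(nums[i]);              // choose
      backtrack(i + 1);                // explore
      path.pop();                      // un-choose (backtrack)
    }
  }
  backtrack(0);
  return result;
}
```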
## Two Pointers & Sliding Window

| Pattern | Time | Space |
| --- | --- | --- |
| Two pointers (sorted array) | O(n) | O(1) |
| Two pointers (linked list) | O(n) | O(1) |
| Sliding window (fixed size) | O(n) | O(1) |
| Sliding window (variable size) | O(n) | O(k) |
| Sliding window (with freq map) | O(n) | O(k) |
Key Rule: Sliding window converts O(n²) brute force to O(n) by avoiding recomputation.
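A fixed-size window sketch showing where the savings come from (function name is illustrative): instead of re-summing each window (O(n·k)), add the entering element and drop the leaving one.

```javascript
// Max sum of any window of size k: O(n) time, O(1) space.
function maxWindowSum(nums, k) {
  let sum = 0;
  for (let i = 0; i < k; i++) sum += nums[i]; // build the first window
  let best = sum;
  for (let i = k; i < nums.length; i++) {
    sum += nums[i] - nums[i - k];             // slide: add new, drop old
    best = Math.max(best, sum);
  }
  return best;
}
```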
## Binary Search

- Always applied on SORTED input or a monotonic condition.
- Cuts the search space in HALF each step → O(log n).

| Pattern | Time | Space |
| --- | --- | --- |
| Classic binary search | O(log n) | O(1) |
| Binary search (recursive) | O(log n) | O(log n) |
| Search in rotated array | O(log n) | O(1) |
| Binary search on answer | O(log n × f(n)) | O(1) |
| Find peak element | O(log n) | O(1) |
Binary search on answer: When you binary search on the result range, multiply by cost of checking each candidate f(n).
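The classic iterative version from the first row. It is the O(1)-space variant; the recursive version pays O(log n) stack frames for the same logic.

```javascript
// Classic binary search on a sorted array: O(log n) time, O(1) space.
function binarySearch(arr, target) {
  let lo = 0, hi = arr.length - 1;
  while (lo <= hi) {
    const mid = (lo + hi) >> 1;        // halve the search space
    if (arr[mid] === target) return mid;
    if (arr[mid] < target) lo = mid + 1;
    else hi = mid - 1;
  }
  return -1;                           // not found
}
```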
## Stack & Queue

| Operation | Time | Space |
| --- | --- | --- |
| Push / Pop / Peek (stack) | O(1) | O(n) |
| Enqueue / Dequeue (queue) | O(1) | O(n) |
| Monotonic stack (next greater) | O(n) amortized | O(n) |
| Valid parentheses | O(n) | O(n) |
| Min stack (get min in O(1)) | O(1) per op | O(n) |
| Largest rectangle in histogram | O(n) | O(n) |
Monotonic stack: Each element pushed/popped at most once → O(n) total even with inner loop.
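A next-greater-element sketch making that amortized argument concrete: the inner `while` looks nested, but each index enters and leaves the stack at most once, so the total work is O(n).

```javascript
// Next greater element to the right of each position; -1 if none.
// Monotonic (decreasing) stack of indices: O(n) amortized time.
function nextGreater(nums) {
  const res = new Array(nums.length).fill(-1);
  const stack = [];                    // indices still waiting for an answer
  for (let i = 0; i < nums.length; i++) {
    while (stack.length && nums[stack[stack.length - 1]] < nums[i]) {
      res[stack.pop()] = nums[i];      // nums[i] is their next greater
    }
    stack.push(i);
  }
  return res;
}
```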
## Trie (Prefix Tree)

- Each node = one character
- n = number of words, L = average word length

| Operation | Time | Space |
| --- | --- | --- |
| Insert word | O(L) | O(n×L) total |
| Search word | O(L) | O(1) |
| Search prefix | O(L) | O(1) |
| Delete word | O(L) | O(1) |
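A minimal trie sketch using plain objects as nodes (illustrative, not a library API). Every operation walks one character per step, so the cost is O(L) for a word of length L, regardless of how many words are stored.

```javascript
// Minimal trie: each node is a plain object keyed by character.
class Trie {
  constructor() { this.root = {}; }
  insert(word) {                       // O(L)
    let node = this.root;
    for (const ch of word) node = node[ch] ??= {}; // create missing edge
    node.isWord = true;                // mark end of a full word
  }
  search(word) {                       // O(L): must end on a word marker
    const node = this.walk(word);
    return node !== undefined && node.isWord === true;
  }
  startsWith(prefix) {                 // O(L): any reachable node counts
    return this.walk(prefix) !== undefined;
  }
  walk(s) {                           // shared descent helper
    let node = this.root;
    for (const ch of s) {
      node = node[ch];
      if (node === undefined) return undefined;
    }
    return node;
  }
}
```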
## Bit Manipulation

| Operation | Time | Space |
| --- | --- | --- |
| Check bit i | O(1) | O(1) |
| Set / clear / toggle bit | O(1) | O(1) |
| Count set bits (naive) | O(log n) | O(1) |
| Count set bits (Brian Kernighan) | O(set bits) | O(1) |
| XOR all elements (find unique) | O(n) | O(1) |
| Power of 2 check (n & n-1 == 0) | O(1) | O(1) |
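The Brian Kernighan row as code: `n & (n - 1)` clears the lowest set bit, so the loop runs once per set bit instead of once per bit position.

```javascript
// Count set bits with Brian Kernighan's trick: O(set bits) time.
function countSetBits(n) {
  let count = 0;
  while (n !== 0) {
    n &= n - 1;  // drop the lowest set bit
    count++;
  }
  return count;
}
```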
## Mental Model — Quick Lookup

- Halving input each step? → O(log n)
- One loop? → O(n)
- Loop inside loop? → O(n²)
- Branching recursion (2 branches)? → O(2ⁿ) time, O(n) space
- Hash map lookup? → O(1)
- Sorting? → at best O(n log n)
- Tree/graph traversal? → O(n) or O(V+E)
- Heap insert/remove? → O(log n)
- Top K with heap? → O(n log k)
- Dijkstra? → O(E log V)
- DP (unique subproblems)? → O(states × work per state)
- Backtracking (permutations)? → O(n!)
- Backtracking (subsets)? → O(2ⁿ)
- Sliding window? → O(n)
- Monotonic stack? → O(n) amortized
- Trie insert/search? → O(L) where L = word length