You already know recursion. Here's the one idea that unlocks everything else.
1 — The Gap Nobody Talks About
You understand recursion. You've done tree traversals, written Fibonacci, maybe even memoized things. Recursive functions feel natural — break the problem down, solve the smaller version, build back up.
But then someone says "generate all permutations" or "find all valid partitions" and something shifts. You vaguely know the shape of the answer. You sit down to write it. And the code doesn't come.
It's not that you're missing a data structure or a formula. The gap is conceptual — and it's smaller than you think.
Recursion computes. Backtracking explores. Pure recursion commits to a path and returns a value. Backtracking explores a path, and if it doesn't work out, undoes the choice and tries another. That undo step is the entire difference. Once you internalize it, the code starts writing itself.
To make the idea clearer: imagine you're in a maze. The goal isn't to escape, it's to find a cat stranded somewhere inside. That subtle shift in goals probably made you think "cool, I'll just explore all the paths then." That's the crux of the difference between recursion and backtracking. Now extend the example: what if you had to give me the exact sequence of turns for every possible path to the cat?
2 — How to Know You're Looking at a Backtracking Problem
Before writing a single line of code, you need to recognize what kind of problem you're facing.
Backtracking signals:
- The problem asks for all solutions, not just one
- You're building something incrementally and need to abandon partial builds
- There's a constraint that can fail mid-way through construction
- Keywords: generate all, find all permutations, all valid combinations, all ways to partition
Pure recursion signals:
- You need a single answer
- Clear top-down decomposition — the answer is defined in terms of the same function on a smaller input
- Nothing is shared between branches, no exploration of alternatives
Key question to ask yourself: Am I computing a value — or am I exploring a space of possibilities? If you're exploring, you almost certainly need backtracking.
Remember the maze — if you just needed the cat's coordinates, you'd stop at the first path that works. But returning all possible paths means exploring every route — and when a path leads nowhere, you backtrack to the last crossroads and try another branch.
3 — The Anatomy of Backtracking
Every backtracking solution follows the same skeleton:
function backtrack(state, choices):
    // base case: we've built a complete valid state
    if state is complete:
        collect or return state
        return
    for each choice in choices:
        make choice                               // modify state
        backtrack(updated state, remaining choices)
        undo choice                               // ← restore state — this is backtracking
Three steps repeat for every choice: make, recurse, undo. The undo step is what separates backtracking from plain recursion. Without it, you aren't exploring alternatives — you're contaminating future branches with the choices of past ones.
What "make" and "undo" look like concretely depends on the problem — adding and removing from a list, marking and unmarking a cell — but the pattern is always the same.
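To make the skeleton concrete, here is a minimal runnable sketch in Python that instantiates it on a toy problem (generating all bit strings of a given length). The function and variable names are my own, not from any standard library:

```python
def bitstrings(n):
    """All bit strings of length n, via the make/recurse/undo skeleton."""
    result = []
    current = []

    def backtrack():
        if len(current) == n:          # base case: state is complete
            result.append("".join(current))
            return
        for choice in ("0", "1"):
            current.append(choice)     # make choice
            backtrack()                # recurse
            current.pop()              # undo choice, restoring state

    backtrack()
    return result

print(bitstrings(2))  # ['00', '01', '10', '11']
```

Swap the choices and the completeness test and the same three lines (append, recurse, pop) drive every problem in this article.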
4 — Recursion vs Backtracking — Side by Side
The best way to understand backtracking isn't to study it in isolation — it's to watch pure recursion fail at a problem that needs it, then see exactly how backtracking fixes the breakage.
Example 1 — Pure recursion succeeds: Count all subsets
How many subsets does [1, 2, 3] have? For each element, we either include it or we don't:
function countSubsets(index):
    if index == length:
        return 1
    return countSubsets(index + 1)    // exclude
         + countSubsets(index + 1)    // include

Notice what we're not doing. We're not building anything. No shared state between calls. Each recursive call lives in its own world and returns a number upward. Both branches are completely independent — they never interfere with each other.
When recursion works cleanly, it's usually because each call is stateless — it receives inputs, does work, returns a value. Nothing is shared.
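As a runnable version of the counting example (a direct Python translation of the pseudocode above; the name `count_subsets` is my own):

```python
def count_subsets(nums):
    """Count subsets of nums by pure recursion: no shared state, no undo."""
    def count(index):
        if index == len(nums):
            return 1                                    # one complete subset decided
        return count(index + 1) + count(index + 1)      # exclude + include
    return count(0)

print(count_subsets([1, 2, 3]))  # 8
```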
Example 2 — Pure recursion breaks: Generate all subsets
Now change one thing. Instead of counting, collect the actual subsets:
current = []
result = []
function findSubsets(index):
    if index == length:
        result.add(current)    // feels right. it isn't.
        return
    current.add(nums[index])
    findSubsets(index + 1)     // "include" branch
    findSubsets(index + 1)     // "exclude" branch — but current still has nums[index]!
Let's trace through what actually happens for [1, 2]:
findSubsets(0)
  → add 1. current = [1]
  → findSubsets(1)                      // include branch
      → add 2. current = [1, 2]
      → base case: result = [[1, 2]] ✓
      → findSubsets(2)                  // "exclude 2" branch
          → base case: current is STILL [1, 2]
            result = [[1, 2], [1, 2]] ✗
  → findSubsets(1)                      // "exclude 1" branch
      → current is STILL [1, 2] from before...

The "exclude" branch never actually excluded anything. It inherited the dirty state left behind by the "include" branch. Both branches share the same current, and there's no mechanism to clean it up between them.
The diagnosis: The recursion structure is correct. The problem is shared mutable state. Pure recursion has no mechanism to undo that.
Think of it this way — pure recursion is like each call working on its own notepad. The moment you introduce a shared structure like current, you've moved to a shared whiteboard. Every call reads and writes to the same surface. Without a rule that says "erase before the next person writes", it's chaos. Backtracking is that rule.
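If you want to watch the contamination happen, here is a runnable Python version of the broken code above. (It appends a snapshot at the base case purely so the printed output shows the dirty state, rather than mixing in the separate reference bug covered in the pitfalls section.)

```python
nums = [1, 2]
current, result = [], []

def find_subsets(index):
    """Broken: shares `current` across branches with no undo step."""
    if index == len(nums):
        result.append(list(current))   # snapshot, so we can see the dirty state
        return
    current.append(nums[index])
    find_subsets(index + 1)            # "include" branch
    find_subsets(index + 1)            # "exclude" branch — current is still dirty

find_subsets(0)
print(result)  # [[1, 2], [1, 2], [1, 2, 2], [1, 2, 2]] — not the 4 subsets of [1, 2]
```

Instead of the four subsets `[[1, 2], [1], [2], []]`, every branch inherits whatever the previous branch left behind.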
The fix — backtracking enters
function backtrack(index, current):
    if index == length:
        result.add(copy of current)    // snapshot, not reference
        return
    // include branch
    current.add(nums[index])
    backtrack(index + 1, current)
    current.remove(last)               // ← undo — clean the whiteboard
    // exclude branch — current is clean now
    backtrack(index + 1, current)
One line — current.remove(last) — is the entire difference. It erases the contamination before the exclude branch runs. Both branches now see a clean state.
Notice also that we collect a copy of current, not current itself. If you're wondering why — hold that thought. It's one of the most common bugs in the pitfalls section.
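For completeness, here is the fixed version as runnable Python. Apart from the snapshot (`current[:]`) and the undo (`current.pop()`), it is a line-for-line translation of the pseudocode above:

```python
def subsets(nums):
    """All subsets of nums, with the undo step restoring state between branches."""
    result = []
    current = []

    def backtrack(index):
        if index == len(nums):
            result.append(current[:])   # snapshot, not reference
            return
        current.append(nums[index])     # include nums[index]
        backtrack(index + 1)
        current.pop()                    # undo — clean the whiteboard
        backtrack(index + 1)             # exclude nums[index], on a clean state

    backtrack(0)
    return result

print(subsets([1, 2]))  # [[1, 2], [1], [2], []]
```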
5 — The Two Flavors
Not all backtracking problems have the same shape. Once you've recognized that a problem needs backtracking, the next question is: what kind?
Picking from a pool (permutations / combinations) — you have a set of items and you're choosing from them. Tracking what's been used is the key mechanic — visited array or a start index that prevents reuse.
Splitting a structure (slicing / partitioning) — you have one string or array and you're cutting it at different points. A moving start index is the key mechanic — you never "undo" the pointer, you just try different cut points from the same position.
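Here is a rough Python sketch of the two flavors side by side (function names and toy inputs are my own). Notice the structural difference: `permutations` marks items in a pool and undoes both the mark and the append, while `partitions` moves a start index forward and only undoes the cut:

```python
def permutations(nums):
    """Flavor 1 — picking from a pool: track what's been used."""
    result, current = [], []
    used = [False] * len(nums)

    def backtrack():
        if len(current) == len(nums):
            result.append(current[:])
            return
        for i in range(len(nums)):
            if used[i]:
                continue
            used[i] = True                # make: take nums[i] from the pool
            current.append(nums[i])
            backtrack()
            current.pop()                 # undo both marks
            used[i] = False

    backtrack()
    return result


def partitions(s):
    """Flavor 2 — splitting a structure: a moving start index."""
    result, current = [], []

    def backtrack(start):
        if start == len(s):               # consumed the whole string
            result.append(current[:])
            return
        for end in range(start + 1, len(s) + 1):
            current.append(s[start:end])  # make: cut off the piece s[start:end]
            backtrack(end)                # next piece starts where this one ends
            current.pop()                 # undo the cut

    backtrack(0)
    return result

print(permutations([1, 2]))  # [[1, 2], [2, 1]]
print(partitions("ab"))      # [['a', 'b'], ['ab']]
```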
6 — Common Mistakes and Pitfalls
These are the bugs that appear even when you understand backtracking conceptually. Each one is subtle enough to survive a quick read of the code.
1 — Not restoring state (the bleed bug)
You add to current inside the loop but forget to remove afterward. The state from one branch bleeds into the next. Every subsequent branch starts dirty.
The fix: every make step must have a matching undo step — always, unconditionally.
for each choice:
    current.add(choice)
    backtrack(...)
    current.remove(last)    // never skip this
2 — The copy trap
You collect current directly into result instead of a snapshot. Since current keeps mutating, every entry in result ends up pointing to the same final state — a list full of identical, wrong results.
result.add(current) // all entries become identical
result.add(copy of current) // correct — snapshot at this moment
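A minimal Python demonstration of the difference, stripped of any recursion:

```python
current, result_wrong, result_right = [], [], []

for x in [1, 2]:
    current.append(x)
    result_wrong.append(current)      # stores a reference to the one shared list
    result_right.append(current[:])   # stores a snapshot of this moment

print(result_wrong)  # [[1, 2], [1, 2]] — both entries are the same object
print(result_right)  # [[1], [1, 2]]
```

Every entry in `result_wrong` points at the same list, so they all show its final state.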
3 — Confusing permutation structure with partition structure
Permutation problems track which items have been used (visited array or used set). Partition problems track where you are in the structure (a start index). Using the wrong mechanic gives you either wrong results or runaway recursion.
Ask yourself: am I picking from a pool, or am I cutting a single sequence?
4 — Forgetting the base case is a collection step
In problems that ask for all solutions, the base case isn't just a termination signal — it's where you capture the result. Returning early or returning a boolean and forgetting to collect means valid solutions silently disappear.
5 — Off-by-one in slicing
In partitioning problems, the start index or slice boundary is off by one. You either process an empty prefix (producing invalid empty partitions) or skip the last valid cut point. Trace through the first and last iterations by hand before trusting the loop bounds.
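A quick way to sanity-check the bounds in Python: for a non-empty piece starting at `start`, the exclusive end index has to run from `start + 1` through `len(s)`, inclusive. A small sketch (the variable names are my own):

```python
s = "abc"
start = 0

# Correct: end runs from start + 1 to len(s) inclusive — never an empty piece.
pieces = [s[start:end] for end in range(start + 1, len(s) + 1)]
print(pieces)  # ['a', 'ab', 'abc']

# Common off-by-one: range(start, len(s)) yields an empty prefix and drops 'abc'.
wrong = [s[start:end] for end in range(start, len(s))]
print(wrong)   # ['', 'a', 'ab']
```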
7 — A Mental Checklist
- Does the problem ask for all solutions, not just one? → likely backtracking
- Is there an undo step? → backtracking. No undo → pure recursion.
- Am I picking from a pool or cutting a structure? → determines the flavor
- Am I collecting all results or stopping at first? → determines base case behavior
- Did I copy before collecting? → prevents the copy trap
- Does every make step have a matching undo? → prevents the bleed bug