Backtracking is DFS over a decision tree, with state mutation on entry and rollback on exit. The shape recurs: pick a candidate, mutate state, recurse, undo, try next.
The skeleton
```python
def backtrack(state):
    if is_terminal(state):
        record(state)
        return
    for choice in candidates(state):
        if not viable(choice, state):
            continue
        apply(choice, state)
        backtrack(state)
        undo(choice, state)
```
Three ingredients: a terminal predicate (when to record), a candidate generator (the branching factor), and pruning (skip impossible subtrees as early as possible).
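The skeleton specializes directly. As a minimal sketch (names like `path` and `results` are illustrative, not fixed by the skeleton), here it is instantiated for subset enumeration, where the per-index decision is include-or-skip:

```python
def subsets(nums):
    results = []
    path = []  # the current partial subset: mutated on entry, rolled back on exit

    def backtrack(i):
        if i == len(nums):           # terminal: a decision has been made for every index
            results.append(path[:])  # record a snapshot, not the live list
            return
        path.append(nums[i])         # apply: include nums[i]
        backtrack(i + 1)
        path.pop()                   # undo
        backtrack(i + 1)             # skip nums[i]

    backtrack(0)
    return results
```

Each call makes exactly one binary decision, so the tree has 2^n leaves and every leaf is recorded: b = 2, d = n.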
When it’s the right tool
- Enumerate all subsets, permutations, combinations, partitions.
- Constraint satisfaction: N-queens, Sudoku, word search on a grid.
- Find any / find best under combinatorial constraints, when the search space has obvious pruning.
It’s O(b^d) worst case (branching b, depth d) — the only thing standing between you and exponential pain is pruning.
Canonical decision shapes
| Problem family | Decision per node |
|---|---|
| Subsets | Include nums[i] or skip; recurse i+1 |
| Combination sum (with reuse) | Take nums[i] (stay at i) or skip (go i+1) |
| Permutations | Pick any unused element; mark used, recurse, unmark |
| Partitioning | Split at position j > i; recurse on suffix |
| Grid search (word) | Try each direction; mark visited, recurse, unmark |
| N-queens | Place in row r, column c; track column / diagonals |
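The permutations row is the one shape that doesn't advance a single index; it needs an explicit used-set. A sketch (the `used` bitmap and variable names are illustrative):

```python
def permutations(nums):
    results, path = [], []
    used = [False] * len(nums)

    def backtrack():
        if len(path) == len(nums):
            results.append(path[:])
            return
        for i, x in enumerate(nums):
            if used[i]:
                continue
            used[i] = True       # mark
            path.append(x)
            backtrack()
            path.pop()           # unmark on the way out
            used[i] = False

    backtrack()
    return results
```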
For “subsets with duplicates”: sort first, then skip duplicates by advancing past all equal elements on the skip branch, after returning from the include branch, not when first entering an index. That way the first copy at each depth is still explored, but no duplicate subset is ever generated.
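That rule can be sketched as follows (sorted input; the skip branch jumps past every copy of the current element):

```python
def subsets_with_dup(nums):
    nums = sorted(nums)  # duplicates must be adjacent for the skip to work
    results, path = [], []

    def backtrack(i):
        if i == len(nums):
            results.append(path[:])
            return
        path.append(nums[i])   # include branch: always explore the first copy
        backtrack(i + 1)
        path.pop()
        j = i                  # skip branch: jump past every copy of nums[i]
        while j < len(nums) and nums[j] == nums[i]:
            j += 1
        backtrack(j)

    backtrack(0)
    return results
```

Skipping past all equal elements means “exclude nums[i]” is decided once per distinct value, so no duplicate subsets appear.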
Pruning: the only thing that matters at scale
- Sort candidates so that infeasible branches are detected early (e.g. combination sum: stop expanding once nums[i] > remaining).
- Track running sums / bounds so you don’t have to recompute them on each terminal check.
- Symmetry breaking — don’t enumerate equivalent solutions twice.
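The first two points combined, sketched on combination sum with reuse (sorting makes `break` valid: once one candidate exceeds `remaining`, every later one does too):

```python
def combination_sum(nums, target):
    nums = sorted(nums)
    results, path = [], []

    def backtrack(i, remaining):   # remaining is the running bound: no re-summing
        if remaining == 0:
            results.append(path[:])
            return
        for j in range(i, len(nums)):
            if nums[j] > remaining:
                break              # sorted, so prune the rest of this level too
            path.append(nums[j])
            backtrack(j, remaining - nums[j])  # stay at j: reuse allowed
            path.pop()

    backtrack(0, target)
    return results
```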
For N-queens specifically, three sets are the trick: occupied columns, occupied positive diagonals (r+c), occupied negative diagonals (r-c). Each check is O(1).
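A sketch of the three-set version (one queen per row, so only the column needs choosing; `placements[r]` holding the column of row r is an illustrative representation):

```python
def n_queens(n):
    cols, pos_diag, neg_diag = set(), set(), set()
    placements = []   # placements[r] = column of the queen in row r
    solutions = []

    def backtrack(r):
        if r == n:
            solutions.append(placements[:])
            return
        for c in range(n):
            if c in cols or (r + c) in pos_diag or (r - c) in neg_diag:
                continue   # O(1) viability check against all earlier queens
            cols.add(c); pos_diag.add(r + c); neg_diag.add(r - c)
            placements.append(c)
            backtrack(r + 1)
            placements.pop()
            cols.remove(c); pos_diag.remove(r + c); neg_diag.remove(r - c)

    backtrack(0)
    return solutions
```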
When to walk away
If your branching factor is high and you can’t prune, this is the wrong algorithm. Look for:
- Memoizable substructure → DP.
- Greedy choice property → greedy.
- Polynomial reduction → flow, matching, LP.
Backtracking is the fallback when none of those fit.