Backtracking Explained: How to Solve Problems by Undoing Bad Choices
Imagine you're navigating a maze. You walk down a corridor and hit a dead end — so you turn around, go back to the last fork, and try a different path. That 'turn around and try again' move is exactly what backtracking is. It's a systematic way of exploring all possible options by trying a choice, checking if it still makes sense, and undoing it if it doesn't — rather than blindly trying every combination from scratch.
Most real-world problems don't come with an obvious answer. Finding all valid ways to place chess queens on a board, generating every permutation of a sequence, or solving a Sudoku puzzle: these problems all share one thing. There are many possible paths, and most of them are dead ends. Backtracking is the algorithmic strategy that lets you explore those paths efficiently without manually tracking which ones you've already failed at.
In this guide, we'll break down exactly what backtracking is, why it works the way it does, and how to use it correctly in real projects by implementing classic search algorithms.
By the end you'll have both the conceptual understanding and the practical code examples to use backtracking with confidence.
What Backtracking Actually Does (and Why Brute Force Isn't Enough)
Brute force tries every possible combination regardless of whether it's already heading toward failure. Backtracking is smarter — it builds a solution incrementally, and the moment a partial solution violates a constraint, it stops pursuing that branch entirely. This is called 'pruning.'
Think of a decision tree. Every node is a choice. Brute force visits every single node. Backtracking visits a node, checks 'can this possibly lead to a valid answer?', and if the answer is no, it skips the entire subtree rooted there. That's the efficiency win.
The core loop is always the same: choose a candidate, place it, recurse deeper, then unchoose (undo the placement). That 'unchoose' step is what makes backtracking different from plain recursion — you're restoring state so the next candidate gets a clean slate to work with.
This pattern works beautifully for constraint satisfaction problems: puzzles, permutations, combinations, and graph coloring — anywhere the solution space is large but constraints filter most of it out early.
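Stripped to its skeleton, that choose/explore/unchoose loop is only a few lines. Here's a minimal, self-contained sketch (the class and method names are illustrative, not from any library) that enumerates every binary string of a given length:

```java
import java.util.ArrayList;
import java.util.List;

public class ChooseExploreUnchoose {

    public static List<String> allBinaryStrings(int length) {
        List<String> results = new ArrayList<>();
        build(new StringBuilder(), length, results);
        return results;
    }

    private static void build(StringBuilder current, int targetLength, List<String> results) {
        if (current.length() == targetLength) {
            results.add(current.toString()); // snapshot: an immutable copy, not the live builder
            return;
        }
        for (char bit : new char[] {'0', '1'}) {
            current.append(bit);                        // CHOOSE
            build(current, targetLength, results);      // EXPLORE
            current.deleteCharAt(current.length() - 1); // UNCHOOSE: restore state for the next candidate
        }
    }

    public static void main(String[] args) {
        allBinaryStrings(3).forEach(System.out::println); // 000, 001, ..., 111
    }
}
```

Every backtracking solution in this article is a variation on this shape; only the candidates and the feasibility check change.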
```java
package io.thecodeforge.algorithm;

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

/**
 * io.thecodeforge implementation for generating power sets.
 */
public class SubsetGenerator {

    public static void main(String[] args) {
        int[] numbers = {1, 2, 3};
        List<List<Integer>> allSubsets = new ArrayList<>();
        generateSubsets(numbers, 0, new ArrayList<>(), allSubsets);
        System.out.println("All subsets of " + Arrays.toString(numbers) + ":");
        allSubsets.forEach(System.out::println);
        System.out.println("Total subsets: " + allSubsets.size());
    }

    private static void generateSubsets(int[] numbers, int startIndex,
                                        List<Integer> currentSubset,
                                        List<List<Integer>> results) {
        // Record a snapshot (an O(N) copy)
        results.add(new ArrayList<>(currentSubset));
        for (int i = startIndex; i < numbers.length; i++) {
            // CHOOSE
            currentSubset.add(numbers[i]);
            // EXPLORE
            generateSubsets(numbers, i + 1, currentSubset, results);
            // UNCHOOSE (backtrack)
            currentSubset.remove(currentSubset.size() - 1);
        }
    }
}
```

Output:
[]
[1]
[1, 2]
[1, 2, 3]
[1, 3]
[2]
[2, 3]
[3]
Total subsets: 8
Pruning: The Feature That Makes Backtracking Fast
Generating all subsets is pure exploration — there are no invalid paths to prune. But backtracking really earns its keep when constraints let you eliminate huge branches early.
The N-Queens problem is the classic demonstration. Place N queens on an N×N chessboard so that no two queens attack each other (same row, column, or diagonal). A brute force approach would try all N^N placements. With pruning, the moment placing a queen creates a conflict, you stop and backtrack — you never explore any of the millions of board states that would follow from that illegal placement.
For an 8×8 board, brute force checks 16 million+ configurations. Backtracking with pruning checks roughly 15,000. That's the power of cutting entire subtrees.
The constraint check — the 'is this placement still valid?' question — is called the bounding function or feasibility check. Writing a tight, fast feasibility check is the single biggest lever you have for making backtracking solutions performant.
```java
package io.thecodeforge.algorithm;

import java.util.ArrayList;
import java.util.List;

public class NQueensSolver {

    public static void main(String[] args) {
        int n = 4;
        List<int[]> solutions = new ArrayList<>();
        solveNQueens(n, 0, new int[n], solutions);
        System.out.println("Found " + solutions.size() + " solutions for " + n + "-Queens.");
        solutions.forEach(sol -> printBoard(sol, n));
    }

    private static void solveNQueens(int n, int col, int[] queenInRow, List<int[]> solutions) {
        if (col == n) {
            solutions.add(queenInRow.clone()); // snapshot the completed board
            return;
        }
        for (int row = 0; row < n; row++) {
            if (isPlacementSafe(queenInRow, col, row)) {
                queenInRow[col] = row; // CHOOSE
                solveNQueens(n, col + 1, queenInRow, solutions); // EXPLORE
                // UNCHOOSE is implicit: the next iteration overwrites queenInRow[col]
            }
        }
    }

    private static boolean isPlacementSafe(int[] queenInRow, int col, int row) {
        for (int prevCol = 0; prevCol < col; prevCol++) {
            int prevRow = queenInRow[prevCol];
            if (prevRow == row || Math.abs(prevRow - row) == Math.abs(prevCol - col)) {
                return false; // PRUNE the entire branch
            }
        }
        return true;
    }

    private static void printBoard(int[] queenInRow, int n) {
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n; j++) {
                System.out.print(queenInRow[j] == i ? " Q " : " . ");
            }
            System.out.println();
        }
        System.out.println();
    }
}
```

Output:
Found 2 solutions for 4-Queens.
. . Q .
Q . . .
. . . Q
. Q . .

. Q . .
. . . Q
Q . . .
. . Q .
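To put rough numbers on the pruning claim, here's a sketch that instruments the same column-by-column search with a counter. The class name is illustrative, and the exact count depends on what you define as a "node", so don't expect it to match any one textbook figure; the point is the gap versus 8^8:

```java
public class PruningCounter {

    public static long nodesExplored;

    public static int countSolutions(int n) {
        nodesExplored = 0;
        return solve(n, 0, new int[n]);
    }

    private static int solve(int n, int col, int[] queenInRow) {
        if (col == n) return 1; // all columns filled: one complete solution
        int found = 0;
        for (int row = 0; row < n; row++) {
            if (isSafe(queenInRow, col, row)) {
                nodesExplored++;       // count only placements we actually commit to
                queenInRow[col] = row; // CHOOSE
                found += solve(n, col + 1, queenInRow); // EXPLORE
                // UNCHOOSE is implicit: the next iteration overwrites queenInRow[col]
            }
        }
        return found;
    }

    private static boolean isSafe(int[] queenInRow, int col, int row) {
        for (int prev = 0; prev < col; prev++) {
            if (queenInRow[prev] == row
                    || Math.abs(queenInRow[prev] - row) == Math.abs(prev - col)) {
                return false; // PRUNE: conflict with an earlier queen
            }
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println("Solutions for n=8: " + countSolutions(8)); // 92
        System.out.println("Placements committed: " + nodesExplored);
        System.out.println("Brute-force placements: " + (long) Math.pow(8, 8)); // 16,777,216
    }
}
```

For n = 8 this finds all 92 solutions while committing only a few thousand placements, several orders of magnitude fewer than blind enumeration.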
Recognizing When Backtracking Is the Right Tool
Backtracking isn't always the right choice. It's best when the problem has all three of these properties:
- You need all solutions (or need to verify one exists), not just a single optimal one.
- The solution is built incrementally — you make a series of choices.
- Constraints can eliminate branches early — otherwise, you're just doing brute force.
Classic backtracking problems: N-Queens, Sudoku, generating permutations/combinations/subsets, word search on a grid, graph coloring, and the rat-in-a-maze problem.
Not a great fit for backtracking: finding the shortest path (use BFS or Dijkstra), optimization problems with overlapping subproblems (use dynamic programming), or counting problems where you only need how many solutions exist rather than the solutions themselves (often DP as well).
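Of the classic problems listed above, word search on a grid is a nice one to see end-to-end, because the undo step is a literal un-marking of a board cell. Here's a hedged sketch; the grid, the words, and the class name are all made up for illustration:

```java
public class WordSearch {

    public static boolean exists(char[][] board, String word) {
        for (int r = 0; r < board.length; r++) {
            for (int c = 0; c < board[0].length; c++) {
                if (search(board, word, 0, r, c)) return true;
            }
        }
        return false;
    }

    private static boolean search(char[][] board, String word, int index, int r, int c) {
        if (index == word.length()) return true; // base case: whole word matched
        if (r < 0 || r >= board.length || c < 0 || c >= board[0].length
                || board[r][c] != word.charAt(index)) {
            return false; // PRUNE: off-grid or wrong letter
        }
        char saved = board[r][c];
        board[r][c] = '#'; // CHOOSE: mark the cell so this path can't reuse it
        boolean found = search(board, word, index + 1, r + 1, c)
                || search(board, word, index + 1, r - 1, c)
                || search(board, word, index + 1, r, c + 1)
                || search(board, word, index + 1, r, c - 1); // EXPLORE the four neighbors
        board[r][c] = saved; // UNCHOOSE: restore the cell for other paths
        return found;
    }

    public static void main(String[] args) {
        char[][] board = {
            {'C', 'A', 'T'},
            {'X', 'R', 'S'},
            {'D', 'O', 'G'}
        };
        System.out.println(exists(board, "CAT"));  // true
        System.out.println(exists(board, "DOGS")); // true
        System.out.println(exists(board, "COG"));  // false: O is not adjacent to C
    }
}
```

Because the unchoose step restores every cell it marked, the board is unchanged after each query, so the same grid can be searched repeatedly.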
```java
package io.thecodeforge.algorithm;

import java.util.ArrayList;
import java.util.List;

public class PermutationGenerator {

    public static void main(String[] args) {
        String input = "ABC";
        List<String> results = new ArrayList<>();
        boolean[] used = new boolean[input.length()];
        backtrack(input.toCharArray(), used, new StringBuilder(), results);
        results.forEach(System.out::println);
    }

    private static void backtrack(char[] chars, boolean[] used,
                                  StringBuilder sb, List<String> results) {
        if (sb.length() == chars.length) {
            results.add(sb.toString()); // snapshot, not the live StringBuilder
            return;
        }
        for (int i = 0; i < chars.length; i++) {
            if (used[i]) continue; // PRUNE: character already placed
            used[i] = true;        // CHOOSE
            sb.append(chars[i]);
            backtrack(chars, used, sb, results); // EXPLORE
            sb.deleteCharAt(sb.length() - 1);    // UNCHOOSE
            used[i] = false;
        }
    }
}
```

Output:
ABC
ACB
BAC
BCA
CAB
CBA
Notice the use of sb.toString() to snapshot the result, not just adding the StringBuilder directly. If you added the StringBuilder object itself, every result in your list would point to the same object, and every subsequent modification would corrupt all previously stored results. Always snapshot mutable state before storing it.

| Aspect | Backtracking | Brute Force |
|---|---|---|
| Explores invalid branches? | No — prunes them early | Yes — tries everything |
| Time complexity | Depends on pruning quality | Always worst-case O(N^N) or similar |
| Memory usage | O(depth of recursion) | O(total candidates) |
| Best for | Constraint satisfaction, all-solutions problems | Tiny input, no constraints |
| Needs feasibility check? | Yes — core to its efficiency | No |
| Typical problems | N-Queens, Sudoku, permutations | Password cracking (no constraints) |
| Difficulty to implement | Moderate — state management is tricky | Simple — nested loops |
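That snapshot rule is worth seeing fail. Storing the live StringBuilder aliases every entry in the results list, while toString() produces an immutable copy. A minimal demonstration (class and method names are illustrative):

```java
import java.util.ArrayList;
import java.util.List;

public class SnapshotDemo {

    // WRONG: every list entry is the same live object
    public static List<StringBuilder> wrongWay() {
        List<StringBuilder> results = new ArrayList<>();
        StringBuilder sb = new StringBuilder("A");
        results.add(sb);  // stores a reference, not a copy
        sb.setLength(0);
        sb.append('B');   // "continuing the search" after storing...
        results.add(sb);
        return results;   // ...has silently rewritten the first result too
    }

    // RIGHT: toString() creates an immutable snapshot
    public static List<String> rightWay() {
        List<String> results = new ArrayList<>();
        StringBuilder sb = new StringBuilder("A");
        results.add(sb.toString());
        sb.setLength(0);
        sb.append('B');
        results.add(sb.toString());
        return results;
    }

    public static void main(String[] args) {
        System.out.println(wrongWay()); // [B, B]
        System.out.println(rightWay()); // [A, B]
    }
}
```

The same aliasing bug applies to ArrayList, int[], and any other mutable structure you thread through the recursion, which is why the subset and N-Queens examples above copy before storing.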
🎯 Key Takeaways
- Backtracking = incremental building + feasibility check + undo. Remove any one of those three and you no longer have backtracking.
- The efficiency of backtracking lives entirely in the quality of your pruning. A tight feasibility check that eliminates branches early is worth more than any micro-optimization elsewhere.
- Always snapshot mutable state before storing it as a result — storing a reference to a live ArrayList or StringBuilder will corrupt all your stored solutions silently.
- Backtracking shines when you need all valid solutions and constraints exist to prune the search space. For a single optimal solution, think greedy or DP first.
Interview Questions on This Topic
- Q: What is the difference between backtracking and dynamic programming, and when would you choose one over the other?
- Q: Walk me through how you'd solve Sudoku using backtracking. What does your feasibility check look like, and how do you ensure efficient pruning?
- Q: If your backtracking solution is too slow for large inputs, what are three concrete techniques you can apply to speed it up without changing the core algorithm?
Frequently Asked Questions
Is backtracking the same as recursion?
No — recursion is a programming technique (a function calling itself), while backtracking is an algorithmic strategy that uses recursion as its vehicle. The defining feature of backtracking is the explicit undo step: after recursing, you restore state so the next candidate starts clean. Plain recursion doesn't require that.
What is the time complexity of backtracking?
It depends heavily on the problem and the quality of your pruning. In the worst case (no pruning at all) it's equivalent to brute force — O(N!) for permutations, O(2^N) for subsets. In practice, good pruning can reduce the explored space by orders of magnitude, which is why backtracking is practical for problems where brute force is not.
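Those worst-case figures are easy to sanity-check by counting leaves instead of collecting results. The sketch below (names illustrative) runs the bare search loops with no constraint other than "already used" and confirms N! permutations and 2^N subsets:

```java
public class ComplexityCheck {

    // Counts permutation leaves: N choices, then N-1, then N-2, ... = N!
    public static int countPermutations(int n, boolean[] used, int placed) {
        if (placed == n) return 1; // one complete permutation
        int total = 0;
        for (int i = 0; i < n; i++) {
            if (used[i]) continue;
            used[i] = true;                                  // CHOOSE
            total += countPermutations(n, used, placed + 1); // EXPLORE
            used[i] = false;                                 // UNCHOOSE
        }
        return total;
    }

    // Counts subset leaves: each element is either in or out = 2^N
    public static int countSubsets(int n, int index) {
        if (index == n) return 1;
        return countSubsets(n, index + 1)  // exclude element
             + countSubsets(n, index + 1); // include element
    }

    public static void main(String[] args) {
        System.out.println(countPermutations(4, new boolean[4], 0)); // 24 = 4!
        System.out.println(countSubsets(4, 0));                      // 16 = 2^4
    }
}
```

With real constraints added, pruning cuts these counts down; without them, these are the bills you pay.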
How do I know when to stop recursing and record a result?
Your base case defines a 'complete' solution — typically when you've placed all items, filled all positions, or exhausted the input. For problems like subset generation, every state (not just leaf nodes) is a valid result, so you record at every level. For problems like N-Queens, you only record when all N queens are placed without conflict. Understanding exactly what 'done' means for your problem is the first thing to nail down before writing a single line of backtracking code.
Developer and founder of TheCodeForge. I built this site because I was tired of tutorials that explain what to type without explaining why it works. Every article here is written to make concepts actually click.