Divide and Conquer: Strategy, Patterns and Recurrences
Divide and conquer is the algorithmic pattern behind merge sort, binary search, fast Fourier transform, Strassen's matrix multiplication, and closest pair of points. If you understand D&C deeply, you understand why all of these algorithms work and how to design new ones.
The pattern is deceptively simple: split the problem, solve each half, combine. The complexity depends almost entirely on the combine step, and the Master Theorem gives you an algebraic shortcut to avoid solving the recurrence from scratch each time. Learning the Master Theorem once saves you dozens of recurrence derivations throughout your career.
The Three-Step Pattern
Every divide and conquer algorithm follows the same template:
- Divide: Split the problem into subproblems (usually of equal or near-equal size)
- Conquer: Solve each subproblem recursively (base case when small enough)
- Combine: Merge subproblem solutions into the overall solution
The efficiency depends critically on the combine step. Merge sort's O(n log n) comes from an O(n) combine. Binary search has an O(1) combine step (no merging needed).
```python
def divide_and_conquer(problem):
    # Base case
    if is_small_enough(problem):
        return solve_directly(problem)
    # Divide
    subproblems = divide(problem)
    # Conquer
    solutions = [divide_and_conquer(sub) for sub in subproblems]
    # Combine
    return combine(solutions)
```
The Master Theorem
For recurrences of the form T(n) = aT(n/b) + f(n) where a ≥ 1, b > 1:
- Case 1: f(n) = O(n^(log_b(a) − ε)) for some ε > 0 → T(n) = Θ(n^log_b(a))
- Case 2: f(n) = Θ(n^log_b(a)) → T(n) = Θ(n^log_b(a) · log n)
- Case 3: f(n) = Ω(n^(log_b(a) + ε)) for some ε > 0, provided the regularity condition a·f(n/b) ≤ k·f(n) holds for some k < 1 → T(n) = Θ(f(n))
Examples:
- Merge sort: T(n) = 2T(n/2) + O(n) → Case 2 → O(n log n)
- Binary search: T(n) = T(n/2) + O(1) → Case 2 → O(log n)
- Strassen: T(n) = 7T(n/2) + O(n²) → Case 1 → O(n^2.807)
- Simple recursion: T(n) = 2T(n/2) + O(1) → Case 1 → O(n)
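The case analysis above is mechanical enough to put in code. A minimal sketch (the `master_theorem` helper name is mine, and it ignores Case 3's regularity condition, which holds for ordinary polynomial f(n)):

```python
import math

def master_theorem(a, b, c):
    """Classify T(n) = a*T(n/b) + O(n^c) by comparing log_b(a) with c."""
    crit = math.log(a, b)  # the critical exponent log_b(a)
    if math.isclose(crit, c):
        # Case 2: the work is evenly spread across levels
        return "O(log n)" if c == 0 else f"O(n^{c} log n)"
    if c < crit:
        # Case 1: the leaves dominate
        return f"O(n^{crit:.3f})"
    # Case 3: the root's combine work dominates
    return f"O(n^{c})"

print(master_theorem(2, 2, 1))  # merge sort: O(n^1 log n)
print(master_theorem(1, 2, 0))  # binary search: O(log n)
print(master_theorem(7, 2, 2))  # Strassen: O(n^2.807)
```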
Canonical Examples
```python
# Binary search: O(log n)
def binary_search(arr, target, lo, hi):
    if lo > hi:
        return -1
    mid = (lo + hi) // 2
    if arr[mid] == target:
        return mid
    if arr[mid] < target:
        return binary_search(arr, target, mid + 1, hi)
    return binary_search(arr, target, lo, mid - 1)

# Merge sort: O(n log n)
def merge_sort(arr):
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    return merge(left, right)

def merge(left, right):
    result, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    # One of these slices is empty; the other holds the leftovers
    return result + left[i:] + right[j:]

# Fast exponentiation: O(log n)
def fast_power(base, exp):
    if exp == 0:
        return 1
    if exp % 2 == 0:
        half = fast_power(base, exp // 2)
        return half * half
    return base * fast_power(base, exp - 1)

print(fast_power(2, 10))  # 1024
```
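Cryptographic code uses the same square-and-multiply idea but reduces modulo m at every step to keep the numbers small. A minimal iterative sketch (the name `fast_power_mod` is mine; Python's built-in three-argument `pow(base, exp, mod)` does the same job):

```python
def fast_power_mod(base, exp, mod):
    # Iterative square-and-multiply: O(log exp) multiplications,
    # reducing mod `mod` at each step so intermediates stay small.
    result = 1
    base %= mod
    while exp > 0:
        if exp & 1:  # this binary digit of exp is set: fold in one factor
            result = result * base % mod
        base = base * base % mod  # square for the next binary digit
        exp >>= 1
    return result

print(fast_power_mod(2, 10, 1000))  # 24, i.e. 1024 mod 1000
```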
When Divide and Conquer Pays Off
D&C pays off when:
1. Subproblems are independent (no shared state to communicate)
2. Subproblems are of equal or balanced size (imbalanced splits degrade to O(n²))
3. The combine step is efficient (O(n) or less)
D&C does NOT pay off when:
- Subproblems overlap heavily (use dynamic programming instead)
- The combine step is expensive (e.g., an O(n²) combine gives O(n² log n) total, worse than naive)
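The overlap failure mode is easiest to see with Fibonacci. A minimal sketch contrasting the two approaches (function names are mine):

```python
from functools import lru_cache

def fib_dc(n):
    # Naive divide and conquer: the two subproblems overlap heavily,
    # so the same values are recomputed exponentially many times.
    if n < 2:
        return n
    return fib_dc(n - 1) + fib_dc(n - 2)

@lru_cache(maxsize=None)
def fib_dp(n):
    # Same recurrence, but memoisation stores each result once,
    # turning the overlapping subproblems into O(n) total work.
    if n < 2:
        return n
    return fib_dp(n - 1) + fib_dp(n - 2)

print(fib_dc(20), fib_dp(20))  # 6765 6765, but fib_dc took ~10,000 calls
```

Try `fib_dc(40)` versus `fib_dp(40)`: the memoised version answers instantly while the naive one takes noticeable seconds, despite computing the identical recurrence.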
🎯 Key Takeaways
- Three steps: Divide into equal subproblems, Conquer recursively (base case when trivially small), Combine. The combine step determines the overall complexity.
- Master Theorem for T(n) = aT(n/b) + O(n^c): compare log_b(a) with c. log_b(a) = c → O(n^c log n). log_b(a) < c → O(n^c). log_b(a) > c → O(n^log_b(a)).
- When D&C pays off: independent subproblems (no overlap), balanced splits (near-equal halves), efficient combine (O(n) or less). Imbalanced splits degrade to O(n²).
- When to use DP instead: when subproblems overlap (Fibonacci without memoisation is D&C with exponential redundancy; DP fixes this).
- The fast_power function is the clearest minimal D&C example: O(log n) vs the O(n) naive loop. Every cryptographic system relies on fast modular exponentiation built this way.
Interview Questions on This Topic
- Q: What are the three steps of divide and conquer?
- Q: Apply the Master Theorem to T(n) = 3T(n/3) + O(n).
- Q: What makes quick sort's worst case O(n²) while merge sort is always O(n log n)?
- Q: When should you use divide and conquer vs dynamic programming?
Frequently Asked Questions
What is the difference between divide and conquer and dynamic programming?
D&C solves independent subproblems: each subproblem is solved fresh. DP solves overlapping subproblems: it stores results to avoid redundant computation. Fibonacci via naive recursion is D&C (exponential). Fibonacci via memoisation/tabulation is DP (linear).
Developer and founder of TheCodeForge. I built this site because I was tired of tutorials that explain what to type without explaining why it works. Every article here is written to make concepts actually click.