Bisection Method: Guaranteed Numerical Root Finding
- Bisection: the Intermediate Value Theorem guarantees a root when f(a)·f(b) < 0.
- Linear convergence: one correct binary digit per iteration, so ~34 iterations for 10^-10 precision.
- No derivative required: works for any continuous function.
Bisection is the simplest application of the Intermediate Value Theorem: if f is continuous and f(a) < 0 and f(b) > 0, then somewhere in [a,b] the function crosses zero. The algorithm just applies this fact repeatedly, halving the interval each time.
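A quick sanity check of that bracket condition, using f(x) = x² − 2 on [1, 2], along with a single halving step to show how the sign change selects the half that keeps the root:

```python
f = lambda x: x**2 - 2

a, b = 1.0, 2.0
print(f(a), f(b))     # -1.0 2.0: opposite signs, so a root lies in (1, 2)

# One halving step: keep whichever half still brackets the sign change
mid = (a + b) / 2     # 1.5, and f(1.5) = 0.25 > 0
if f(a) * f(mid) < 0:
    b = mid           # root is in [1.0, 1.5]
print(a, b)           # 1.0 1.5
```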
The guaranteed convergence of bisection makes it invaluable as a safety net. Bisection cannot diverge, cannot oscillate, cannot fail as long as the initial bracket is valid. This is why Brent's method, the workhorse behind scipy.optimize.brentq, keeps bisection as a fallback around faster derivative-free steps (secant and inverse quadratic interpolation): superlinear speed when the interpolation behaves, the reliability of bisection everywhere else.
Algorithm and Implementation
Invariant: f(a) and f(b) have opposite signs, so a root lies in [a,b]. Each step halves the interval.
```python
import math

def bisection(f, a: float, b: float, tol: float = 1e-10, max_iter: int = 100) -> float:
    """Find root of f in [a,b] where f(a)*f(b) < 0.

    Guaranteed to converge for continuous functions.
    """
    if f(a) * f(b) > 0:
        raise ValueError('f(a) and f(b) must have opposite signs')
    for i in range(max_iter):
        mid = (a + b) / 2
        if abs(b - a) / 2 < tol or f(mid) == 0:
            print(f'Converged in {i+1} iterations, root ≈ {mid}')
            return mid
        if f(a) * f(mid) < 0:
            b = mid  # root in [a, mid]
        else:
            a = mid  # root in [mid, b]
    return (a + b) / 2

# Find sqrt(2): f(x) = x^2 - 2
root = bisection(lambda x: x**2 - 2, 1, 2)
print(f'sqrt(2) = {root}')

# Find x where cos(x) = x (Dottie number)
root2 = bisection(lambda x: math.cos(x) - x, 0, 1)
print(f'cos(x)=x: x = {root2:.10f}')
```
Converged in 34 iterations, root ≈ 1.4142135623842478
sqrt(2) = 1.4142135623842478
Converged in 34 iterations, root ≈ 0.7390851331710815
cos(x)=x: x = 0.7390851331
Error Analysis and Convergence
After n iterations, the interval width is (b-a)/2^n. The error is bounded by this width.
To achieve precision ε: n ≥ log2((b-a)/ε) iterations.
For [1,2] with ε = 10^-10: n ≥ log2(1/10^-10) = 10·log2(10) ≈ 33.2 → 34 iterations.
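That bound is easy to compute directly. A small helper (the name iterations_needed is just for illustration):

```python
import math

def iterations_needed(a: float, b: float, eps: float) -> int:
    """Smallest n such that (b - a) / 2**n <= eps."""
    return math.ceil(math.log2((b - a) / eps))

print(iterations_needed(1.0, 2.0, 1e-10))  # 34, matching the hand calculation
print(iterations_needed(0.0, 1.0, 1e-6))   # 20
```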
```python
def bisection_with_trace(f, a, b, n_steps=10):
    fa = f(a)
    print('Step      a             b            mid        f(mid)')
    for i in range(n_steps):
        mid = (a + b) / 2
        fmid = f(mid)
        print(f'{i+1:4} {a:.10f} {b:.10f} {mid:.10f} {fmid:+.2e}')
        if fa * fmid < 0:
            b = mid            # root in [a, mid]; f(a) keeps its sign
        else:
            a, fa = mid, fmid  # root in [mid, b]

bisection_with_trace(lambda x: x**2 - 2, 1.0, 2.0, 8)
```
1 1.0000000000 2.0000000000 1.5000000000 +2.50e-01
2 1.0000000000 1.5000000000 1.2500000000 -4.38e-01
3 1.2500000000 1.5000000000 1.3750000000 -1.09e-01
4 1.3750000000 1.5000000000 1.4375000000 +6.64e-02
5 1.3750000000 1.4375000000 1.4062500000 -2.25e-02
6 1.4062500000 1.4375000000 1.4218750000 +2.17e-02
7 1.4062500000 1.4218750000 1.4140625000 -4.27e-04
8 1.4140625000 1.4218750000 1.4179687500 +1.06e-02
Bisection vs Newton-Raphson
| Property | Bisection | Newton-Raphson |
|---|---|---|
| Convergence | Linear (1 bit/step) | Quadratic (doubles digits) |
| Iterations for 10^-10 | ~34 | ~5 |
| Requires derivative | No | Yes |
| Guaranteed convergence | Yes (given bracket) | No |
| Initial requirement | Bracket [a,b] | Single guess x0 |
Best practice: use bisection to get close (say, 10^-3 precision), then switch to Newton-Raphson for fast final convergence. This bisect-then-polish idea is the spirit of safeguarded hybrids like Brent's method, which substitutes derivative-free interpolation steps for Newton's.
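A minimal sketch of that two-phase strategy. The helper hybrid_root is hypothetical, not Brent's actual algorithm (Brent avoids derivatives entirely); it just illustrates the handoff from a safe bracket to fast Newton polishing:

```python
def hybrid_root(f, df, a, b, coarse_tol=1e-3, tol=1e-12, max_newton=20):
    """Bisect until the bracket is narrow, then polish with Newton steps."""
    fa = f(a)
    while (b - a) / 2 > coarse_tol:   # phase 1: cheap, guaranteed bisection
        mid = (a + b) / 2
        fmid = f(mid)
        if fa * fmid < 0:
            b = mid
        else:
            a, fa = mid, fmid
    x = (a + b) / 2
    for _ in range(max_newton):       # phase 2: quadratic Newton convergence
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

root = hybrid_root(lambda x: x*x - 2, lambda x: 2*x, 1.0, 2.0)
print(root)  # ≈ sqrt(2)
```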
🎯 Key Takeaways
- Bisection: the Intermediate Value Theorem guarantees a root when f(a)·f(b) < 0.
- Linear convergence: one correct binary digit per iteration, so ~34 iterations for 10^-10 precision.
- No derivative required: works for any continuous function.
- Error after n steps: (b-a)/2^n, predictable and controllable.
- Brent's method (scipy.optimize.brentq) combines bisection's reliability with fast derivative-free interpolation steps.
Interview Questions on This Topic
- Q: How many bisection steps are needed to find a root to within 10^-6 on the interval [0, 1]?
- Q: What is the Intermediate Value Theorem and why does it guarantee bisection converges?
- Q: When would you choose bisection over Newton-Raphson?
- Q: What is linear convergence vs quadratic convergence?
Frequently Asked Questions
Can bisection find complex roots?
No: bisection requires a real sign change, which has no analogue for complex numbers. For complex roots, Newton-Raphson generalises naturally to the complex plane, and for polynomials the Durand-Kerner method finds all roots simultaneously.
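A short sketch of Newton-Raphson finding a complex root, here of z² + 1 = 0 (the newton_complex helper is illustrative, not a library function):

```python
def newton_complex(f, df, z0, tol=1e-12, max_iter=50):
    """Plain Newton iteration; the arithmetic works unchanged for complex z."""
    z = z0
    for _ in range(max_iter):
        step = f(z) / df(z)
        z -= step
        if abs(step) < tol:
            break
    return z

# z^2 + 1 = 0 has roots +/- i; the starting guess decides which one we reach.
root = newton_complex(lambda z: z*z + 1, lambda z: 2*z, 0.5 + 0.5j)
print(root)  # close to 1j
```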