
Bisection Method β€” Guaranteed Numerical Root Finding

📍 Part of: Numerical Analysis → Topic 2 of 3
Learn the bisection method for finding roots of continuous functions.
πŸ§‘β€πŸ’» Beginner-friendly β€” no prior DSA experience needed
In this tutorial, you'll learn:
  • Bisection: the Intermediate Value Theorem guarantees a root when f(a)×f(b) < 0.
  • Linear convergence: one correct binary digit per iteration, ~34 iterations for 10^-10 precision.
  • No derivative required: works for any continuous function.
✦ Plain-English analogy ✦ Real code with output ✦ Interview questions
⚡ Quick Answer
If you know a function is negative at point a and positive at point b, it must cross zero somewhere between them; that is the Intermediate Value Theorem. Bisection exploits this: check the midpoint. Is it negative or positive? The root lies in one half, so discard the other. Repeat, halving the search interval each time. Slow but bulletproof: given a valid bracket, it always converges to a root.

Bisection is the simplest application of the Intermediate Value Theorem: if f is continuous and f(a) < 0 and f(b) > 0, then somewhere in [a,b] the function crosses zero. The algorithm just applies this fact repeatedly, halving the interval each time.

The guaranteed convergence of bisection makes it invaluable as a safety net. Bisection cannot diverge, cannot oscillate, and cannot fail as long as the initial bracket is valid. This is why Brent's method, the gold standard for numerical root finding used in scipy.optimize.brentq, keeps bisection as a fallback while taking faster secant and inverse-quadratic-interpolation steps: superlinear speed when the function cooperates, the reliability of bisection everywhere else.

Algorithm and Implementation

Invariant: f(a) and f(b) have opposite signs, so a root lies in [a,b]. Each step halves the interval.

bisection.py Β· PYTHON
def bisection(f, a: float, b: float, tol: float = 1e-10, max_iter: int = 100) -> float:
    """
    Find a root of f in [a, b] where f(a) * f(b) < 0.
    Guaranteed to converge for continuous functions.
    """
    fa = f(a)  # cache f(a) so we evaluate f once per iteration
    if fa * f(b) > 0:
        raise ValueError('f(a) and f(b) must have opposite signs')
    for i in range(max_iter):
        mid = (a + b) / 2
        fmid = f(mid)
        if abs(b - a) / 2 < tol or fmid == 0:
            print(f'Converged in {i+1} iterations, root ≈ {mid}')
            return mid
        if fa * fmid < 0:
            b = mid            # root in [a, mid]
        else:
            a, fa = mid, fmid  # root in [mid, b]
    return (a + b) / 2

import math
# Find sqrt(2): f(x) = x^2 - 2
root = bisection(lambda x: x**2 - 2, 1, 2)
print(f'sqrt(2) = {root}')

# Find x where cos(x) = x (Dottie number)
root2 = bisection(lambda x: math.cos(x) - x, 0, 1)
print(f'cos(x)=x: x = {root2:.10f}')
▶ Output
Converged in 34 iterations, root ≈ 1.4142135623842478
sqrt(2) = 1.4142135623842478
Converged in 34 iterations, root ≈ 0.7390851331710815
cos(x)=x: x = 0.7390851332

Error Analysis and Convergence

After n iterations, the interval width is (b-a)/2^n. The error is bounded by this width.

To achieve precision ε: n ≥ log2((b-a)/ε) iterations.

For [1,2] with ε=10^-10: n ≥ log2(1/10^-10) = 10×log2(10) ≈ 33.2 → 34 iterations.
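The arithmetic above can be checked directly in code. A small sketch (the helper name `iterations_needed` is my own):

```python
import math

def iterations_needed(a: float, b: float, eps: float) -> int:
    # Smallest n with (b - a) / 2**n <= eps, i.e. n >= log2((b - a) / eps).
    return math.ceil(math.log2((b - a) / eps))

print(iterations_needed(1, 2, 1e-10))  # 34, matching the worked example
print(iterations_needed(0, 1, 1e-6))   # 20
```

The count depends only on the initial width and the target precision, never on the function itself; that predictability is one of bisection's selling points.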

bisection_error.py Β· PYTHON
import math

def bisection_with_trace(f, a, b, n_steps=10):
    fa = f(a)
    print('Step  a              b              mid            f(mid)')
    for i in range(n_steps):
        mid = (a + b) / 2
        fmid = f(mid)
        print(f'{i+1:4}  {a:.10f}  {b:.10f}  {mid:.10f}  {fmid:+.2e}')
        if fa * fmid < 0:
            b = mid
        else:
            a, fa = mid, fmid

bisection_with_trace(lambda x: x**2 - 2, 1.0, 2.0, 8)
▶ Output
Step  a              b              mid            f(mid)
   1  1.0000000000  2.0000000000  1.5000000000  +2.50e-01
   2  1.0000000000  1.5000000000  1.2500000000  -4.38e-01
   3  1.2500000000  1.5000000000  1.3750000000  -1.09e-01
   4  1.3750000000  1.5000000000  1.4375000000  +6.64e-02
   5  1.3750000000  1.4375000000  1.4062500000  -2.25e-02
   6  1.4062500000  1.4375000000  1.4218750000  +2.17e-02
   7  1.4062500000  1.4218750000  1.4140625000  -4.27e-04
   8  1.4140625000  1.4218750000  1.4179687500  +1.06e-02
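The trace can also be checked against the theoretical bound: after n halvings the bracket has width (b-a)/2^n, so the midpoint is within half of that of the true root. A quick sketch for the same f(x) = x² - 2 on [1, 2] (the helper name `bisect_n_steps` is my own):

```python
import math

def bisect_n_steps(f, a, b, n):
    # Run exactly n bisection steps and return the final midpoint.
    fa = f(a)
    for _ in range(n):
        mid = (a + b) / 2
        if fa * f(mid) < 0:
            b = mid
        else:
            a, fa = mid, f(mid)
    return (a + b) / 2

for n in (8, 16, 32):
    mid = bisect_n_steps(lambda x: x * x - 2, 1.0, 2.0, n)
    bound = 1.0 / 2 ** (n + 1)      # half the final interval width
    actual = abs(mid - math.sqrt(2))
    print(f'n={n:2}  bound={bound:.2e}  actual={actual:.2e}')
    assert actual <= bound           # the bound always holds
```

The actual error is often smaller than the bound, but never larger; that is what "predictable and controllable" means in practice.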

Bisection vs Newton-Raphson

| Property | Bisection | Newton-Raphson |
|---|---|---|
| Convergence | Linear (1 bit/step) | Quadratic (doubles digits) |
| Iterations for 10^-10 | ~34 | ~5 |
| Requires derivative | No | Yes |
| Guaranteed convergence | Yes (given bracket) | No |
| Initial requirement | Bracket [a,b] | Single guess x0 |

Best practice: use bisection to get close (say 10^-3 precision), then switch to Newton-Raphson for fast final convergence. Brent's method works in the same spirit, pairing bisection's guarantee with fast derivative-free interpolation steps.
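A minimal sketch of that hand-off, assuming the derivative is available for the Newton phase (the helper name `hybrid_root` and its parameters are my own, not a library API):

```python
import math

def hybrid_root(f, df, a, b, coarse_tol=1e-3, tol=1e-12, max_newton=20):
    # Phase 1: bisection until the bracket is narrower than coarse_tol.
    fa = f(a)
    while (b - a) / 2 > coarse_tol:
        mid = (a + b) / 2
        if fa * f(mid) < 0:
            b = mid
        else:
            a, fa = mid, f(mid)
    # Phase 2: Newton-Raphson from the bracket midpoint for fast finishing.
    x = (a + b) / 2
    for _ in range(max_newton):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Example: sqrt(2) via f(x) = x^2 - 2, f'(x) = 2x
r = hybrid_root(lambda x: x * x - 2, lambda x: 2 * x, 1.0, 2.0)
print(r, math.sqrt(2))
```

Note this sketch trusts Newton not to escape the bracket once it is close; a production hybrid (like brentq) re-checks every step against the bracket and falls back to bisection when a step misbehaves.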

🔥 scipy.optimize.brentq implements Brent's method, a hybrid of bisection with secant and inverse quadratic interpolation. It is guaranteed to converge and fast. Prefer it over a hand-rolled bisection in production Python code.

🎯 Key Takeaways

  • Bisection: the Intermediate Value Theorem guarantees a root when f(a)×f(b) < 0.
  • Linear convergence: one correct binary digit per iteration, ~34 iterations for 10^-10 precision.
  • No derivative required: works for any continuous function.
  • Error after n steps: (b-a)/2^n, predictable and controllable.
  • Brent's method (scipy.optimize.brentq) combines bisection's reliability with the speed of interpolation steps.

Interview Questions on This Topic

  • Q: How many bisection steps are needed to find a root to within 10^-6 on the interval [0, 1]?
  • Q: What is the Intermediate Value Theorem and why does it guarantee bisection converges?
  • Q: When would you choose bisection over Newton-Raphson?
  • Q: What is linear convergence vs quadratic convergence?

Frequently Asked Questions

Can bisection find complex roots?

No: bisection requires a sign change, which doesn't apply to complex numbers. For complex roots, Newton-Raphson generalises naturally to the complex plane (and the Durand-Kerner method finds all roots of a polynomial simultaneously).
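To illustrate that point, Newton's iteration z ← z - f(z)/f'(z) works unchanged with Python complex numbers. The sketch below (the helper name `newton_complex` is my own) finds a root of z² + 1 = 0, which has no real sign change for bisection to exploit:

```python
def newton_complex(f, df, z, steps=50, tol=1e-12):
    # Plain Newton iteration; z may be a Python complex number.
    for _ in range(steps):
        step = f(z) / df(z)
        z -= step
        if abs(step) < tol:
            break
    return z

root = newton_complex(lambda z: z * z + 1, lambda z: 2 * z, 1 + 1j)
print(root)  # converges to 1j, since the start lies in the upper half-plane
```

Starting from the lower half-plane would converge to -1j instead; which root Newton finds depends on the initial guess.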

Naren · Founder & Author

Developer and founder of TheCodeForge. I built this site because I was tired of tutorials that explain what to type without explaining why it works. Every article here is written to make concepts actually click.
