[Math] How many steps of the bisection method are needed to obtain a certain error

computer-science, convergence-divergence, numerical-methods

If $a = 0.1$ and $b = 1.0$, how many steps of the bisection method are
needed to determine the root with an error of at most $1/2 \times 10^{-8}$?

How can I calculate this if I have no function with which to determine the value of $f(c)$?

Best Answer

Let $a_n$ and $b_n$ be the left and right endpoints of the interval after $n\geq 0$ steps of bisection, where initially, $a_0 = a = 0.1$ and $b_0 = b = 1$. Since the interval width is halved after each step, it follows that $b_n - a_n = (b-a)/2^n = 0.9/2^n$. Now suppose you take the midpoint of $[a_n,b_n]$ as your approximation to the true root $x$. Then,

$$ \left|x - \frac{a_n +b_n}{2}\right| \leq \frac{b_n - a_n}{2} \leq \frac{0.9}{2^{n+1}} $$
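As a quick sanity check, here is a minimal Python sketch that runs the bisection loop on a hypothetical test function, $f(x) = x^2 - 0.5$ (any continuous $f$ with a sign change on $[0.1, 1]$ would do, since the bound depends only on the interval, not on $f$), and confirms that the midpoint error stays within $0.9/2^{n+1}$:

```python
import math

def f(x):
    # Hypothetical test function with known root sqrt(0.5) ~ 0.707 in [0.1, 1];
    # any continuous function with a sign change on the interval would do.
    return x * x - 0.5

root = math.sqrt(0.5)
a, b = 0.1, 1.0

for n in range(1, 11):
    c = (a + b) / 2               # bisect the current interval
    if f(a) * f(c) <= 0:
        b = c                     # sign change is in the left half
    else:
        a = c                     # sign change is in the right half
    midpoint = (a + b) / 2        # midpoint approximation after n steps
    bound = 0.9 / 2 ** (n + 1)
    assert abs(root - midpoint) <= bound
    print(f"n={n:2d}  error={abs(root - midpoint):.2e}  bound={bound:.2e}")
```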

For the error to be at most $1/2 \times 10^{-8}$, it suffices that

$$ \frac{0.9}{2^{n+1}} \leq \frac{1}{2} \times 10^{-8}, $$

i.e. $2^{n+1} \geq 1.8 \times 10^8$. Taking logarithms, the smallest such integer is

$$ n = \left\lceil \log_2(1.8 \times 10^8) - 1 \right\rceil = 27, $$

so $27$ steps of bisection suffice. Note that this bound depends only on the initial interval width $b - a$, not on $f$, which is why no evaluation of $f(c)$ is needed to answer the question.
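The step count can also be computed directly from the bound; a minimal sketch, assuming only the interval endpoints and the tolerance:

```python
import math

a, b = 0.1, 1.0
tol = 0.5e-8  # target error, 1/2 * 10^-8

# Smallest integer n with (b - a) / 2**(n + 1) <= tol
n = math.ceil(math.log2((b - a) / tol) - 1)
print(n)  # 27

# Sanity check: the bound holds after n steps but not after n - 1
assert (b - a) / 2 ** (n + 1) <= tol < (b - a) / 2 ** n
```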