Numerical Methods – Newton’s method and neighborhood of convergence


For Newton's method, my book says that for convergence, the starting point, $x^{(0)}$, must be sufficiently close to $x^*$, the actual root.

The book gives the following inequality, where $C = \dfrac{\sup|f''(x)|}{\inf|f'(x)|}$:
$$|x^*-x^{(n+1)}| \leq \frac{1}{2}C\,|x^*-x^{(n)}|^2,$$

which implies that
$$\frac{1}{2}C\,|x^*-x^{(0)}| < 1$$
must hold for $x^{(0)}$ to be considered sufficiently close to $x^*$ (otherwise the distance between $x^*$ and $x^{(n+1)}$ might not decrease).

I'm having trouble understanding how to reach $\frac{1}{2}C|x^*-x^{(0)}| <1$. Why is the condition not $\frac{1}{2}C(x^*-x^{(0)})^2 <1?$
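To see the inequality in action, here is a small sketch of Newton's method (not from the book; the function $f(x)=x^2-2$ and the starting point are chosen for illustration). For this $f$ on an interval around the root, $\sup|f''| = 2$ and $\inf|f'| \approx 2$, so $C \approx 1$, and the error should roughly square at each step:

```python
import math

def newton(f, fprime, x0, steps):
    """Run Newton's method for a fixed number of steps, recording iterates."""
    xs = [x0]
    for _ in range(steps):
        x0 = x0 - f(x0) / fprime(x0)
        xs.append(x0)
    return xs

# Illustrative example: f(x) = x^2 - 2, with root x* = sqrt(2).
f = lambda x: x * x - 2.0
fp = lambda x: 2.0 * x

root = math.sqrt(2.0)
errors = [abs(root - x) for x in newton(f, fp, 1.0, 5)]
# Each error is roughly the square of the previous one (quadratic convergence),
# consistent with |x* - x^(n+1)| <= (1/2) C |x* - x^(n)|^2 with C ~ 1 here.
for e in errors:
    print(e)
```

Starting from $x^{(0)} = 1$, the errors drop from about $0.41$ to below $10^{-10}$ in four steps, which is the quadratic behavior the bound describes.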

Best Answer

You have a recursive inequality of the type
$$d_{k+1}\le q\,d_k^2.$$
You could successively insert it into itself to find a general law, but it is easier to first multiply by $q$ to get
$$(q\,d_{k+1})\le (q\,d_k)^2.$$
This is easier to iterate: $q\,d_1\le(q\,d_0)^2$, then $q\,d_2\le(q\,d_1)^2\le(q\,d_0)^4$, and in general
$$ (q\,d_k)\le (q\,d_0)^{2^k}. $$
The upper bound is a subsequence of the geometric sequence $\bigl((q\,d_0)^m\bigr)_{m\in\Bbb N}$, and thus converges to zero for $|q\,d_0|<1$. For $|q\,d_0|\ge 1$ the bound diverges and thus allows no statement on the convergence of the sequence $(d_k)_{k\in\Bbb N}$.
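The iteration of the squared inequality can be checked numerically. Here is a minimal sketch with hypothetical values $q = 2$ and $d_0 = 0.3$ (so $q\,d_0 = 0.6 < 1$), taking the worst case $d_{k+1} = q\,d_k^2$ and comparing against the closed form $(q\,d_0)^{2^k}$:

```python
# Worst case of the recursion d_{k+1} <= q * d_k^2: take equality,
# then q*d_k should match the closed form (q*d_0)**(2**k) from the answer.
q = 2.0        # hypothetical constant (plays the role of C/2)
d = 0.3        # hypothetical initial distance; q*d0 = 0.6 < 1
d0 = d
for k in range(6):
    # q*d_k agrees with (q*d_0)^(2^k) up to floating-point rounding
    assert abs(q * d - (q * d0) ** (2 ** k)) < 1e-12
    d = q * d * d
```

After six doublings the bound $(0.6)^{64}$ is already below $10^{-10}$, which illustrates why $q\,d_0 < 1$ guarantees convergence, and why nothing can be said once $q\,d_0 \ge 1$.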