[Math] How to prove that the Newton-Raphson method converges for a given starting point $x_0$

convergence-divergence, numerical-methods

How do I prove that the Newton-Raphson method converges for some arbitrary $x_0$?
Let us consider the following example:

$f(x)=x^2-1$

$x_0=2$

The Newton-Raphson iterative formula is:

$x_{i+1}=x_i-\frac{f(x_i)}{f'(x_i)}$

which, for this $f(x)$, yields:

$x_{i+1}=x_i-\frac{x^2_i-1}{2x_i}$
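As a quick numerical check (not part of the proof), iterating this formula from the question's starting point $x_0 = 2$ shows the approximations settling on the root $1$:

```python
def newton_step(x):
    # One Newton-Raphson step for f(x) = x^2 - 1, with f'(x) = 2x:
    # x_{i+1} = x_i - (x_i^2 - 1) / (2 x_i)
    return x - (x**2 - 1) / (2 * x)

x = 2.0  # starting point x_0 from the question
for i in range(10):
    x = newton_step(x)
    print(i + 1, x)
# The iterates decrease toward the root x = 1, and the convergence
# is quadratic: the number of correct digits roughly doubles per step.
```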

The fact that it converges is quite obvious: for the given initial value, the successive root approximations approach the exact root. But how do I show it in the form:

$\lim_{i \to \infty} x_{i+1}=\zeta$

where $\zeta$ is the exact root of $f(x)$.

Best Answer

If it converges, then the limit is easy to find:

$$ \lim_{i \to \infty} x_{i+1}=\lim_{i \to \infty}\left( x_i-\frac{x^2_i-1}{2x_i} \right)$$
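To spell out this step: suppose the sequence converges to some $L \neq 0$. Taking the limit on both sides of the recurrence (using continuity of the right-hand side away from $0$) gives

$$ L = L - \frac{L^2 - 1}{2L} \;\implies\; \frac{L^2 - 1}{2L} = 0 \;\implies\; L^2 = 1. $$

Since every iterate starting from $x_0 = 2$ is positive, the limit must be $L = 1 = \zeta$, the exact root.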

To show that it really does converge isn't too hard, but it does require making (and then proving) a qualitative observation about the behavior of the sequence $\{ x_i \}$.
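One way to make that qualitative observation precise for this particular $f$ (a sketch, not the only possible argument): rewrite the iteration as

$$ x_{i+1} = \frac{1}{2}\left( x_i + \frac{1}{x_i} \right). $$

By the AM-GM inequality, $x_{i+1} \geq 1$ whenever $x_i > 0$, so from $x_0 = 2$ every iterate satisfies $x_i \geq 1$. Moreover,

$$ x_{i+1} - x_i = \frac{1 - x_i^2}{2 x_i} \leq 0 \quad \text{for } x_i \geq 1, $$

so the sequence is decreasing and bounded below by $1$. By the monotone convergence theorem it converges, and the fixed-point computation above then identifies the limit as $1$.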