[Math] The Chord Method does not appear to converge under the same conditions as Newton's Method

convergence-divergence, numerical-methods

The Chord Method is:

$x^{(k+1)} = x^{(k)} - {g(x^{(k)}) \over g'(x^{(0)})}$

The question asks to compute the cube root of $2$ using the Chord Method: carry out the first few iterations, using $x^{(0)} = 0.6$.

$g(x) = x^3 -2$

$g'(x) = 3x^2$

$x^{(1)} = 0.6 - {(0.6)^3 - 2 \over 3(0.6)^2}$

$x^{(1)} = 0.6 - {(0.6)^3 - 2 \over 1.08}$

$x^{(1)} = 2.2518$

$x^{(2)} = 2.2518 - {(2.2518)^3 - 2 \over 1.08}$

$x^{(2)} = -6.4685$

$x^{(3)} = -6.4685 - {(-6.4685)^3 - 2 \over 1.08}$

$x^{(3)} = 245.9868$

And then things get really really bad.
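A short script reproduces these iterations (this is just my own sketch; the function name `chord` is mine):

```python
def chord(g, slope, x0, n):
    """Chord method: every step divides by the same fixed slope g'(x0)."""
    xs = [x0]
    for _ in range(n):
        xs.append(xs[-1] - g(xs[-1]) / slope)
    return xs

g = lambda x: x**3 - 2            # root is 2**(1/3) ~ 1.2599
x0 = 0.6
xs = chord(g, 3 * x0**2, x0, 3)   # slope fixed at g'(0.6) = 1.08
```

Printing `xs` shows the same runaway sequence as the hand computation above.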

The problem is that when using Newton's Method, it converges to a value of around $1.2599$; and

"The two methods converge under essentially the same conditions."

According to my notes. So shouldn't it also converge using the Chord Method as it does using Newton's? Am I missing something?

Best Answer

For solving $g(x)=0$, the iterates generated by the chord method are $$x_{k+1} = x_k - {g(x_k) \over g'(x_0)}$$ while those given by Newton's method are $$x_{k+1} = x_k - {g(x_k) \over g'(x_k)}$$ so the two methods will behave the same (or almost the same) if $g'(x_k) \approx g'(x_0)$, that is, when $g(x)$ is almost linear from $x_0$ to the solution, or when the solution is close to $x_0$.
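To see this concretely, here is a sketch comparing the two iterations (the starting point $x_0 = 1.2$ is my own choice, deliberately close to the root):

```python
def newton(g, dg, x0, n):
    """Newton's method: the slope is re-evaluated at every iterate."""
    x = x0
    for _ in range(n):
        x -= g(x) / dg(x)
    return x

def chord(g, slope, x0, n):
    """Chord method: the slope stays frozen at g'(x0)."""
    x = x0
    for _ in range(n):
        x -= g(x) / slope
    return x

g  = lambda x: x**3 - 2
dg = lambda x: 3 * x**2

# From x0 = 1.2 (close to the root 2**(1/3) ~ 1.2599) both methods
# converge; from x0 = 0.6 Newton's still works but the chord method
# blows up, exactly as in the question.
```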

If you want to use the chord method (which I would almost never suggest), what you can do is use it for $2$ or $3$ iterations (so we have $x_0, x_1, x_2$) and then repeat the process, taking the last $x_2$ as the new $x_0$.
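A sketch of that restart strategy (the function name and the inner/outer iteration counts are my own choices, not part of the original method):

```python
def restarted_chord(g, dg, x0, inner=3, outer=25):
    """Chord method whose slope is re-evaluated every `inner` steps."""
    x = x0
    for _ in range(outer):
        slope = dg(x)          # refresh the derivative at the restart point
        for _ in range(inner):
            x -= g(x) / slope
    return x
```

With $g(x) = x^3 - 2$ this recovers even from $x_0 = 0.6$: the early restarts pull the wild iterates back, and once $x$ is near the root each refreshed slope is close to $g'$ at the solution.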

In any case, when it works, the chord method has linear convergence while Newton's method has quadratic convergence (which means that the chord method will require many more iterations).

If you really do not want to evaluate the derivative, I suppose that you could approximate it by a finite difference using $$g'(x_k)\approx\frac{g(x_k)-g(x_{k-1})}{x_k-x_{k-1}}$$
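This finite-difference variant is the secant method. A sketch (the second starting point $x_1 = 0.7$ is my own choice; any nearby value would do):

```python
def secant(g, x0, x1, n):
    """Secant method: the slope is a finite difference
    through the last two iterates, so no derivative is needed."""
    for _ in range(n):
        slope = (g(x1) - g(x0)) / (x1 - x0)
        x0, x1 = x1, x1 - g(x1) / slope
    return x1

g = lambda x: x**3 - 2
```

Starting from $(0.6, 0.7)$ this converges to $2^{1/3}$; its order of convergence ($\approx 1.618$) sits between the chord method's linear rate and Newton's quadratic rate.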