[Math] Convergence of Newton's method with cube roots

convergence-divergence, newton-raphson, numerical-methods, roots

How can you prove that Newton's method for estimating the cube root converges when $x_0$ is not $0$? The sequence $\{x_{n}\}$ is defined by the recurrence $x_{n+1} = \frac{2}{3}x_n + \frac{1}{3}\frac{z}{x_n^2}$.

I tried to rewrite this in another form so that I could prove by induction that the sequence is monotonic and bounded, but I have no idea how to do that. I also tried writing out the first few terms, but could not find a pattern.

Edit: $z$ is the number whose cube root you are finding.
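The recurrence can be tried out numerically. A minimal sketch (the function name, the starting value, the tolerance, and the iteration cap are all illustrative choices, not part of the question):

```python
def cube_root_newton(z, x0, tol=1e-12, max_iter=100):
    """Iterate x_{n+1} = (2/3) x_n + (1/3) z / x_n^2 starting from x0 != 0."""
    x = x0
    for _ in range(max_iter):
        x_next = (2.0 / 3.0) * x + z / (3.0 * x * x)
        if abs(x_next - x) < tol:  # stop once successive iterates agree
            return x_next
        x = x_next
    return x

print(cube_root_newton(27.0, 1.0))  # approximately 3.0
```

Even a poor starting guess such as $x_0 = 1$ for $z = 27$ settles near the true root within a handful of iterations, which is the behavior the question asks to prove.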

Best Answer

Sketch: suppose $z \neq 0$ and let $r=z^{1/3}$. Define $g(x)=x-\frac{x^3-z}{3x^2}$, which is the Newton iteration map for $x^3 - z$ (and agrees with the recurrence above). We have two cases:

If $x>r$ then $x>g(x)>r$. This implies the recursive sequence $g^n(x)$ is bounded and monotone, so it is convergent. The limit of a recursive sequence defined by a continuous mapping must be a fixed point of the mapping, and the only fixed point of $g$ is $r$.
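This first case can be checked numerically. In the sketch below, the concrete values $z = 8$, $r = 2$, and the start $x = 5$ are illustrative assumptions; the assertion verifies the chain $x > g(x) > r$ at every step:

```python
def g(x, z):
    """One Newton step for f(x) = x^3 - z."""
    return x - (x**3 - z) / (3 * x**2)

z, r = 8.0, 2.0   # r = z^(1/3)
x = 5.0           # start strictly above the root
for _ in range(5):
    nxt = g(x, z)
    assert r < nxt < x  # the iterates decrease but never cross r
    x = nxt
print(x)  # close to r = 2.0
```

The monotone, bounded behavior asserted in the loop is exactly what guarantees convergence in this case.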

If $x<r$ then it is true that $x<g(x)$ (so that you move toward the root). But in general it is not true that $g(x)<r$, so the sequence is no longer guaranteed to be monotone. So the error may not go to zero monotonically, either. As a workaround, you can show that eventually $g^n(x)>r$, at which point the previous case applies.
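The overshoot described here is easy to observe. A small illustration (again with the assumed values $z = 8$, $r = 2$, and a start $x = 0.5$ below the root): the very first step lands above $r$, after which the previous case takes over.

```python
def g(x, z):
    """One Newton step for f(x) = x^3 - z."""
    return x - (x**3 - z) / (3 * x**2)

z, r = 8.0, 2.0
x = 0.5            # 0 < x < r
x1 = g(x, z)
print(x < r)       # True: we start below the root
print(x1 > r)      # True: the first step overshoots past r
```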

To make this work out, you will need to additionally assume that $g^n(x)$ is never exactly zero, which rules out a countable set of initial values. (Thankfully, all of these points, except for $0$ itself, have the opposite sign from $r$, so there is no reason you should ever encounter them.)
