Here is one approach to understanding Newton's method that generalizes easily to your situation. Let $n(x)=x-f(x)/f'(x)$. Newton's sequence approximating a root is defined recursively by $x_{k+1}=x_k-f(x_k)/f'(x_k)$ or $x_{k+1}=n(x_k)$. Thus, if $c$ is a root of $f$, we are interested in the difference $n(x)-c$, when $x$ is close to $c$. This can be estimated using a series expansion of $n$ about $c$:
$$n(x) \approx c+\frac{f''(c)}{2 f'(c)}(x-c)^2+O\left((x-c)^3\right).$$
From here, it's easy to see that the difference between $n(x)$ and $c$ is proportional to $(x-c)^2$; that is, we expect quadratic convergence of the sequence defined recursively by $x_{k+1}=n(x_k)$.
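A quick numerical check of this quadratic behavior (the concrete choice $f(x)=x^2-2$ and the starting point are mine, just for illustration):

```python
# Numerical check of quadratic convergence: the error is roughly
# squared at each step, so the number of correct digits doubles.
# Example function (an illustrative choice): f(x) = x^2 - 2, root c = sqrt(2).
import math

f = lambda x: x * x - 2.0
df = lambda x: 2.0 * x

c = math.sqrt(2.0)
x = 2.0
errors = []
for _ in range(5):
    x = x - f(x) / df(x)      # x_{k+1} = n(x_k)
    errors.append(abs(x - c))

# The series expansion above predicts
#   errors[k+1] / errors[k]**2  ≈  f''(c) / (2 f'(c))  =  1 / (2 sqrt(2)),
# which the computed errors confirm.
```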
Now, try the same thing with $n(x)=x-f(x)/f'(x_0)$, where $x_0$ is the fixed first term in your sequence. Again, expanding $n$ about $c$, where $f(c)=0$, we get
$$n(x) \approx c+\left(1-\frac{f'(c)}{f'\left(x_0\right)}\right)(x-c) - \frac{f''(c)}{2 f'\left(x_0\right)}(x-c)^2 +O\left((x-c)^3\right).$$
Quadratic convergence is now lost due to the first-order term.
We can use this series to help formulate some concrete examples. The critical issue is the absolute value of $1-f'(c)/f'(x_0)$:
- If $|1-f'(c)/f'(x_0)|>1$, we have divergence,
- If $0<|1-f'(c)/f'(x_0)|<1$, we have linear convergence,
- If $1-f'(c)/f'(x_0) = 0$, we have quadratic convergence.
Of course, those statements all assume that $x_0$ is sufficiently close to $c$.
Now, consider examples of the form $f(x)=x^2-c^2$, which has a root at $x=c$. Then, our series expansion becomes
$$n(x) \approx c + \left(1-\frac{c}{x_0}\right)(x-c) - \frac{1}{2x_0}(x-c)^2+O\left((x-c)^3\right).$$
Comment: In fact, the second-order expansion is exact for this family of functions (the remainder vanishes, since $n$ is itself quadratic in $x$), but that's not necessary for this approximation technique to work.
It's now very easy to produce specific types of behavior in this family. Whenever $0<c<x_0$, for example, we have $0<1-c/x_0<1$, so we are guaranteed linear convergence. Even more specifically, if $c=2$ and $x_0=4$, then $1-c/x_0 = 1/2$, and this modified method generates a sequence whose difference from the root $c=2$ is cut roughly in half with each iterate. On the other hand, if $c=4$ and $x_0=1$, then $1-c/x_0=-3$ and we'll generate a divergent sequence.
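Both cases are easy to verify numerically. A sketch, using the same $f(x)=x^2-c^2$ and the two parameter choices above:

```python
# Frozen-derivative iteration n(x) = x - f(x)/f'(x_0) for f(x) = x^2 - c^2,
# i.e. n(x) = x - (x^2 - c^2)/(2 x_0), for the two cases discussed above.

def modified_newton(c, x0, steps):
    x, errs = x0, []
    for _ in range(steps):
        x = x - (x * x - c * c) / (2.0 * x0)
        errs.append(abs(x - c))
    return errs

# c = 2, x0 = 4: since 1 - c/x0 = 1/2, the error is roughly halved each step.
lin = modified_newton(2.0, 4.0, 8)

# c = 4, x0 = 1: since 1 - c/x0 = -3, the iterates blow up.
div = modified_newton(4.0, 1.0, 8)
```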
Finally, consider $f(x)=2 x - x^3 + x^5$. Note that $c=0$ is a root of $f$ and that $f'(c)=2$. Furthermore, $f'(\sqrt{3/5})=2$. Thus, if we start this modified Newton's method at $x_0=\sqrt{3/5}$, then we might expect quadratic convergence. In fact, we get even better as $n(x) = (x^3-x^5)/2$ and any $x_0$ in $[0,1]$ leads to a sequence with cubic convergence.
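The cubic convergence of this last example can also be checked directly (the starting point $x_0=0.5$ is an arbitrary choice in $(0,1)$):

```python
# f(x) = 2x - x^3 + x^5 with the derivative frozen at f'(x0) = 2 gives
# n(x) = (x^3 - x^5)/2.  The root is c = 0, and since n(x)/x^3 -> 1/2
# as x -> 0, the error is (asymptotically) halved-and-cubed each step.
errs = []
x = 0.5                       # any start in (0, 1) works
for _ in range(4):
    x = (x**3 - x**5) / 2.0   # x_{k+1} = n(x_k)
    errs.append(abs(x))       # error relative to the root c = 0
```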
Congratulations, you have reinvented the Durand-Kerner method (1960), which was already developed by Weierstraß in 1891.
It has quite satisfying convergence, but may show a rather slow initial phase until the root approximations are all close enough to the exact roots. (There used to be Java applets that demonstrated this nicely, especially for symmetric configurations of roots and initial values, but they are now virtually unusable and sometimes no longer even to be found.)
Sketch: suppose $z \neq 0$, and let $r=z^{1/3}$ and $g(x)=x-\frac{x^3-z}{3x^2}$. There are two cases:
If $x>r$ then $x>g(x)>r$. This implies the sequence of iterates $g^n(x)$ is decreasing and bounded below, so it is convergent. The limit of a recursive sequence defined by a continuous mapping must be a fixed point of the mapping, and the only fixed point of $g$ is $r$.
If $x<r$ then it is true that $x<g(x)$ (so that you move toward the root). But in general it is not true that $g(x)<r$, so the sequence is no longer guaranteed to be monotone. So the error may not go to zero monotonically, either. As a workaround, you can show that eventually $g^n(x)>r$, at which point the previous case applies.
To make this work out, you will need to additionally assume that $g^n(x)$ is never exactly zero, which rules out a countable set of initial values. (Thankfully, all of these points, except for $0$ itself, have the opposite sign from $r$, so there is no reason you should ever encounter them.)
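A small numerical sketch of the two cases (the values $z=10$ and the starting point $x_0=0.5$ below $r$ are illustrative choices):

```python
# Newton iteration for the cube root: g(x) = x - (x^3 - z)/(3 x^2).
# Starting below the root r = z**(1/3), the first step overshoots past r;
# from then on the iterates decrease monotonically to r, as in the
# first case of the sketch above.

def g(x, z):
    return x - (x**3 - z) / (3.0 * x * x)

z = 10.0
r = z ** (1.0 / 3.0)
x = 0.5                      # start below the root (second case)
xs = [x]
for _ in range(25):
    x = g(x, z)
    xs.append(x)
```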