For iterative methods, we have a fixed point formula in the form:
$$\tag 1 x_{n+1} = g(x_n)$$
The Newton iteration is given by:
$$\tag 2 \displaystyle x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}$$
So, $(2)$ is of the form $(1)$.
Since $r$ is a root of $f(x) = 0$, we have $r = g(r)$. Since $x_{n+1} = g(x_n)$, we can write:
$$x_{n+1} - r = g(x_n) - g(r).$$
Let's expand $g(x_n)$ as a Taylor series about $r$ in powers of $(x_n - r)$, with the second-derivative term as the remainder:
$$g(x_n) = g(r)+g'(r)(x_n-r) + \frac{g''(\xi)}{2}(x_n-r)^2$$
where $\xi$ lies between $x_n$ and $r$. Because $f(r) = 0$ ($r$ is a root), the first-derivative term vanishes:
$$g'(r) = \frac{f(r)f''(r)}{[f'(r)]^2} = 0,$$
so:
$$g(x_n) = g(r) + \frac{g''(\xi)}{2}(x_n-r)^2.$$
Letting $x_n-r = e_n$, we have:
$$e_{n+1} = x_{n+1}-r = g(x_n) - g(r) = \frac{g''(\xi)}{2}(e_n)^2.$$
Each successive error is proportional to the square of the previous error; that is, Newton's method is quadratically convergent.
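As a quick numerical sketch of this quadratic behavior (my choice of $f(x) = x^2 - 2$ and the starting point are illustrative assumptions, not from the derivation above), the ratio $e_{n+1}/e_n^2$ should settle near the constant $g''(r)/2 = f''(r)/(2f'(r))$:

```python
import math

def newton(f, fprime, x0, tol=1e-14, max_iter=50):
    """Newton iteration x_{n+1} = x_n - f(x_n)/f'(x_n); returns all iterates."""
    x = x0
    history = [x]
    for _ in range(max_iter):
        x = x - f(x) / fprime(x)
        history.append(x)
        if abs(f(x)) < tol:
            break
    return history

# f(x) = x^2 - 2 has the simple root r = sqrt(2)
r = math.sqrt(2)
xs = newton(lambda x: x * x - 2, lambda x: 2 * x, x0=2.0)
errors = [abs(x - r) for x in xs]

# Quadratic convergence: e_{n+1}/e_n^2 approaches f''(r)/(2 f'(r)) = 1/(2 sqrt(2))
for e0, e1 in zip(errors, errors[1:]):
    if e0 > 1e-8:  # ratios lose meaning once errors hit machine precision
        print(e1 / e0 ** 2)
```

The printed ratios approach $1/(2\sqrt 2) \approx 0.3536$, confirming $e_{n+1} \approx C\,e_n^2$.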
Multiple Root
Following the same sort of reasoning, if $x_n$ is near a root $\xi$ of multiplicity $\delta \ge 2$, then:
$$f(x) \approx \frac{(x-\xi)^\delta}{\delta !}f^{(\delta)}(\xi)$$
$$f'(x) \approx \frac{(x-\xi)^{\delta-1}}{(\delta-1) !}f^{(\delta)}(\xi)$$
Dividing these, $\dfrac{f(x_n)}{f'(x_n)} \approx \dfrac{x_n-\xi}{\delta}$, so we have:
$$\tag 3 x_{n+1} -\xi = x_n - \xi -\frac{f(x_n)}{f'(x_n)} = \left(\frac{\delta -1}{\delta}\right)(x_n - \xi)$$
The error on the RHS of $(3)$ is multiplied by the constant factor $(\delta-1)/\delta$ at each step rather than squared; hence we have linear convergence.
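The factor $(\delta-1)/\delta$ is easy to see numerically. For $f(x) = x^2$ (a double root at $0$, so $\delta = 2$; this test function is my own illustrative choice) the Newton step simplifies algebraically to $x_{n+1} = x - x^2/(2x) = x/2$, so the error is exactly halved each step:

```python
# For f(x) = x^2, which has a double root at 0 (delta = 2), Newton's step is
# x_{n+1} = x - x^2/(2x) = x/2: the error shrinks by exactly the predicted
# linear factor (delta - 1)/delta = 1/2 each iteration.
x = 1.0
ratios = []
for _ in range(5):
    x_next = x - (x * x) / (2 * x)
    ratios.append(x_next / x)
    x = x_next
print(ratios)  # every ratio is 0.5
```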
You should be able to use this with your approach to clean up what you did.
I am confused about what you wrote after your derivation, but I am going to guess that you want to figure out the convergence rate for this $f(x)$.
We are given:
$$f(x) =x^2(x-1)$$
There are two roots of this equation:
- $x = 0$ (a double root)
- $x = 1$ (a single root)
So, we would expect linear convergence at the double root and quadratic convergence at the single root.
The Newton iteration is given by:
$$x_{n+1} = x_n - \frac{(x_n-1) x_n^2}{x_n^2+2 (x_n-1) x_n}$$
For the first root, let's pick a starting point of $x_0 = 0.1$. We get the following result:
- $24$ steps to converge to the root, ending at $x = 5.341441708552285 \times 10^{-9}$ (yikes!)
For the second root, let's pick a starting point of $x_0 = 1.4$. We get the following result:
- $6$ steps to converge to the root $x = 1.000000000000000$ (much better!)
Now, you would use the exact roots, compare the iterates against them numerically, and show the convergence rate for each case.
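Here is a sketch of how one might reproduce those two runs (the exact step counts depend on the stopping tolerance, so the numbers may differ somewhat from the counts quoted above):

```python
def newton_count(x0, tol=1e-12, max_iter=100):
    """Newton iteration for f(x) = x^2 (x - 1); returns (steps, final x)."""
    x = x0
    for n in range(1, max_iter + 1):
        f = x * x * (x - 1)
        fp = 3 * x * x - 2 * x
        x_new = x - f / fp
        if abs(x_new - x) < tol:
            return n, x_new
        x = x_new
    return max_iter, x

steps0, root0 = newton_count(0.1)   # many steps: linear convergence toward the double root 0
steps1, root1 = newton_count(1.4)   # a handful of steps: quadratic convergence toward 1
print(steps0, root0)
print(steps1, root1)
```

The run toward $0$ takes far more steps than the run toward $1$, which is exactly the linear-versus-quadratic distinction derived above.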
Note: one must choose a suitable starting point that will converge to one root or the other. Based on that initial selection, the rate will be quadratic when the algorithm converges to $1$ and linear when it converges to $0$. We pick a nearby starting point and see where we end up. You could also graph the function to get an idea about starting points. We typically do not know a priori which roots will give us which behavior. Obviously there is a range of starting points from which the iteration converges to one root or the other; I am not sure how one would calculate that analytically, because at that point you may as well find the roots without numerical methods.
Best Answer
Let $c$ be a fixed point of a function $g$, i.e. $g(c) = c$. I will assume the definition that the iteration sequence $x_k = g^{k}(x_0)$ has order of convergence $p \geq 1$ if $$\lim_{k \to \infty}\frac{|x_{k + 1} - c|}{|x_{k} - c|^p} = C$$ for some $C \neq 0$. If $p = 1$ it is required that $C < 1$. Of course definitions vary from place to place, so adjust accordingly. In your example, $$\lim_{k \to \infty}\frac{|x_{k + 1} - 0|}{|x_{k} - 0|} = \frac{1}{2} < 1$$ so the convergence is linear.
Here is a rather simple test for convergence rate:
Suppose you already know $x_k$ converges to $c$ and you want to find the order of convergence. Let $p$ be the smallest positive integer for which $g^{(p)}(c) \neq 0$. Then by the Lagrange remainder theorem, $$x_{k + 1} = g(x_k) = g(c) + \frac{g^{(p)}(\xi_k)}{p!}(x_k - c)^p = c + \frac{g^{(p)}(\xi_k)}{p!}(x_k - c)^p,$$ where $\xi_k$ is between $c$ and $x_k$. By the squeeze theorem, $\lim_{k \to \infty}\xi_k = c$. Hence, assuming $g^{(p)}$ is continuous at $c$, $$\lim_{k \to \infty}\frac{|x_{k + 1} - c|}{|x_{k} - c|^p} = \frac{|g^{(p)}(c)|}{p!},$$ so the convergence is of order $p$, except that if $p = 1$ it is also necessary that $|g'(c)| < 1$.
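In practice one can estimate $p$ directly from three consecutive errors using $p \approx \log(e_{k+1}/e_k)/\log(e_k/e_{k-1})$, a standard heuristic that follows from $e_{k+1} \approx C e_k^p$. A sketch (the test function $f(x) = x^2 - 2$ and starting point are illustrative assumptions):

```python
import math

def estimate_order(g, c, x0, steps=4):
    """Estimate the order of convergence of x_{k+1} = g(x_k) toward the
    fixed point c from the last three errors (kept above machine precision)."""
    x = x0
    errs = []
    for _ in range(steps):
        x = g(x)
        errs.append(abs(x - c))
    e0, e1, e2 = errs[-3:]
    return math.log(e2 / e1) / math.log(e1 / e0)

# Newton map for f(x) = x^2 - 2 at the simple root sqrt(2): expect p close to 2
p = estimate_order(lambda x: x - (x * x - 2) / (2 * x), math.sqrt(2), 2.0)
print(p)
```

Keep `steps` small enough that the errors stay above machine precision, or the log ratios become meaningless.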
Whether or not the iteration converges, and what fixed point it converges to depends on the function $g$ and the starting point $x_0$.
Edit: To find a solution $c$ to $f(c) = 0$, Newton's method constructs a function $$g(x) = x - \frac{f(x)}{f'(x)}$$ and tries to find a fixed point of $g$ via the iteration $x_k = g^{k}(x_0)$ since $g(c) = c \iff f(c) = 0$. So $$g'(x) = \frac{f(x)f''(x)}{f'(x)^2},$$ $$g''(x) = \dots$$ etc. You can verify that if $c$ is a root of $f$ of multiplicity $m$, that is, $$f(x) = (x - c)^{m}h(x)$$ for some function $h$ with $h(c) \neq 0$, then $g'(c) = 1 - \frac{1}{m}$ and thus the convergence is only linear when $m > 1$.
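The claim $g'(c) = 1 - \frac{1}{m}$ can be checked numerically. A minimal sketch, assuming the simplest case $h(x) \equiv 1$, i.e. $f(x) = (x-c)^m$ with $c = 1$ (both choices are mine, for illustration):

```python
def g(x, m, c=1.0):
    """Newton map x - f(x)/f'(x) for f(x) = (x - c)^m."""
    f = (x - c) ** m
    fp = m * (x - c) ** (m - 1)
    return x - f / fp

# Central finite difference for g'(c); avoid x = c itself, where f/f' is 0/0.
h = 1e-6
for m in (1, 2, 3, 4):
    slope = (g(1.0 + h, m) - g(1.0 - h, m)) / (2 * h)
    print(m, slope)  # slope ~ 1 - 1/m
```

For $m = 1$ the slope is $0$ (quadratic convergence), while for $m = 2, 3, 4$ it is $\tfrac12, \tfrac23, \tfrac34$: linear convergence, and slower the higher the multiplicity.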