An equation $f(x)=0$ has two roots in the interval between 0 and 8. For what starting value, or under what conditions, will the Newton-Raphson method fail to converge?
[Math] At what point does the Newton-Raphson method fail to converge
numerical methods
Related Solutions
To visualize geometrically what's going on, I coded an interactive diagram with GNU Dr. Geo (free software of mine) in which I can drag the initial value (the red dot) and see whether the method converges or not.
For example, $x \mapsto \cos x + x$ comes with some traps, but $x \mapsto \cos x + 1.1x$ does not.
When you get close to a flat area (where $f'$ is nearly zero), the tangent sends you far away, possibly even further from the root than your initial value.
Your best option is to start close to the root, in a region where the derivative does not vanish. That is the tough part, as it depends on the function; visualizing can help.
This article explains how to code this interactive diagram with Pharo+DrGeo, and its link with Heron's method from classical antiquity.
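The flat-tangent failure described above can be sketched in plain Python (rather than Dr. Geo). The function $f(x) = \arctan x$ and the starting points are my own illustration, not from the answer: the derivative $1/(1+x^2)$ flattens for large $|x|$, so a start too far from the root $x=0$ gets thrown further out at every step.

```python
import math

def newton(f, fprime, x0, steps=8):
    """Plain Newton iteration: x <- x - f(x)/f'(x)."""
    x = x0
    for _ in range(steps):
        x = x - f(x) / fprime(x)
    return x

# f(x) = arctan(x): the derivative 1/(1+x^2) flattens out for large |x|,
# so a tangent line drawn far from the root overshoots badly.
f = math.atan
fp = lambda x: 1.0 / (1.0 + x * x)

print(newton(f, fp, 1.0))                # close start: converges to the root 0
print(abs(newton(f, fp, 2.0, steps=5)))  # far start: iterates fly off to huge values
```

For this particular $f$, the basin of convergence is a bounded interval around the root; starting at $x_0 = 2$ each tangent lands further from 0 than the last.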
The usual method is slow for double roots because of the following Taylor series argument. Assume $f$ is twice continuously differentiable and $r$ is a simple root, i.e. $f(r)=0$ and $f'(r) \neq 0$. Then
\begin{eqnarray*}\left ( x - \frac{f(x)}{f'(x)} \right ) - r & = & x - r - \left ( x - r - \frac{f''(r)}{2 f'(r)} (x-r)^2 + o((x-r)^2) \right ) \\ & = & \frac{f''(r)}{2 f'(r)} (x-r)^2 + o((x-r)^2) \end{eqnarray*}
What I did here was Taylor expand $\frac{f(x)}{f'(x)}$ to second order about $x=r$, and then substitute in the fact that $f(r)=0$, which cancels a lot of terms. The result means that when $r$ is a simple root, the method converges quadratically. Roughly speaking, this means that the number of correct digits doubles at each step, once the error is small enough. On the other hand, if $r$ is a double root (i.e. $f(r)=f'(r)=0$ and $f''(r) \neq 0$), the situation is different. Here we cannot expand $f(x)/f'(x)$ about $x=r$, because this function is not defined there. Instead we must do the following:
\begin{eqnarray*}\left ( x - \frac{f(x)}{f'(x)} \right ) - r & = & x-r - \frac{1/2 f''(r) (x-r)^2 + o((x-r)^2)}{f''(r)(x-r) + o(x-r)} \\ & = & x-r -\frac{1/2 (x-r) + o(x-r)}{1 + o(1)} \\ & = & x-r - (1/2 (x-r) + o(x-r))(1+o(1)) \\ & = & 1/2 (x-r) + o(x-r) \end{eqnarray*}
This means the method converges linearly with a coefficient of $1/2$: the error is approximately halved at each step, once it is small enough, so you gain roughly one correct binary digit per step. This gets even worse for higher-order roots: for a root of order $n$, the coefficient is $1-\frac{1}{n}$.
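The two convergence rates derived above can be observed numerically. As a minimal sketch (the polynomials are my own toy examples), compare Newton's method on $f(x) = x^2 - 1$, which has a simple root at $1$, against $g(x) = (x-1)^2$, which has a double root there:

```python
def newton_errors(f, fprime, x0, root, steps=6):
    """Run Newton's method, recording |x_k - root| after each step."""
    x, errs = x0, []
    for _ in range(steps):
        x = x - f(x) / fprime(x)
        errs.append(abs(x - root))
    return errs

# Simple root: f(x) = x^2 - 1, root r = 1, f'(1) = 2 != 0.
simple = newton_errors(lambda x: x*x - 1, lambda x: 2*x, 1.5, 1.0)
# Double root: g(x) = (x-1)^2, root r = 1, g(1) = g'(1) = 0.
double = newton_errors(lambda x: (x - 1)**2, lambda x: 2*(x - 1), 1.5, 1.0)

print(simple)  # errors roughly square each step: quadratic convergence
print(double)  # errors exactly halve each step: linear, coefficient 1/2
```

For the double root, the Newton step simplifies algebraically to $x - (x-1)/2$, so the error is halved exactly, matching the coefficient $1/2$ computed above.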
There is a modified Newton method which can detect proximity to a non-simple root and adjust the iteration in such a way that quadratic convergence is recovered. If you know that the order of the root is $n$, then you can use
$$x_{k+1} = x_k - n \frac{f(x_k)}{f'(x_k)}.$$
A similar argument to the first one shows that this converges quadratically. If you don't know the order of the root, there is a method which can estimate it: see http://mathfaculty.fullerton.edu/mathews/n2003/NewtonAccelerateMod.html
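As a small sketch of the multiplicity-corrected step above (the polynomial is a toy example of my choosing, with a double root at $x = 1$, so $n = 2$):

```python
def iterate(step, x0, root, steps=5):
    """Apply an iteration map repeatedly, recording |x_k - root|."""
    x, errs = x0, []
    for _ in range(steps):
        x = step(x)
        errs.append(abs(x - root))
    return errs

# Toy example: f(x) = (x-1)^2 (x+1) has a double root at x = 1,
# with f'(x) = (x-1)(3x+1).
f = lambda x: (x - 1)**2 * (x + 1)
fp = lambda x: (x - 1) * (3*x + 1)

plain = iterate(lambda x: x - f(x) / fp(x), 1.5, 1.0)
# Multiplicity-corrected step with n = 2: x <- x - n f(x)/f'(x).
corrected = iterate(lambda x: x - 2 * f(x) / fp(x), 1.5, 1.0)

print(plain)      # error roughly halves each step (linear)
print(corrected)  # error roughly squares each step (quadratic)
```

After five steps the plain iteration still carries an error of a few percent, while the corrected one is at machine precision.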
Edit: the symbol $o(f(x))$, called "little oh notation", is a standard shorthand. It means an unspecified function $g(x)$ such that $g(x)/f(x) \to 0$ as $x$ tends to something specified by context (usually $0$ or $\infty$, in this case $0$). So for example, calculus tells us that
$$\lim_{h \to 0} \frac{f(x+h) - f(x) - h f'(x)}{h} = 0$$
which is the same as
$$f(x+h) - f(x) - h f'(x) = o(h)$$
Little oh notation has a weaker counterpart, called "Big oh notation". That is, the symbol $O(f(x))$ means an unspecified function $g(x)$ such that $g(x)/f(x)$ is bounded as $x$ tends to something specified by context.
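A quick numeric check of the little-oh statement above, using $f = \exp$ at $x = 0$ (an example of my choosing): the remainder $f(0+h) - f(0) - h f'(0) = e^h - 1 - h$ behaves like $h^2/2$, so dividing it by $h$ gives a quantity that tends to $0$ as $h \to 0$.

```python
import math

# Remainder of the first-order Taylor approximation of exp at 0,
# computed with expm1 to avoid cancellation for small h.
for h in [1e-1, 1e-2, 1e-3, 1e-4]:
    remainder = math.expm1(h) - h          # e^h - 1 - h
    print(h, remainder / h)                # ratio ~ h/2, shrinking with h
```

Each tenfold decrease in $h$ shrinks the ratio by roughly tenfold, which is exactly what $o(h)$ promises (and, since the ratio is $\approx h/2$, the remainder is in fact $O(h^2)$).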
Best Answer
See Wikipedia. Since you're asking about a function with a multiple root, you probably want the answer in section 3.