Numerical Methods – Modified Newton’s Method

convergence-divergence, numerical-methods

I don't understand how to find the rate of convergence of a root-finding method.

My professor drew out the standard Newton's method and explained that it has quadratic convergence. Then he asked how to find the rate of convergence if the method was modified to
$$x_{n+1}=x_n-\frac{f(x_n)}{f'(x_0)}$$

Is it still converging quadratically? How do I best show this mathematically?

UPDATE

I found a derivation of the error formula for the original method and tried to substitute my change into that proof, but it relied on a Taylor expansion of the derivative term, which doesn't appear in the same form in my problem, so that was a dead end.

Then I made a drawing using a basic $f(x) = x^2$ graph and realized that since every line drawn down to the $x$-axis has the same fixed slope $f'(x_0)$, the iteration produces similar triangles.

Then looking at the ratios of the areas of the triangles from each iteration, I should be able to derive a rate of convergence knowing that the areas of similar triangles with scale factor $a:b$ are proportionate to $a^2:b^2$.

Best Answer

Here is one approach to understanding Newton's method that generalizes easily to your situation. Let $n(x)=x-f(x)/f'(x)$. Newton's sequence approximating a root is defined recursively by $x_{k+1}=x_k-f(x_k)/f'(x_k)$ or $x_{k+1}=n(x_k)$. Thus, if $c$ is a root of $f$, we are interested in the difference $n(x)-c$, when $x$ is close to $c$. This can be estimated using a series expansion of $n$ about $c$:

$$n(x) = c+\frac{f''(c)}{2 f'(c)}(x-c)^2+O\left((x-c)^3\right).$$

From here, it's pretty easy to see that the difference between $n(x)$ and $c$ is proportional to $(x-c)^2$, i.e. you expect quadratic convergence of the sequence defined recursively by $x_k=n(x_{k-1})$.
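As a sanity check (my own sketch, not part of the original argument), a short numerical experiment illustrates the quadratic rate. Here I assume the concrete example $f(x)=x^2-2$ with root $c=\sqrt{2}$, for which the constant $f''(c)/(2f'(c)) = 1/(2\sqrt{2}) \approx 0.354$:

```python
import math

# For f(x) = x^2 - 2 with root c = sqrt(2), the error ratio
# e_{k+1} / e_k^2 should approach f''(c) / (2 f'(c)) = 1/(2*sqrt(2)) ~ 0.354.
f = lambda x: x * x - 2.0
fp = lambda x: 2.0 * x
c = math.sqrt(2.0)

x = 2.0  # starting guess
errors = [abs(x - c)]
for _ in range(5):
    x = x - f(x) / fp(x)  # standard Newton step
    errors.append(abs(x - c))

# Skip errors too close to machine precision before dividing.
ratios = [errors[k + 1] / errors[k] ** 2
          for k in range(len(errors) - 1) if errors[k] > 1e-8]
print(ratios)  # ratios settle near 1/(2*sqrt(2)) ~ 0.354
```

The near-constant final ratio is the numerical signature of quadratic convergence.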

Now, try the same thing with $n(x)=x-f(x)/f'(x_0)$, where $x_0$ is the fixed first term in your sequence. Again, expanding $n$ about $c$, where $f(c)=0$, we get

$$n(x) = c+\left(1-\frac{f'(c)}{f'(x_0)}\right)(x-c) - \frac{f''(c)}{2 f'(x_0)}(x-c)^2 +O\left((x-c)^3\right).$$

Quadratic convergence is now lost due to the first order term.

We can use this series to help formulate some concrete examples. The critical issue is the absolute value of $1-f'(c)/f'(x_0)$:

  • If $|1-f'(c)/f'(x_0)|>1$, we have divergence,
  • If $0<|1-f'(c)/f'(x_0)|<1$, we have linear convergence,
  • If $1-f'(c)/f'(x_0) = 0$, we have quadratic convergence.

Of course, those statements all assume that $x_0$ is sufficiently close to $c$.

Now, consider examples of the form $f(x)=x^2-c^2$, which has a root at $x=c$. Then, our series expansion becomes

$$n(x) = c + \left(1-\frac{c}{x_0}\right)(x-c) - \frac{1}{2x_0}(x-c)^2 + O\left((x-c)^3\right).$$

Comment: In fact, the second order approximation is exact for this family of functions, but that's not necessary for this approximation technique to work.

It's now very easy to produce specific types of behavior in this family. Whenever $0<c<x_0$, for example, we have $0<1-c/x_0<1$, so we are guaranteed linear convergence. Even more specifically, if $c=2$ and $x_0=4$, then $1-c/x_0 = 1/2$ and this modified method generates a sequence whose difference from the root $c=2$ is cut about in half with each iterate. On the other hand, if $c=4$ and $x_0=1$, then $1-c/x_0=-3$ and we'll generate a divergent sequence.
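The two cases above are easy to verify numerically. Here is a sketch of my own (not from the original answer) iterating $x_{k+1} = x_k - f(x_k)/f'(x_0)$ for $f(x) = x^2 - c^2$, with the derivative frozen at the starting point:

```python
# Modified Newton: the derivative is evaluated once, at x0, and reused.
def modified_newton(f, fp_x0, x0, steps):
    """Iterate x - f(x)/f'(x0), with f'(x0) fixed, for the given number of steps."""
    xs = [x0]
    for _ in range(steps):
        xs.append(xs[-1] - f(xs[-1]) / fp_x0)
    return xs

# Case 1: c = 2, x0 = 4, so 1 - c/x0 = 1/2 and the error should roughly halve.
xs = modified_newton(lambda x: x * x - 4.0, 8.0, 4.0, 10)
errs = [abs(x - 2.0) for x in xs]
lin_ratios = [errs[k + 1] / errs[k] for k in range(len(errs) - 1)]
print(lin_ratios)  # ratios approach 1/2: linear convergence

# Case 2: c = 4, x0 = 1, so 1 - c/x0 = -3 and the iterates blow up.
ys = modified_newton(lambda x: x * x - 16.0, 2.0, 1.0, 8)
print(abs(ys[-1]))  # enormous: divergence
```

The early error ratios in case 1 differ from $1/2$ because the second-order term still matters while the error is large; they settle toward $1/2$ as the iterates approach the root.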

Finally, consider $f(x)=2 x - x^3 + x^5$. Note that $c=0$ is a root of $f$ and that $f'(c)=2$. Furthermore, $f'(\sqrt{3/5})=2$. Thus, if we start this modified Newton's method at $x_0=\sqrt{3/5}$, then we might expect quadratic convergence. In fact, we do even better: the iteration collapses to $n(x) = (x^3-x^5)/2$, and iterating this map from any starting point in $[0,1]$ yields a sequence with cubic convergence.
