Convergence Rate and Newton’s Method

Tags: nonlinear-system, numerical-methods

I have a few nonlinear equations that I am solving with Newton's method, and I am mainly interested in computing the convergence rate (or lack thereof) for each:

  • $x^2 = 0$
  • $x^3 = 0$
  • $x + x^3 = 0$
  • $x + x^4 = 0$

Taking the first equation as an example:
$$x_{k+1} = x_k - \frac{x_k^2}{2x_k} = \frac{x_k}{2}$$
Now my textbook simply states that $x_k$ converges to zero, but only at a linear rate. Apparently this is obvious, but I don't see why. It also goes on to state:
$$\lim_{x \to 0}\frac{|\nabla^2 f(x)|}{|\nabla f(x)|} = \infty,$$
which means the constant that would govern quadratic convergence is unbounded.
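For concreteness, here is a quick numerical check I ran (a minimal sketch in Python; the starting point and step count are my own choices, not from the textbook):

```python
# Newton's method on f(x) = x^2, i.e. x_{k+1} = x_k - f(x_k)/f'(x_k) = x_k / 2.
# The root is x* = 0; print the error ratio |e_{k+1}| / |e_k| at each step.
x = 1.0  # arbitrary starting point
for k in range(10):
    x_next = x - x**2 / (2 * x)   # Newton update, simplifies to x / 2
    print(f"k={k:2d}  x={x_next: .6e}  ratio={abs(x_next) / abs(x):.3f}")
    x = x_next
```

The ratio stays pinned at $0.500$ instead of shrinking, which matches the linear behaviour the book claims, but I still don't see the reason for it.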

Can someone explain the test for convergence rate of Newton's method?

From my understanding it is the following:

  1. if $|x_{k+1} - x_*| \leq \gamma\,|x_k - x_*|$, then $\{x_k\}$ is said to converge to $x_*$ linearly
  2. if $|x_{k+1} - x_*| \leq c\,|x_k - x_*|^P$, then $\{x_k\}$ is said to converge to $x_*$ with rate at least $P$ (a numerical check of this is sketched after the list)

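To make sure I am reading definition 2 correctly, I tried estimating $P$ empirically with the standard log-ratio trick (a rough sketch; the starting point is arbitrary):

```python
import math

def newton(f, df, x0, steps):
    """Newton iterates x_0, ..., x_steps for f with derivative df."""
    xs = [x0]
    for _ in range(steps):
        xs.append(xs[-1] - f(xs[-1]) / df(xs[-1]))
    return xs

# Empirical order: P ~ log(e_{k+1}/e_k) / log(e_k/e_{k-1}), where e_k = |x_k - x*|
# and the root is x* = 0 for both equations below.
for label, f, df in [("x^2", lambda x: x**2, lambda x: 2 * x),
                     ("x + x^3", lambda x: x + x**3, lambda x: 1 + 3 * x**2)]:
    errs = [abs(x) for x in newton(f, df, x0=0.5, steps=5)]
    ps = [math.log(errs[k + 1] / errs[k]) / math.log(errs[k] / errs[k - 1])
          for k in range(1, len(errs) - 1)]
    print(label, [round(p, 2) for p in ps])
```

For $x^2$ the estimate sits at $1$, while for $x + x^3$ it approaches $3$, which I also cannot explain.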
Questions:

Are $\gamma$, $c$, and $P$ simply arbitrary constants?

Why is it obvious that equation 1 converges linearly? What about the others?

This response to a similar question seems to compute the multiplicity of the roots of each equation, but divergence is never mentioned.

Best Answer

Let $c$ be a fixed point of a function $g$, i.e. $g(c) = c$. I will assume the definition that the iteration sequence $x_k = g^{k}(x_0)$ has order of convergence $p \geq 1$ if $$\lim_{k \to \infty}\frac{|x_{k + 1} - c|}{|x_{k} - c|^p} = C$$ for some finite $C \neq 0$; if $p = 1$, it is additionally required that $C < 1$. Of course, definitions vary from place to place, so adjust accordingly. In your example, $$\lim_{k \to \infty}\frac{|x_{k + 1} - 0|}{|x_{k} - 0|} = \frac{1}{2} < 1,$$ so the convergence is linear.

Here is a rather simple test for convergence rate:

Suppose you already know that $x_k$ converges to $c$ and you want to find the order of convergence. Let $p$ be the smallest positive integer for which $g^{(p)}(c) \neq 0$. Since the derivatives of order $1, \dots, p - 1$ vanish at $c$, Taylor's theorem with the Lagrange remainder gives $$x_{k + 1} = g(x_k) = g(c) + \frac{g^{(p)}(\xi_k)}{p!}(x_k - c)^p = c + \frac{g^{(p)}(\xi_k)}{p!}(x_k - c)^p,$$ where $\xi_k$ is between $c$ and $x_k$. By the squeeze theorem, $\lim_{k \to \infty}\xi_k = c$. Hence, assuming $g^{(p)}$ is continuous at $c$, $$\lim_{k \to \infty}\frac{|x_{k + 1} - c|}{|x_{k} - c|^p} = \frac{|g^{(p)}(c)|}{p!},$$ so the convergence is of order $p$, except that if $p = 1$ it is also necessary that $|g'(c)| < 1$.
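To see the test in action on your four equations, here is a symbolic check (a sketch using sympy; expanding the Newton map $g$ around the root $c = 0$ exposes the smallest non-vanishing derivative as the leading term):

```python
import sympy as sp

x = sp.symbols('x')
for f in [x**2, x**3, x + x**3, x + x**4]:
    g = sp.simplify(x - f / sp.diff(f, x))      # Newton map g(x) = x - f(x)/f'(x)
    taylor = sp.series(g, x, 0, 6).removeO()    # expansion of g around the root c = 0
    print(f"f = {f}:  g(x) = {taylor} + O(x^6)")
```

The leading exponent is exactly the order $p$ from the test: $x^2$ and $x^3$ give $p = 1$ with $|g'(0)| = \frac{1}{2}$ and $\frac{2}{3}$, both less than $1$ (linear convergence), while $x + x^3$ and $x + x^4$ give $p = 3$ and $p = 4$ respectively.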

Whether or not the iteration converges, and to which fixed point, depends on the function $g$ and the starting point $x_0$.

Edit: To find a solution $c$ to $f(c) = 0$, Newton's method constructs the function $$g(x) = x - \frac{f(x)}{f'(x)}$$ and tries to find a fixed point of $g$ via the iteration $x_k = g^{k}(x_0)$, since $g(c) = c \iff f(c) = 0$. So $$g'(x) = \frac{f(x)f''(x)}{f'(x)^2},$$ $$g''(x) = \dots$$ etc. You can verify that if $c$ is a root of $f$ of multiplicity $m$, that is, $$f(x) = (x - c)^{m}h(x)$$ for some function $h$ with $h(c) \neq 0$, then $g'(c) = 1 - \frac{1}{m}$, and thus the convergence is only linear when $m > 1$.
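As a quick sanity check of the multiplicity formula (a minimal sketch; the starting point is arbitrary), running Newton on $f(x) = x^3$, where $m = 3$, should show the error ratio settling at $g'(0) = 1 - \frac{1}{3} = \frac{2}{3}$:

```python
# Newton on f(x) = x^3: the root 0 has multiplicity m = 3,
# so the error ratio should approach g'(0) = 1 - 1/m = 2/3.
x = 1.0
for k in range(10):
    x_next = x - x**3 / (3 * x**2)   # Newton update, simplifies to 2x/3
    print(f"k={k:2d}  ratio={abs(x_next) / abs(x):.4f}")
    x = x_next
```

Here the ratio is exactly $2/3$ at every step. For $x + x^3$ and $x + x^4$ the root is simple ($m = 1$), so this linear term vanishes and the higher orders found by the test above take over.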