[Math] Iterative approaches to the solution of nonlinear equations $f(x) = 0$

numerical methods

I am currently studying for a midterm, and I am reviewing the following methods:

  • Fixed point method
  • Bisection method
  • Regula Falsi method
  • Newton-Raphson
  • Accelerated Newton-Raphson
  • Secant

I know how to use the methods; however, I am more concerned with when to use them. The book has asked me several times whether a given method can be used on a given equation, but I just can't seem to come up with an answer.

Example $1$: Let $g(x) = x^2+x-4$. Can fixed-point iteration be used to find the solution(s) of the equation $x=g(x)$? Why?

The answer is No.

Example $2$: Can Newton-Raphson iteration be used to solve $f(x) = 0$ if $f(x) = x^{1/3}$? Why?

Also No.

Can someone quickly summarize when it is appropriate to use a method and when it is not?

Much appreciated!

Best Answer

First question: We look at the equation $x=g(x)=x^2+x-4$. The solutions are $x=\pm 2$. If we are phenomenally lucky, and, say, make the initial estimate $x=2$, everything is fine. Now let's see what happens if we make an initial estimate which is good but not dead on. Say we are using fixed point iteration to estimate the positive root $x=2$.

The problem is that $x=2$ is a repelling fixed point. The reason it is repelling is that $g'(x)=2x+1$, so near $x=2$, $|g'(x)|$ is near $5$, quite a bit greater than $1$.

Roughly speaking, if we are dealing with a "nice" function, and the derivative at the root has absolute value substantially less than $1$, the root will be an attracting fixed point. In that case, if we start close enough to the root, fixed point iteration sucks us into the root. But if the absolute value of the derivative is greater than $1$, we are driven away from the root.
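To make the criterion concrete, here is a minimal Python sketch of plain fixed-point iteration; the helper name `fixed_point` and the attracting example $g(x)=\cos x$ (whose derivative has absolute value well below $1$ near its fixed point) are my own illustrative choices, not part of the question. The repelling case for the question's $g$ is shown in a second sketch further down.

```python
import math

def fixed_point(g, x0, steps):
    """Return the successive iterates x0, g(x0), g(g(x0)), ..."""
    xs = [x0]
    for _ in range(steps):
        xs.append(g(xs[-1]))
    return xs

# Attracting case: g(x) = cos(x) has a fixed point near 0.739, and
# |g'(x)| = |sin(x)| < 1 there, so the iterates are drawn toward it.
for x in fixed_point(math.cos, 1.0, 8):
    print(f"{x:.6f}")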

One can draw a very nice picture of the process, and see the repulsion at work. In the absence of a picture, we will use a formula. Note that $g(2)=2$. Let $x$ be near $2$ but not equal to $2$. We have $$\frac{g(x)-2}{x-2}=\frac{g(x)-g(2)}{x-2}\approx g'(2)=5.$$ Thus if $x$ is "near" $2$, then $$g(x)-2\approx 5(x-2).$$ This means that $g(x)$ is about $5$ times as far from $2$ as $x$ is from $2$.

If we have a good estimate $x_n$ for the positive root, then $x_{n+1}$, the next estimate, is $g(x_n)$, and thus $x_{n+1}$ is about $5$ times further away from the root than $x_n$ was.

The same issue arises at the negative root $x=-2$. We have $g'(-2)=-3$, so the derivative has absolute value $3$. Again, $x=-2$ is a repelling fixed point.
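Here is a quick numerical check of that repulsion, again only a sketch: starting $0.001$ away from each root and applying $x \mapsto g(x)$, the error grows by roughly the factor $|g'(\text{root})|$ per step ($5$ at $x=2$, $3$ at $x=-2$).

```python
def g(x):
    return x**2 + x - 4

for root, x in [(2.0, 2.001), (-2.0, -2.001)]:
    print(f"root {root}:")
    for n in range(4):
        # the error is multiplied by roughly |g'(root)| each step
        print(f"  step {n}: x = {x:.6f}, error = {abs(x - root):.6f}")
        x = g(x)
```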

Second question: For the Newton-Raphson process question, the root is of course $0$. The problem is that the derivative of $x^{1/3}$ blows up as we approach the root. In a sense, the issue is somewhat similar to the one in the first question, since Newton-Raphson can be thought of as a sophisticated form of fixed point iteration. We are doing fixed point iteration to solve $g(x)=x$, where $g(x)=x-\frac{f(x)}{f'(x)}$.

For Newton-Raphson applied to $f(x)=0$, here is a rough guideline. It should behave fairly nicely if (i) we start close enough to the root, (ii) $f(x)$ is twice differentiable at and near the root, and (iii) $f'(x)$ is not $0$ at the root. There is another possible issue. If the root is a multiple root (a silly example: $f(x)=x^4$), then the convergence will be slow.
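As a small, purely illustrative check of the multiple-root slowdown, using the "silly example" $f(x)=x^4$ above: the Newton step simplifies to $x_{n+1}=\tfrac34 x_n$, so the error shrinks only by a constant factor per step instead of roughly squaring.

```python
def newton_step(f, fprime, x):
    return x - f(x) / fprime(x)

# Quadruple root at 0: each step is x -> (3/4) x, i.e. slow, linear convergence.
x = 1.0
for n in range(6):
    print(n, x)
    x = newton_step(lambda t: t**4, lambda t: 4 * t**3, x)
```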

For the particular example in the question, we can compute explicitly and see very clearly what happens.

Recall that the Newton-Raphson iteration for approximating solutions of $f(x)=0$ is given by $$x_{n+1}=x_n -\frac{f(x_n)}{f'(x_n)}.$$
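In code, the iteration above might look like the following minimal sketch; the function name `newton` and the well-behaved test problem $f(x)=x^2-2$ (a simple root at $\sqrt2$ with $f'\neq 0$ there) are my own choices, used only to show the formula in action.

```python
def newton(f, fprime, x0, steps):
    """Apply x_{n+1} = x_n - f(x_n)/f'(x_n) a fixed number of times."""
    x = x0
    for _ in range(steps):
        x = x - f(x) / fprime(x)
    return x

# Well-behaved case: converges to sqrt(2) ~ 1.41421356 in a handful of steps.
print(newton(lambda x: x**2 - 2, lambda x: 2 * x, 1.0, 5))
```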

Let $f(x)=x^{1/3}$. Then, at least for positive $x$, we have $f'(x)=(1/3)x^{-2/3}$. Substitute in the Newton-Raphson iteration. After some simplification, we obtain $$x_{n+1}=-2x_n.$$ Very bad news! Even if we are "lucky" enough to start with a good approximation $x_0$ to the root, say $x_0=1/1000$, after one iteration we will be $2$ times as far away from the truth (and on the other side of it), after two iterations we will be $4$ times as far away, and so on.
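A quick numerical check of that blow-up, again only a sketch: the `cbrt` helper below is my own, computing the real cube root (in Python, `x ** (1/3)` does not give the real cube root for negative floats), and the iterates indeed alternate in sign while doubling in size.

```python
import math

def cbrt(x):
    """Real cube root of x."""
    return math.copysign(abs(x) ** (1.0 / 3.0), x)

def f(x):
    return cbrt(x)

def fprime(x):
    # f'(x) = (1/3) x^(-2/3), written via the real cube root
    return 1.0 / (3.0 * cbrt(x) ** 2)

# Starting very close to the root at 0, each Newton step sends x to -2x,
# so the iterates flip sign and move twice as far from the root every time.
x = 1e-3
for n in range(6):
    print(n, x)
    x = x - f(x) / fprime(x)
```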
