Convergence of algorithms (bisection, fixed point, Newton's method, secant method)

algorithms, bisection, convergence-divergence, fixed-point-theorems, numerical-methods

Consider solving the equation $\exp(x) - x - 1 = 0$.

It is obvious that $x^* = 0$ is a solution of the equation. Suppose you don't know this and you would like to approximate $x^*$ numerically by one of the following four methods. Which method will lead to a convergent algorithm for approximating $x^*$?

Why are the others not suitable?

The four methods are the bisection method, fixed-point iteration, Newton's method, and the secant method.

Attempt: I tried each method on the equation, but I found that each appears to converge, which I know is not correct.

For fixed-point iteration, I put the equation in the form $x = \exp(x) - 1$.
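Trying this form numerically (a minimal Python sketch; the starting value $0.1$ is an arbitrary choice) suggests the iterates actually drift away from the root rather than toward it:

```python
import math

# Fixed-point iteration x_{n+1} = exp(x_n) - 1 for the form x = exp(x) - 1.
# Note g(x) = exp(x) - 1 has g'(0) = 1, so g is not a contraction near the
# root, and from any positive start the iterates increase.
x = 0.1  # arbitrary positive starting value (an assumption for illustration)
for _ in range(10):
    x = math.exp(x) - 1.0
print(x)  # larger than the start: moving away from the root x* = 0
```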

I don't know where to start in terms of the "interval" to begin with, i.e., with bisection the first thing I do is try $x=0$ and I find that it is the root. So am I done? Is the bisection method the "correct" method in this case?

Best Answer

$$ \exp(x)-1-x=\frac12x^2\left(1+\frac13x+\frac1{12}x^2+\cdots\right) $$ has a minimum at $x=0$ and behaves locally like $\frac12 x^2$.
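This local behaviour is easy to check numerically (a small Python sketch): the ratio $f(x)/(x^2/2)$ tends to $1$ as $x \to 0$.

```python
import math

def f(x):
    return math.exp(x) - 1.0 - x

# f(x) / (x^2 / 2) -> 1 as x -> 0, confirming f behaves like x^2 / 2
for x in [1.0, 0.1, 0.01, 0.001]:
    print(x, f(x) / (0.5 * x * x))
```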

*(plot of the function)*

Note that you are pretending not to know the root, so starting any of the methods exactly at the point $x=0$ has to be considered extremely unlikely.

  • bisection method: fails before the first step, since all function values away from $x=0$ are positive; no interval with a sign change exists,
  • fixed-point iteration: convergence depends on the chosen iteration function; for the form $x=\exp(x)-1$, the iteration function $g(x)=\exp(x)-1$ has $g'(0)=1$, so it is not a contraction near the root,
  • Newton's method: converges, but only linearly, because $x^*=0$ is a double root (the quadratic convergence of Newton's method requires a simple root), and
  • secant method: convergence, as always, depends on the initial pair. If the secant becomes nearly horizontal, the iteration can jump far away from the root. It might recover, or it might wander chaotically.
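The linear convergence of Newton's method at this double root can be observed numerically (a minimal Python sketch; the starting point $x_0=1$ is an arbitrary choice): successive error ratios settle near $1/2$.

```python
import math

def f(x):  return math.exp(x) - 1.0 - x   # double root at x = 0
def df(x): return math.exp(x) - 1.0       # derivative of f

x = 1.0          # arbitrary starting point away from the root
errs = []
for _ in range(15):
    x -= f(x) / df(x)                     # Newton step
    errs.append(abs(x))                   # error, since the root is 0

# successive error ratios tend to 1/2: linear convergence with rate 1/2,
# as expected for Newton's method at a double root
ratios = [e2 / e1 for e1, e2 in zip(errs, errs[1:])]
print(ratios[:5])
```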