[Math] When to use Newton's, bisection, fixed-point iteration and the secant methods

bisection, fixed-point iteration, newton raphson, numerical methods, roots

I've been introduced, more or less, to these methods of finding a root of a function (a point where it intersects the $x$-axis), but I'm not sure when each should be used or what the advantages of one method are over another.

I think that it would be nice to have an answer that puts them in comparison and shows the situations where one method would be more advantageous than the other, etc.

So, my questions are:

  • When should we use one method over the other and why?

  • And what are the advantages and disadvantages of one method over another?

Also, if you want to provide a brief explanation of each method, I think the answer would be more complete and interesting.

Best Answer

You should never seriously use bisection.

If you think that derivatives are hard, use the secant method.
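For illustration, here is a minimal sketch of the secant method in Python (the function names, starting points, and tolerances are my own choices, not part of the answer). The key point is that the derivative in Newton's update is replaced by a finite difference through the last two iterates:

```python
import math

def secant(f, x0, x1, tol=1e-12, max_iter=50):
    """Secant method: approximate the derivative by the slope of the
    line through the last two iterates, so f'(x) is never needed."""
    for _ in range(max_iter):
        f0, f1 = f(x0), f(x1)
        if f1 == f0:          # secant line is flat; cannot proceed
            break
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)
        if abs(x2 - x1) < tol:
            return x2
        x0, x1 = x1, x2
    return x1

# Example: the root of cos(x) - x, which lies near 0.739
root = secant(lambda x: math.cos(x) - x, 0.5, 1.0)
```

Convergence is superlinear (order about 1.618), slightly slower than Newton per step, but each step needs only one new function evaluation and no derivative.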

If you want guaranteed convergence and can find an interval on which the function takes opposite signs at the endpoints, then use one of the anti-stalling variants of regula falsi, such as the Illinois method. If you want faster convergence, use a method based on quadratic or inverse quadratic interpolation, such as Muller's or Brent's method.
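To make the anti-stalling idea concrete, here is a sketch of the Illinois variant of regula falsi (a standard textbook formulation; the example function is my own). Plain regula falsi can stall because one endpoint of the bracket is retained forever; the Illinois trick halves the stored function value at an endpoint that is kept twice in a row:

```python
import math

def illinois(f, a, b, tol=1e-12, max_iter=100):
    """Illinois variant of regula falsi: maintains a sign-changing
    bracket [a, b] like bisection, but uses the false-position point,
    halving a retained endpoint's stored f-value to prevent stalling."""
    fa, fb = f(a), f(b)
    assert fa * fb < 0, "need a bracket: f(a) and f(b) of opposite signs"
    side = 0
    c = a
    for _ in range(max_iter):
        c = (a * fb - b * fa) / (fb - fa)   # false-position point
        fc = f(c)
        if abs(fc) < tol:
            break
        if fc * fb > 0:          # c on the same side as b: replace b
            b, fb = c, fc
            if side == -1:       # a was retained again: halve its f-value
                fa *= 0.5
            side = -1
        else:                    # c on the same side as a: replace a
            a, fa = c, fc
            if side == 1:        # b was retained again: halve its f-value
                fb *= 0.5
            side = 1
    return c

# Example: root of cos(x) - x, bracketed on [0, 1]
root = illinois(lambda x: math.cos(x) - x, 0.0, 1.0)
```

Brent's method combines the same bracketing guarantee with inverse quadratic interpolation, and in practice is the usual black-box choice (e.g. `scipy.optimize.brentq`).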

If derivatives are not so hard to compute, then use Newton's method. To encourage convergence from poor starting points, combine it with a line search.
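As a sketch of that last combination (my own minimal formulation, with a simple backtracking rule rather than any particular line-search criterion): take the full Newton step, and halve it until the residual $|f(x)|$ actually decreases, which guards against overshooting when the starting point is far from the root:

```python
import math

def newton_linesearch(f, df, x0, tol=1e-12, max_iter=50):
    """Newton's method damped by a backtracking line search:
    the Newton step is halved until |f| decreases."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        d = df(x)
        if d == 0:               # flat tangent; Newton step undefined
            break
        step = -fx / d           # full Newton step
        t = 1.0
        # backtrack: shrink the step until the residual shrinks
        while abs(f(x + t * step)) >= abs(fx) and t > 1e-10:
            t *= 0.5
        x += t * step
    return x

# Example: root of cos(x) - x from a deliberately poor start x0 = 3
root = newton_linesearch(lambda x: math.cos(x) - x,
                         lambda x: -math.sin(x) - 1.0, 3.0)
```

Near the root the full step is always accepted and the quadratic convergence of plain Newton is recovered; the damping only kicks in far from the solution.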
