[Math] When is Newton’s Method guaranteed to converge to a good solution (non-linear system)

numerical methods

My knowledge of Newton's Method is partial. I am trying to understand what guarantees this method can give about the solution of a system of non-linear equations.

Specifically, I have a set of non-linear equations that are easily twice differentiable. What additional conditions do I need to satisfy in order to guarantee that Newton's Method finds a good solution? How important is the starting point? If it is important, how can I guarantee that I find a good starting point?

Best Answer

The items below should help you look up further details of Newton's Method for systems of nonlinear equations.

Advantages:

  • Q-quadratically convergent from good starting guesses if the Jacobian $J(x_*)$ is nonsingular (and $J$ is Lipschitz continuous near $x_*$); the iteration and the error bound are spelled out after this list
  • Exact solution in one iteration for an affine $F$ (exact at each iteration in any affine component functions of $F$)
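
To make the first point concrete, here is the iteration and the standard local convergence statement, written informally (the constant $c$ depends on a bound for $\|J(x_*)^{-1}\|$ and on the Lipschitz constant of $J$ near $x_*$). Each step solves a linear system for the correction:

$$J(x_k)\,s_k = -F(x_k), \qquad x_{k+1} = x_k + s_k,$$

and, for $x_0$ close enough to $x_*$,

$$\|x_{k+1} - x_*\| \le c\,\|x_k - x_*\|^2.$$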

Disadvantages:

  • Not globally convergent for many problems
  • Requires the Jacobian $J(x_k)$ at each iteration, which may be expensive or difficult to obtain
  • Each iteration requires the solution of a system of linear equations that may be singular or ill-conditioned
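
To show what the iteration, the linear solve, and the norm-based stopping test look like in practice, here is a minimal sketch in NumPy; the function name `newton_system`, the tolerance, the iteration cap, and the toy system are my own illustrative choices, not part of the original answer or of any particular library.

```python
import numpy as np

def newton_system(F, J, x0, tol=1e-10, max_iter=50):
    """Basic Newton's method for F(x) = 0 with F: R^n -> R^n.

    F  : callable returning the residual vector F(x)
    J  : callable returning the Jacobian matrix J(x)
    x0 : starting guess (its quality is critical; see the notes below)
    """
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        Fx = F(x)
        # Stop once the residual is small in some vector norm.
        if np.linalg.norm(Fx) < tol:
            return x, k
        # Each iteration solves a linear system; this step fails or is
        # unreliable if J(x) is singular or badly ill-conditioned.
        step = np.linalg.solve(J(x), -Fx)
        x = x + step
    raise RuntimeError("Newton's method did not converge; try a better x0")

# Toy system: F(x, y) = (x^2 + y^2 - 1, x - y) has a root near (0.707, 0.707).
F = lambda v: np.array([v[0]**2 + v[1]**2 - 1.0, v[0] - v[1]])
J = lambda v: np.array([[2.0 * v[0], 2.0 * v[1]], [1.0, -1.0]])
root, iters = newton_system(F, J, np.array([1.0, 0.5]))
```

Starting from a point far from the root, the same code can easily diverge or hit a singular Jacobian, which is exactly the "not globally convergent" caveat above.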

Notes:

  • For Newton's method, you would choose a tolerance and use some vector norm (for example, testing whether $\|F(x_k)\|$ or the step length is small enough) to decide that the result is good enough.
  • If you choose a bad starting value, all bets are off.
  • You might also want to look into quasi-Newton methods.
  • For good starting values, you want to look into the Steepest Descent method, which is often used to find accurate starting approximations for Newton-based techniques (a sketch follows this list).
  • As an aside, you probably also want to look at and understand "Constrained" versus "Unconstrained" methods.
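
As a companion to the steepest-descent note above, here is a minimal sketch that minimizes $g(x) = \frac{1}{2}\|F(x)\|_2^2$ with a crude backtracking step-size rule to produce a rough starting approximation for Newton's method; the function name, the halving rule, and the fixed iteration count are assumptions made only for illustration.

```python
import numpy as np

def steepest_descent_start(F, J, x0, iters=20):
    """Crude steepest descent on g(x) = 0.5 * ||F(x)||^2, used only to
    produce a reasonable starting approximation for Newton's method."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        Fx = F(x)
        grad = J(x).T @ Fx              # gradient of g(x) is J(x)^T F(x)
        if np.linalg.norm(grad) == 0.0:
            break                       # stationary point of g
        g_old = 0.5 * np.dot(Fx, Fx)
        t = 1.0
        # Backtracking: halve the step until g actually decreases.
        while t > 1e-12 and 0.5 * np.linalg.norm(F(x - t * grad))**2 >= g_old:
            t *= 0.5
        x = x - t * grad
    return x

# Same toy system as in the Newton sketch, but starting far from the root.
F = lambda v: np.array([v[0]**2 + v[1]**2 - 1.0, v[0] - v[1]])
J = lambda v: np.array([[2.0 * v[0], 2.0 * v[1]], [1.0, -1.0]])
x_start = steepest_descent_start(F, J, np.array([3.0, -2.0]))
# x_start should now be a much better guess to hand to the Newton iteration above.
```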