[Math] Difference: Newton's method, Newton-Raphson method, Gauss–Newton method.

nonlinear optimization, nonlinear system

I would appreciate some clarification w.r.t. algorithms for solving nonlinear systems of equations.

1 – I don't understand the difference between Newton's method and the Newton-Raphson method. In [1], Newton's method is defined using the Hessian, but the Newton-Raphson method is not. However, I suspect they are actually the same thing, since I implemented both and the results were identical across iterations.

2 – From my understanding, the Gauss–Newton method is a particular case of Newton's method for minimizing a sum of squared residuals. Is this correct?

Thanks in advance!

Peter

[1] – www2.imm.dtu.dk/pubdb/views/edoc_download.php/3215/pdf

Best Answer

Newton and Newton-Raphson are just different names for the same method. The name Newton-Raphson is sometimes preferred for the scalar/univariate case.

Standard Newton for a vector-valued function $F$ (number of equations = number of variables) determines the update step $s = x_+ - x$ by solving the linear system $F'(x)s = -F(x)$.
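A minimal NumPy sketch of this update rule (the function names and the example system are my own illustration, not from [1]):

```python
import numpy as np

def newton(F, J, x0, tol=1e-10, max_iter=50):
    """Newton for F(x) = 0: solve F'(x) s = -F(x), then update x <- x + s."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        s = np.linalg.solve(J(x), -F(x))   # linear solve, no explicit inverse
        x = x + s
        if np.linalg.norm(s) < tol:
            break
    return x

# Square system (2 equations, 2 unknowns):
# circle of radius 2 intersected with the line x0 = x1.
F = lambda x: np.array([x[0]**2 + x[1]**2 - 4.0, x[0] - x[1]])
J = lambda x: np.array([[2.0*x[0], 2.0*x[1]], [1.0, -1.0]])
root = newton(F, J, x0=[1.0, 2.0])   # converges to (sqrt(2), sqrt(2))
```

Note that in practice one solves the linear system rather than forming $F'(x)^{-1}$ explicitly.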

Gauß-Newton applies to overdetermined systems (number of equations > number of variables) and chooses the update step $s$ to minimize $\|F'(x)s+F(x)\|_2$. For a square system this reduces to standard Newton.
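A sketch of Gauß-Newton along the same lines, where each step is a linear least-squares problem (the exponential-fit example and starting point are my own choices for illustration):

```python
import numpy as np

def gauss_newton(r, J, x0, tol=1e-10, max_iter=50):
    """Gauss-Newton: each step s minimizes ||J(x) s + r(x)||_2
    (a linear least-squares problem, solved here via lstsq)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        s, *_ = np.linalg.lstsq(J(x), -r(x), rcond=None)
        x = x + s
        if np.linalg.norm(s) < tol:
            break
    return x

# Overdetermined example: 3 residuals, 2 unknowns.
# Fit y = a*exp(b*t) to data generated with a = 2, b = 0.5
# (a zero-residual problem, so Gauss-Newton converges fast near the solution).
t = np.array([0.0, 1.0, 2.0])
y = 2.0 * np.exp(0.5 * t)
r = lambda x: x[0] * np.exp(x[1] * t) - y
J = lambda x: np.column_stack([np.exp(x[1] * t),
                               x[0] * t * np.exp(x[1] * t)])
fit = gauss_newton(r, J, x0=[1.8, 0.4])   # converges to (2.0, 0.5)
```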

One could also solve an overdetermined system by minimizing the squared error $\|F(x)\|_2^2$. Determining the zeros of its gradient by Newton's method, however, would require second derivatives of $F$, which Gauß-Newton does not need.
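Spelling this out (my notation, writing $f(x) = \tfrac12\|F(x)\|_2^2$ and $J = F'(x)$):

$$\nabla f(x) = J^{\mathsf T} F(x), \qquad \nabla^2 f(x) = J^{\mathsf T} J + \sum_i F_i(x)\,\nabla^2 F_i(x).$$

Newton on $\nabla f = 0$ needs the full Hessian, including the second-derivative sum. The Gauß-Newton step from the least-squares subproblem satisfies the normal equations $J^{\mathsf T} J\, s = -J^{\mathsf T} F(x)$, i.e. it keeps only the first term. The dropped term is small when the residuals $F_i(x)$ are small, which is why Gauß-Newton works well on near-zero-residual problems.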
