[Math] Newton optimization algorithm with non-positive definite Hessian

nonlinear optimization

In the Newton optimization algorithm for finding a local minimum $x^*$ of a nonlinear function $f(x)$, with iteration sequence $x_0 \rightarrow x_1 \rightarrow x_2 \rightarrow \dots \rightarrow x^*$, every Hessian $\nabla^2 f(x_k)$ should be positive definite; otherwise the search direction might not be a descent direction. The standard remedy for this problem is to use a Hessian modification technique.

I am looking for a simple function that has a local minimum at $x^*$ but for which $\nabla^2 f(x_k)$ fails to be positive definite at some iterate, so that the plain Newton algorithm fails. After applying one of the Hessian modification techniques, the Newton algorithm should then converge to $x^*$.
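(For context, the reason positive definiteness matters: the Newton step solves $\nabla^2 f(x_k)\, p_k = -\nabla f(x_k)$, so

$$\nabla f(x_k)^T p_k = -\nabla f(x_k)^T \left(\nabla^2 f(x_k)\right)^{-1} \nabla f(x_k),$$

and this quantity is guaranteed to be negative for every nonzero gradient, making $p_k$ a descent direction, precisely when $\nabla^2 f(x_k)$, and hence its inverse, is positive definite.)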

Best Answer

I found a good example, the Beale function:

$f(x_1,x_2)=(1.5-x_1+x_1x_2)^2 + (2.25-x_1+x_1x_2^2)^2 + (2.625-x_1+x_1x_2^3)^2$

The damped Newton method with starting point $x_0 = (4,1)^T$ fails. Here is why:

$\nabla f(x_0)=\left( \begin{matrix} 0 \\ 111 \end{matrix} \right)$, $\nabla^2 f(x_0)= \left( \begin{matrix} 0 & 27.75 \\ 27.75 & 610 \\ \end{matrix} \right)$, and the search direction is $p_0=-\left(\nabla^2 f(x_0)\right)^{-1}\nabla f(x_0)=(-4,0)^T$

The search direction points the wrong way! This is because the Hessian matrix is not positive definite, and thus $p_0$ is not a descent direction (indeed $\nabla f(x_0)^T p_0 = 0$, so the objective does not decrease along $p_0$ even to first order).

[Figure: contour plot of $f$ showing the Newton step $p_0$ at $x_0=(4,1)^T$ pointing away from the minimum]
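For anyone who wants to reproduce these numbers, here is a minimal sketch in Python/NumPy; the finite-difference derivatives and step sizes are my own ad-hoc choices, not part of the original question:

```python
import numpy as np

def f(x):
    x1, x2 = x
    return ((1.5 - x1 + x1 * x2) ** 2
            + (2.25 - x1 + x1 * x2 ** 2) ** 2
            + (2.625 - x1 + x1 * x2 ** 3) ** 2)

def grad(x, h=1e-6):
    # central-difference gradient
    g = np.zeros(2)
    for i in range(2):
        e = np.zeros(2)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

def hess(x, h=1e-4):
    # central-difference Hessian
    H = np.zeros((2, 2))
    for i in range(2):
        for j in range(2):
            ei = np.zeros(2); ei[i] = h
            ej = np.zeros(2); ej[j] = h
            H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                       - f(x - ei + ej) + f(x - ei - ej)) / (4 * h * h)
    return H

x0 = np.array([4.0, 1.0])
g, H = grad(x0), hess(x0)
print(H)                      # ~[[0, 27.75], [27.75, 610]]
print(np.linalg.eigvalsh(H))  # one negative eigenvalue: H is indefinite
p0 = -np.linalg.solve(H, g)
print(p0, g @ p0)             # ~[-4, 0]; g^T p0 = 0, so not a descent direction
```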

After using a Hessian modification technique such as "adding a multiple of the identity", the algorithm works fine: the search direction points in the correct direction, and the method eventually finds the local minimum $(3,0.5)^T$. A sketch of this modified method follows the figure below.

[Figure: contour plot of $f$ showing the modified Newton iterates converging to $(3,0.5)^T$]
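Here is a minimal sketch of such a modified, damped Newton iteration, again assuming NumPy. The modification rule (doubling $\tau$ until a Cholesky factorization of $\nabla^2 f + \tau I$ succeeds) and the backtracking line-search parameters are illustrative choices of mine, not part of the original answer:

```python
import numpy as np

def residuals(x):
    x1, x2 = x
    return np.array([1.5 - x1 + x1 * x2,
                     2.25 - x1 + x1 * x2 ** 2,
                     2.625 - x1 + x1 * x2 ** 3])

def jac(x):
    # Jacobian of the residuals
    x1, x2 = x
    return np.array([[x2 - 1.0,      x1],
                     [x2 ** 2 - 1.0, 2 * x1 * x2],
                     [x2 ** 3 - 1.0, 3 * x1 * x2 ** 2]])

def grad(x):
    return 2 * jac(x).T @ residuals(x)

def hess(x):
    x1, x2 = x
    # second-derivative matrices of the three residuals
    d2 = [np.array([[0.0, 1.0], [1.0, 0.0]]),
          np.array([[0.0, 2 * x2], [2 * x2, 2 * x1]]),
          np.array([[0.0, 3 * x2 ** 2], [3 * x2 ** 2, 6 * x1 * x2]])]
    J = jac(x)
    return 2 * (J.T @ J + sum(r * M for r, M in zip(residuals(x), d2)))

def modified_newton(x, tol=1e-8, max_iter=100):
    for _ in range(max_iter):
        g, H = grad(x), hess(x)
        if np.linalg.norm(g) < tol:
            break
        # Hessian modification: add tau*I, doubling tau until
        # H + tau*I admits a Cholesky factorization (i.e. is pos. def.)
        tau = 0.0
        while True:
            try:
                np.linalg.cholesky(H + tau * np.eye(2))
                break
            except np.linalg.LinAlgError:
                tau = max(2 * tau, 1e-3)
        p = np.linalg.solve(H + tau * np.eye(2), -g)
        # damping: backtracking line search on the step length
        fx, a = (residuals(x) ** 2).sum(), 1.0
        while (residuals(x + a * p) ** 2).sum() > fx + 1e-4 * a * (g @ p):
            a *= 0.5
        x = x + a * p
    return x

print(modified_newton(np.array([4.0, 1.0])))  # -> approx [3.0, 0.5]
```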
