Saddle point while the Hessian is positive semidefinite

convex optimization, linear algebra, multivariable calculus, nonlinear optimization, optimization

Suppose a function $f: \mathbb{R}^n \to \mathbb{R}$ has continuous second derivatives and has a saddle point at some point $x$. Is it possible for the Hessian to be positive semidefinite at that point?

By looking at the eigendirections corresponding to eigenvalues of opposite signs, we can show that a critical point where the Hessian is indefinite must be a saddle point, and that led me to wonder whether the converse is true.
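For reference, here is a sketch of the standard second-order argument behind that implication, written out as a Taylor expansion along an eigendirection:

```latex
% Assume x is a critical point (\nabla f(x) = 0) and let v be a unit
% eigenvector of Hf(x) with eigenvalue \lambda. Taylor's theorem along
% the line t \mapsto x + tv gives:
\[
  f(x + tv) = f(x) + \tfrac{t^2}{2}\, v^{\top} Hf(x)\, v + o(t^2)
            = f(x) + \tfrac{\lambda}{2}\, t^2 + o(t^2).
\]
% If Hf(x) has eigenvalues \lambda_1 > 0 > \lambda_2, then f increases
% along the first eigendirection and decreases along the second, so x
% is neither a local minimum nor a local maximum: a saddle point.
```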

My guess is that the Hessian of a function can be positive semidefinite at a point that is nevertheless a saddle point. I think such a function can be constructed so that its second derivative is zero along one direction, in which the point is a maximum, while its second derivative is positive in another direction.

However, I cannot come up with a concrete example. Is this statement even true, and are there any concrete examples?

Best Answer

It is possible; consider, for example, $f(x,y) = x^2 - y^4$. The origin is a critical point, and it is a saddle point: $f(x,0) = x^2 > 0$ while $f(0,y) = -y^4 < 0$ for all nonzero $x$ and $y$. The Hessian there, $$Hf(0,0) = \begin{bmatrix}2 & 0 \\ 0 & 0 \end{bmatrix},$$ has eigenvalues $2$ and $0$ and is therefore positive semidefinite.
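For completeness, here is a sketch of the routine computation behind those claims:

```latex
% Gradient and Hessian of f(x,y) = x^2 - y^4.
% The gradient vanishes at (0,0), so the origin is a critical point:
\[
  \nabla f(x,y) = \begin{pmatrix} 2x \\ -4y^3 \end{pmatrix},
  \qquad
  Hf(x,y) = \begin{bmatrix} 2 & 0 \\ 0 & -12y^2 \end{bmatrix}.
\]
% At the origin the Hessian is diag(2, 0), with eigenvalues 2 and 0,
% hence positive semidefinite but not positive definite.
```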

Essentially, the second derivative gives us no information about the behavior of the $y^4$ term near $0$, since its second derivative vanishes there.
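One way to make that remark precise is to split $f$ into its second-order Taylor polynomial at the origin plus the remainder:

```latex
% f decomposes exactly into the quadratic form of its Hessian at the
% origin plus a quartic remainder:
\[
  f(x,y)
  = \underbrace{\tfrac{1}{2}
      \begin{pmatrix} x & y \end{pmatrix}
      Hf(0,0)
      \begin{pmatrix} x \\ y \end{pmatrix}}_{=\,x^2}
    \;-\; y^4 .
\]
% The quadratic form vanishes along the entire y-axis, so the sign of
% f in that direction is decided entirely by the higher-order term
% -y^4, which the Hessian cannot detect.
```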