[Math] Zero Eigenvalues for Hessian Matrix

multivariable-calculus, optimization

I need to show that along any line passing through the origin, $$F(x,y) = 3x^4 -4x^2y + y^2$$ has a minimum at $(0,0)$ but that without the restriction, there is no local minimum at $(0,0)$.

The first part I can do easily, but for the second part I end up with a critical point at $(0,0)$, where the Hessian matrix is
$$\left(\begin{matrix}
0& 0\\
0& 2\\
\end{matrix}\right)
$$
and thus has a zero determinant (meaning it is a degenerate critical point) and both eigenvalues equal to zero. Does this imply there is no minimum at $(0,0)$ or do I have to do something else? Thanks in advance.

Best Answer

I think you should take another look at your Hessian. The Hessian of a $C^2$ function is always symmetric, and here it is in fact diagonal, so its eigenvalues are simply the diagonal entries: $0$ and $2$, not both zero. The matrix is positive semidefinite but singular (zero determinant).
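Spelled out, the first partials are
$$F_x = 12x^3 - 8xy, \qquad F_y = -4x^2 + 2y,$$
both of which vanish at the origin, and the second partials at the origin are
$$F_{xx}(0,0) = 0, \quad F_{xy}(0,0) = 0, \quad F_{yy}(0,0) = 2,$$
which is exactly the matrix you wrote down.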

Beyond that, when the Hessian is degenerate (or zero), the second-derivative test is inconclusive. Think of it the same way as a one-variable function whose first and second derivatives both vanish at a point: by themselves, those facts tell you essentially nothing about whether the point is a minimum, a maximum, or neither.
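For instance,
$$f(x) = x^4, \qquad f(x) = -x^4, \qquad f(x) = x^3$$
all satisfy $f'(0) = f''(0) = 0$, yet the first has a minimum at $0$, the second a maximum, and the third neither.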

So, what is to be done? The restriction to lines only tells you that, along each line through the origin, the function does not dip below its value at $0$ near the origin; it says nothing about what happens along curves. I think the only thing left to you is to be clever. Notice that the function factors: $$F(x,y) = 3x^4 - 4x^2y + y^2 = (y - x^2)(y - 3x^2).$$ Taking $y = x^2$ gives $$F(x, x^2) = 3x^4 - 4x^4 + x^4 = 0$$ for every $x$, so $F$ is zero along a whole parabola through the origin, and taking $y = 2x^2$, which lies between the two parabolas, gives $$F(x, 2x^2) = 3x^4 - 8x^4 + 4x^4 = -x^4 < 0 \quad\text{for } x \neq 0.$$ Every neighborhood of the origin therefore contains points where $F < F(0,0) = 0$, so $(0,0)$ is not a local minimum.
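If you want to double-check these computations mechanically, here is a small SymPy sketch (the use of SymPy and the symbol names are just my choice, not part of the problem):

```python
import sympy as sp

x, y, m = sp.symbols('x y m', real=True)
F = 3*x**4 - 4*x**2*y + y**2

# Hessian at the origin: should be [[0, 0], [0, 2]], with eigenvalues 0 and 2.
H0 = sp.hessian(F, (x, y)).subs({x: 0, y: 0})
print(H0, H0.eigenvals())

# Restriction to a line y = m*x: 3x^4 - 4m*x^3 + m^2*x^2,
# which has a local minimum at x = 0 for every m.
print(sp.expand(F.subs(y, m*x)))

# Along the parabola y = x^2 the function is identically 0,
# and along y = 2x^2 it equals -x^4 < 0 for x != 0.
print(sp.simplify(F.subs(y, x**2)), sp.simplify(F.subs(y, 2*x**2)))
```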
