Minimum of $x_1^2+x_2^4$

calculus, hessian-matrix, maxima-minima, multivariable-calculus, optimization

I am asked to find a minimum of $f(x_1, x_2) = x_1^2+x_2^4$ applying the optimality conditions.

I am stuck: what I have found is not conclusive. Here is what I have done:

Testing the First-Order Necessary Conditions

I computed the gradient:

$\nabla f = \left(\frac{\partial f}{\partial x_1}, \frac{\partial f}{\partial x_2}\right) = \left(2x_1,\ 4x_2^3\right)$

Setting $\nabla f = 0$ yields the unique stationary point $x^* = (0, 0)$.

Now,

Testing the Second-Order Necessary Conditions

I obtained the Hessian matrix
$$\nabla^2 f = \begin{pmatrix} 2 & 0 \\ 0 & 12x_2^2 \end{pmatrix}.$$
The second eigenvalue is not constant, and at $x^* = (0,0)$ it equals $0$. So the Hessian there is positive semi-definite (as the second-order necessary condition requires for a minimiser), but it is not positive definite, which would have guaranteed a minimum. I do not know how to proceed from here.
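The computation above can be double-checked symbolically. A minimal sketch using sympy (assuming it is installed; the variable names are my own):

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2')
f = x1**2 + x2**4

# Gradient and Hessian of f
grad = [sp.diff(f, v) for v in (x1, x2)]
H = sp.hessian(f, (x1, x2))

# Evaluate both at the stationary point x* = (0, 0)
grad0 = [g.subs({x1: 0, x2: 0}) for g in grad]   # [0, 0]
H0 = H.subs({x1: 0, x2: 0})                      # Matrix([[2, 0], [0, 0]])
eigs = H0.eigenvals()                            # {2: 1, 0: 1}
```

The eigenvalues $\{2, 0\}$ confirm that the Hessian at the origin is positive semi-definite but not positive definite, so the second-order sufficient test is indeed inconclusive here.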

Thank you

Best Answer

Well, the minimum is $0$, attained at $(0,0)$: since $x_1^2 \geq 0$ and $x_2^4 \geq 0$, we have $f(x_1, x_2) \geq 0$ everywhere, with equality exactly when $x_1 = x_2 = 0$. You don't need any fancy stuff like the Hessian.
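This non-negativity argument is easy to spot-check numerically; here is a quick sketch (the grid range $[-2, 2]$ is an arbitrary choice for illustration):

```python
import itertools

def f(x1, x2):
    return x1**2 + x2**4

# f is a sum of non-negative terms, so f(x1, x2) >= 0 everywhere,
# and f(0, 0) = 0, so the origin attains the global minimum.
grid = [a / 10 for a in range(-20, 21)]   # sample points in [-2, 2]
smallest = min(f(a, b) for a, b in itertools.product(grid, repeat=2))
print(smallest)   # 0.0, attained at (0, 0)
```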