Convex Optimization – Positive Semidefinite Hessian Matrix and Local Minimum

Tags: convex optimization, linear algebra, optimization

Suppose we have a function $F(x)$ defined as
\begin{equation}
F(x) = \frac{1}{2}x^TAx + b^Tx +c,
\end{equation}
where
\begin{equation}
A = \begin{bmatrix}
4 & 2 \\
2 & 1 \\
\end{bmatrix} , \quad b = \begin{bmatrix}
-4 \\
-2 \\
\end{bmatrix},
\end{equation}
and $c$ is a constant scalar.
The question is: is the stationary point a local minimum?

I find $\nabla F(x) = Ax + b$ (since $A$ is symmetric). So $\nabla F(x) = 0$ implies the stationary points form a line, $2x_1 + x_2 = 2$,
where $x = [x_1, x_2]^T$.
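As a quick numerical sanity check (a sketch of my own, using NumPy; the helper name `grad_F` is just for illustration), every point on that line should make the gradient vanish:

```python
import numpy as np

# Problem data from the question.
A = np.array([[4.0, 2.0],
              [2.0, 1.0]])
b = np.array([-4.0, -2.0])

def grad_F(x):
    """Gradient of F(x) = 0.5 x^T A x + b^T x + c, which is A x + b for symmetric A."""
    return A @ x + b

# Sample a few points on the line 2*x1 + x2 = 2; each should be stationary.
for x1 in [-1.0, 0.0, 0.5, 3.0]:
    x = np.array([x1, 2.0 - 2.0 * x1])
    print(x, grad_F(x))   # gradient is (numerically) zero at every point on the line
```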

The Hessian matrix $H(x)$ is
\begin{equation}
H(x) = \nabla^2 F(x) = A,
\end{equation}
and its eigenvalues are $0$ and $5$, so $H(x)$ is positive semidefinite. How can I tell whether the stationary points are local minima in this case? In general, how do I decide whether a stationary point is a local minimum, a local maximum, or a saddle point when
$H(x)$ is positive semidefinite or negative semidefinite?
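For reference, the eigenvalues can be confirmed numerically (a small NumPy sketch, not part of the original question):

```python
import numpy as np

A = np.array([[4.0, 2.0],
              [2.0, 1.0]])

# A is symmetric, so eigh is the appropriate routine.
eigenvalues, eigenvectors = np.linalg.eigh(A)
print(eigenvalues)    # approximately [0., 5.]
print(eigenvectors)   # columns are the corresponding unit eigenvectors
```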

Thanks!

Best Answer

Since the Hessian is positive semidefinite for all $x$, the function is convex (though not strictly convex). So every stationary point is a minimum, and in fact a global minimum (by convexity); here the whole line $2x_1 + x_2 = 2$ consists of global minimizers with the same value of $F$. Think of it this way: the function is increasing in the direction of the eigenvector with eigenvalue $5$, and flat in the direction of the eigenvector with eigenvalue $0$. If the Hessian were negative semidefinite everywhere, you would have a global maximum by the same argument. Note that this reasoning relies on the Hessian being semidefinite everywhere; if it is only semidefinite at the stationary point itself, the second-order test is inconclusive and higher-order terms decide (compare $f(x) = x^3$ and $f(x) = x^4$ at $x = 0$).
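You can see the "flat direction / increasing direction" picture numerically. The following sketch (my own illustration; `x_star`, `v0`, `v5` are hypothetical names, and $c$ is set to $0$ since it only shifts the value) evaluates $F$ along the two eigenvector directions starting from a point on the stationary line:

```python
import numpy as np

A = np.array([[4.0, 2.0],
              [2.0, 1.0]])
b = np.array([-4.0, -2.0])
c = 0.0  # the constant only shifts F, it does not move the minimizers

def F(x):
    return 0.5 * x @ A @ x + b @ x + c

x_star = np.array([1.0, 0.0])             # a point on the stationary line 2*x1 + x2 = 2
v0 = np.array([1.0, -2.0]) / np.sqrt(5)   # eigenvector with eigenvalue 0 (flat direction)
v5 = np.array([2.0, 1.0]) / np.sqrt(5)    # eigenvector with eigenvalue 5 (increasing direction)

for t in [0.0, 0.5, 1.0, 2.0]:
    print(t, F(x_star + t * v0), F(x_star + t * v5))
# F stays constant along v0 and grows like (5/2) * t^2 along v5.
```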