[Math] Proof that the Hessian matrix of the Lagrangian function cannot be positive definite

matrices, optimization

This is a homework problem I am having a hard time understanding. Any tips to point me in the right direction would be appreciated.

Given functions $f: \mathbb{R}^n \to \mathbb{R}$ and $\boldsymbol{g}: \mathbb{R}^n \to \mathbb{R}^m$, you can minimize $f(\boldsymbol{x})$ subject to the constraint $\boldsymbol{g}(\boldsymbol{x}) = \boldsymbol{0}$ by setting the gradient of the Lagrangian $\mathcal{L}(\boldsymbol{x}, \boldsymbol{\lambda}) = f(\boldsymbol{x}) + \boldsymbol{\lambda}^T \boldsymbol{g}(\boldsymbol{x})$ to zero:
$$
\nabla \mathcal{L}(\boldsymbol{x}, \boldsymbol{\lambda}) = \begin{bmatrix} \nabla f(\boldsymbol{x}) + \boldsymbol{J}_g^T(\boldsymbol{x})\boldsymbol{\lambda} \\ \boldsymbol{g}(\boldsymbol{x})
\end{bmatrix} = \boldsymbol{0}
$$
where $\boldsymbol{J}_g(\boldsymbol{x})$ is the Jacobian matrix of $\boldsymbol{g}(\boldsymbol{x})$.
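
To make the setup concrete, here is a small example of my own (not part of the original problem): minimize $f(\boldsymbol{x}) = x_1^2 + x_2^2$ subject to $g(\boldsymbol{x}) = x_1 + x_2 - 1 = 0$. The stationarity condition then reads
$$
\nabla \mathcal{L}(\boldsymbol{x}, \lambda) = \begin{bmatrix} 2x_1 + \lambda \\ 2x_2 + \lambda \\ x_1 + x_2 - 1 \end{bmatrix} = \boldsymbol{0}
\quad\Longrightarrow\quad
x_1 = x_2 = \tfrac{1}{2}, \quad \lambda = -1.
$$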

The Hessian of the Lagrangian can be computed as follows.
$$
\boldsymbol{H}_{\mathcal{L}} (\boldsymbol{x}, \boldsymbol{\lambda}) =
\begin{bmatrix} \boldsymbol{B}(\boldsymbol{x}, \boldsymbol{\lambda}) & \boldsymbol{J}_g^T(\boldsymbol{x}) \\
\boldsymbol{J}_g(\boldsymbol{x}) & \boldsymbol{0}
\end{bmatrix}
$$
where $\boldsymbol{B}(\boldsymbol{x}, \boldsymbol{\lambda}) = \boldsymbol{H}_f(\boldsymbol{x}) + \sum_{i=1}^m \lambda_i \boldsymbol{H}_{g_i}(\boldsymbol{x})$ and $\boldsymbol{H}_{g_i}$ denotes the Hessian of the $i$-th component of $\boldsymbol{g}$.
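
Continuing the small example from above, $\boldsymbol{H}_f(\boldsymbol{x}) = 2\boldsymbol{I}$, $\boldsymbol{H}_g(\boldsymbol{x}) = \boldsymbol{0}$ and $\boldsymbol{J}_g(\boldsymbol{x}) = \begin{bmatrix} 1 & 1 \end{bmatrix}$, so
$$
\boldsymbol{H}_{\mathcal{L}} =
\begin{bmatrix}
2 & 0 & 1 \\
0 & 2 & 1 \\
1 & 1 & 0
\end{bmatrix},
$$
which has eigenvalues $1 + \sqrt{3}$, $2$ and $1 - \sqrt{3}$, i.e. it is nonsingular but not positive definite.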

How can I prove that $\boldsymbol{H}_{\mathcal{L}} (\boldsymbol{x}, \boldsymbol{\lambda})$ cannot be positive definite?

Best Answer

For any vector $(a, b)$ with $a \in \mathbb{R}^n$ and $b \in \mathbb{R}^m$ you have $$ \begin{bmatrix} a^T & b^T \end{bmatrix} \begin{bmatrix} B & J^T \\ J & 0 \end{bmatrix} \begin{bmatrix} a \\ b \end{bmatrix} = a^T B a + 2 b^T J a. $$ Now choose $a = 0$ and any $b \neq 0$. The quadratic form above is zero for this nonzero vector, so the matrix cannot be positive definite (positive definiteness requires $v^T H v > 0$ for every nonzero $v$).
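
As a quick numerical sanity check of this argument (using made-up data for $B$ and $J$, the same small example as in the question), the quadratic form is zero on every vector of the form $(0, b)$:

```python
import numpy as np

# Made-up data: B symmetric positive definite, J a 1x2 Jacobian with full row rank.
B = np.array([[2.0, 0.0],
              [0.0, 2.0]])
J = np.array([[1.0, 1.0]])

# Assemble the bordered (KKT) matrix H = [[B, J^T], [J, 0]].
H = np.block([[B, J.T],
              [J, np.zeros((1, 1))]])

# Take a = 0 and any nonzero b: the quadratic form vanishes,
# so H cannot be positive definite.
v = np.array([0.0, 0.0, 1.0])
print(v @ H @ v)  # prints 0.0
```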

It's more interesting to try to discover under which conditions it is nonsingular and show that it is always indefinite.
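
To illustrate the last point, here is a small sketch (same made-up matrix as above, where $B$ is positive definite and $J$ has full row rank) showing that such a bordered matrix is nonsingular yet indefinite:

```python
import numpy as np

B = np.array([[2.0, 0.0],
              [0.0, 2.0]])
J = np.array([[1.0, 1.0]])
H = np.block([[B, J.T],
              [J, np.zeros((1, 1))]])

# H is symmetric, so eigvalsh applies; the eigenvalues have mixed signs.
print(np.linalg.eigvalsh(H))  # approx [-0.73, 2.0, 2.73]
print(np.linalg.det(H))       # approx -4.0 -> nonsingular, but indefinite
```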