Showing a self-adjoint matrix is positive semidefinite iff all of its eigenvalues are $\geq 0$

linear algebra

I want to show:

A self-adjoint matrix is positive semidefinite iff all of its eigenvalues are $\geq 0$

To me, this isn't obvious, so I'm having a hard time getting started.


$(\implies)$

The definition of self-adjoint is:

$$
\langle A \pmb y, \pmb x \rangle = \langle \pmb y, A \pmb x \rangle
$$

Assuming that the inner product is the standard one, $\langle \pmb y, \pmb x \rangle = \pmb y' \pmb x$ (i.e., characterized by $I$), we have

$$
\begin{align*}
(A\pmb y)' \pmb x &= \pmb y' A \pmb x\\
\pmb y' A' \pmb x &= \pmb y' A \pmb x
\end{align*}
$$

for all $\pmb x, \pmb y$, i.e. $A' = A$.

But I'm not sure how to show that this means the eigenvalues are $\geq 0$.

$(\impliedby)$

I'm not sure where to go after assuming all the eigenvalues are $\geq 0$

Maybe understanding where to go in the $(\implies)$ part will make it more obvious what to do in this direction.

Best Answer

So far you have only restated the self-adjointness hypothesis.

If the base field is $\Bbb C$, self-adjointness implies that all the eigenvalues are real.

A self-adjoint matrix is positive semidefinite, by definition, if $\langle x,Ax\rangle\ge 0$ for every vector $x$.

One direction is immediate (by contraposition): if $A$ has a negative eigenvalue $\lambda$ with eigenvector $v$, then $$\langle v,Av\rangle=\lambda\langle v,v\rangle<0\,,$$ so $A$ is not positive semidefinite. For the other direction you need to apply the spectral theorem, which provides an orthonormal basis of eigenvectors.
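To spell out that second direction: if $\{v_1,\dots,v_n\}$ is an orthonormal eigenbasis with $Av_i=\lambda_i v_i$ and every $\lambda_i\ge 0$, write any vector as $x=\sum_i c_i v_i$; then
$$\langle x,Ax\rangle=\Big\langle \sum_i c_i v_i,\ \sum_j \lambda_j c_j v_j\Big\rangle=\sum_i \lambda_i |c_i|^2\ge 0\,.$$
As a quick numerical sanity check of both directions (just a sketch using NumPy; the particular matrix construction below is an arbitrary choice, not part of the proof), one can build a symmetric matrix with prescribed eigenvalues and test the quadratic form:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a symmetric (self-adjoint over R) matrix with known eigenvalues:
# A = Q diag(lambda) Q', where Q has orthonormal columns (an eigenbasis).
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))
A = Q @ np.diag([0.0, 0.5, 1.2, 3.0]) @ Q.T   # all eigenvalues >= 0

# Nonnegative eigenvalues => positive semidefinite: x' A x >= 0.
for _ in range(1000):
    x = rng.standard_normal(4)
    assert x @ A @ x >= -1e-10                # tolerance for floating point

# Conversely, one negative eigenvalue breaks semidefiniteness:
B = Q @ np.diag([-1.0, 0.5, 1.2, 3.0]) @ Q.T
v = Q[:, 0]                                   # eigenvector for eigenvalue -1
print(v @ B @ v)                              # negative, approximately -1
```

The check mirrors the proof exactly: the quadratic form evaluated at an eigenvector returns the corresponding eigenvalue times $\|v\|^2$.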