Deriving conditions for positive semidefiniteness via the Schur complement

linear-algebra, matrices, positive-semidefinite, statistics

I am currently struggling to derive the following conditions (marked in yellow) for the positive semidefiniteness of the matrix $G$ in (2.1), from a paper I am reading:

[Image from the paper: equations (2.1) and (2.2), with the positive semidefiniteness conditions highlighted in yellow.]

Note that $\Sigma$ is assumed to be positive semidefinite.

I understand how to get from (2.1) to (2.2) by calculating the Schur complement. We can also see that $A$ is the Schur complement of the matrix

$$\begin{bmatrix}
\Sigma & diag(s) \\
diag(s) & 2diag(s)
\end{bmatrix}$$

by just applying the formula for the Schur complement "backwards". However, I don't understand how to derive the last two conditions marked in yellow from there, i.e. how to show that $A \succeq 0$.
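
For reference, the generalized (non-strict) Schur complement criterion I am trying to apply is the following, stated with generic blocks $P, Q, R$ (these symbols are not from the paper; $R^{\dagger}$ denotes the Moore–Penrose pseudoinverse):

$$\begin{bmatrix} P & Q \\ Q^{\top} & R \end{bmatrix} \succeq 0 \;\Longleftrightarrow\; R \succeq 0, \quad (I - RR^{\dagger})Q^{\top} = 0, \quad P - QR^{\dagger}Q^{\top} \succeq 0.$$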

Best Answer

It's a direct application of the (non-strict, generalized) Schur complement, but this time taken with respect to the lower-right diagonal block $2\,diag(s)$ instead of $\Sigma$.

With $S = diag(s)$,
$$\begin{bmatrix} \Sigma & S\\ S & 2S \end{bmatrix} \succeq 0 \;\Longleftrightarrow\; \Sigma - S\left(2S\right)^{\dagger}S \succeq 0, \quad S \succeq 0, \quad (I - SS^{\dagger})S = 0,$$
where $(\cdot)^{\dagger}$ denotes the Moore–Penrose pseudoinverse.

Simplify by exploiting the simple diagonal structure and you are done: since $SS^{\dagger}S = S$, we have $S\left(2S\right)^{\dagger}S = \tfrac{1}{2}S$ and $(I - SS^{\dagger})S = 0$, so the conditions reduce to $\Sigma - \tfrac{1}{2}S \succeq 0$ and $S \succeq 0$, i.e. $2\Sigma \succeq diag(s)$ and $s \ge 0$.
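
As a quick numerical sanity check of the simplified equivalence (not part of the original answer; the helper `min_eig`, the tolerance, and the random test data are illustrative choices, assuming NumPy), one can verify on random instances that $\begin{bmatrix} \Sigma & S\\ S & 2S \end{bmatrix} \succeq 0$ exactly when $s \ge 0$ and $2\Sigma - diag(s) \succeq 0$:

```python
import numpy as np

def min_eig(M):
    """Smallest eigenvalue of a symmetric matrix."""
    return np.min(np.linalg.eigvalsh(M))

rng = np.random.default_rng(0)
n, tol, mismatches = 4, 1e-8, 0

for _ in range(1000):
    B = rng.normal(size=(n, n))
    Sigma = B @ B.T                     # a random positive semidefinite Sigma
    s = rng.uniform(-1.0, 3.0, size=n)  # entries may be negative, so infeasible cases occur
    S = np.diag(s)

    # The block matrix from the answer, tested for PSD directly.
    M = np.block([[Sigma, S], [S, 2 * S]])
    block_psd = min_eig(M) >= -tol

    # Simplified conditions: s >= 0 and 2*Sigma - diag(s) PSD.
    simplified = bool(np.all(s >= -tol) and min_eig(2 * Sigma - S) >= -tol)

    mismatches += (block_psd != simplified)

print(f"Mismatches over 1000 random trials: {mismatches}")  # expected: 0
```

The range condition $(I - SS^{\dagger})S = 0$ is omitted from the check because it holds identically.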
