Singular values and trace norm of a submatrix

Tags: linear-algebra, matrices, nuclear-norm, singular-values

Let $A$ be an $m \times n$ matrix with $m \leq n$, and let $B$ be the matrix obtained from $A$ by removing both its first row and its first column. Let us denote the singular values of $A$ by:
\begin{equation*}
\sigma_1 \geq \ldots \geq \sigma_m
\end{equation*}

and the singular values of $B$ by:
\begin{equation*}
\lambda_1 \geq \ldots \geq \lambda_{m-1} .
\end{equation*}

My question is: which interlacing inequalities apply here? According to some books, we can only say:
\begin{equation}
\sigma_i \leq \lambda_{i-2} \quad \text{for } i \geq 3,
\end{equation}

while other sources seem to make the stronger claim:
\begin{equation}
\sigma_m \leq \lambda_{m-1} \leq \sigma_{m-1} \leq \ldots \leq \sigma_2 \leq \lambda_1 \leq \sigma_1.
\end{equation}

So which one is it? And if the latter does not hold, can we at least prove the following?
\begin{equation}
\sum_{i=2}^m \sigma_i \leq \sum_{i=1}^{m-1} \lambda_i
\end{equation}

Best Answer

The inequality $\sigma_1(A)\ge\sigma_1(B)\ge\sigma_2(A)\ge\sigma_2(B)\ge\cdots\ge\sigma_{m-1}(A)\ge\sigma_{m-1}(B)\ge\sigma_m(A)$ holds if $A$ is a (square) positive semidefinite matrix. It doesn't hold in general, not even if $A$ is Hermitian. E.g. when $$ A=\pmatrix{0&0&1\\ 0&0&0\\ 1&0&0}, $$ the three singular values of $A$ are $1,1,0$ but the two singular values of $B$ are $0,0$. In this counterexample, we also have $\sum_{i=2}^m\sigma_i(A)=1>0=\sum_{i=1}^{m-1}\sigma_i(B)$.

Another counterexample: let $$ A=\pmatrix{0&3&0\\ 2&0&-2\\ 1&0&1}. $$ The three singular values of $A$ are $3,2\sqrt{2},\sqrt{2}$ and the two singular values of $B$ are $\sqrt{5}$ and $0$. Here we have $\sigma_2(A)=2\sqrt{2}>\sqrt{5}=\sigma_1(B)$ and $\sum_{i=2}^m\sigma_i(A)=3\sqrt{2}>\sqrt{5}=\sum_{i=1}^{m-1}\sigma_i(B)$.
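Both counterexamples are easy to verify numerically. Here is a quick check with NumPy (the helper `submatrix_svals` is just for this illustration):

```python
import numpy as np

def submatrix_svals(A):
    """Singular values of A and of the submatrix B obtained by
    deleting the first row and first column of A."""
    B = A[1:, 1:]
    return np.linalg.svd(A, compute_uv=False), np.linalg.svd(B, compute_uv=False)

# First counterexample: sigma(A) = (1, 1, 0), sigma(B) = (0, 0).
A1 = np.array([[0, 0, 1],
               [0, 0, 0],
               [1, 0, 0]], dtype=float)
sA, sB = submatrix_svals(A1)
print(sA, sB)                       # [1. 1. 0.]  [0. 0.]
print(sA[1:].sum(), ">", sB.sum())  # 1.0 > 0.0

# Second counterexample: sigma(A) = (3, 2*sqrt(2), sqrt(2)), sigma(B) = (sqrt(5), 0).
A2 = np.array([[0, 3,  0],
               [2, 0, -2],
               [1, 0,  1]], dtype=float)
sA, sB = submatrix_svals(A2)
print(sA[1], ">", sB[0])            # 2.828... > 2.236...
print(sA[1:].sum(), ">", sB.sum())  # 4.242... > 2.236...
```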

It is true that $\sigma_i(A)\le\sigma_{i-2}(B)$ for $3\le i\le\min\{m,n\}$. In fact, if we delete a row (resp. a column) of $A$ to obtain a matrix $C$, we get $\sigma_j(A)\le\sigma_{j-1}(C)$ for $j\ge2$. Similarly, if we delete a column (resp. a row) of $C$ to obtain the matrix $B$, we get $\sigma_k(C)\le\sigma_{k-1}(B)$ for $k\ge2$. Combining the two inequalities, we get $\sigma_i(A)\le\sigma_{i-2}(B)$.
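As a sanity check of this two-step argument, here is a small sketch on a random rectangular matrix (the shape, seed, and tolerance are arbitrary choices for the illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 8))          # m = 5, n = 8
C = np.delete(A, 0, axis=0)              # delete the first row
B = np.delete(C, 0, axis=1)              # then delete the first column

sA = np.linalg.svd(A, compute_uv=False)
sC = np.linalg.svd(C, compute_uv=False)
sB = np.linalg.svd(B, compute_uv=False)

# sigma_j(A) <= sigma_{j-1}(C) for j >= 2  (0-based: sA[j] <= sC[j-1])
assert all(sA[j] <= sC[j - 1] + 1e-12 for j in range(1, len(sA)))
# sigma_k(C) <= sigma_{k-1}(B) for k >= 2
assert all(sC[k] <= sB[k - 1] + 1e-12 for k in range(1, len(sC)))
# Combined: sigma_i(A) <= sigma_{i-2}(B) for i >= 3
assert all(sA[i] <= sB[i - 2] + 1e-12 for i in range(2, len(sA)))
print("all interlacing bounds hold for this sample")
```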

Interestingly, the inequality $\sigma_j(A)\le\sigma_{j-1}(C)$ can be obtained from the Cauchy interlacing inequality $\lambda_1(H)\ge\lambda_1(H')\ge\cdots\ge\lambda_{m-1}(H)\ge\lambda_{m-1}(H')\ge\lambda_m(H)$ for the eigenvalues of an $m\times m$ Hermitian matrix $H$ and its $(m-1)\times(m-1)$ principal submatrix $H'$. For a proof, see Corollary 7.3.6 of Horn and Johnson's Matrix Analysis (2nd ed.).
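The Hermitian eigenvalue interlacing itself is just as easy to check numerically on a random symmetric matrix (again only an illustrative sketch, with a matrix of my choosing):

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((5, 5))
H = (M + M.T) / 2                 # random real symmetric (hence Hermitian) matrix
Hsub = H[1:, 1:]                  # principal submatrix: delete first row and column

ev  = np.sort(np.linalg.eigvalsh(H))[::-1]     # eigenvalues of H, descending
evs = np.sort(np.linalg.eigvalsh(Hsub))[::-1]  # eigenvalues of the submatrix, descending

# Cauchy interlacing: lambda_i(H) >= lambda_i(Hsub) >= lambda_{i+1}(H)
assert all(ev[i] >= evs[i] >= ev[i + 1] for i in range(len(evs)))
print("eigenvalue interlacing holds for this sample")
```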
