Singular Values and Matrix Rank

matrix-rank · singular-values · solution-verification

I came across this problem and I'm not sure whether the proof I came up with is correct. Any suggestions or confirmation of correctness would be appreciated.

The Problem:

Show that the number of non-zero singular values of a matrix $A$ coincides with its rank.

My Approach:

We have an arbitrary matrix $A_{m\times n}$. Let $|A|$ denote the modulus of $A$, i.e. the positive-semidefinite $n\times n$ matrix satisfying $|A|^2=A^*A$. For every eigenvector $v_i$ of $|A|$ with corresponding eigenvalue $\lambda_i$ ($i=1,2,\dots,n$), we have $(|A|-\lambda_i \mathbb{I}_n)v_i=0$. By definition, the singular values of $A$ are precisely the eigenvalues of $|A|$, so $\sigma_i=0$ if and only if $\lambda_i=0$. Therefore, from the previous expression, for every $\sigma_i=0$ we have:

$$|A|v_i=0$$

So the problem comes down to solving this homogeneous system. Because $|A|$ is self-adjoint, there exists an orthonormal basis of eigenvectors of $|A|$; in particular, the geometric multiplicity of every eigenvalue of $|A|$ equals its algebraic multiplicity. The kernel of $|A|$ is spanned by the eigenvectors with eigenvalue zero, so $\dim Ker|A| = n-r$, where $r$ is the number of non-zero eigenvalues of $|A|$ (counted with multiplicity), and therefore also the number of non-zero singular values of $A$.

Using the rank–nullity theorem, $\dim Ker|A|=n-\text{rank}|A|$, and combining it with the previous result, we have:

$$n-r=n-\text{rank}|A|\rightarrow r=\text{rank}|A|$$
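As a quick numerical sanity check of the claim $r=\text{rank}|A|$, here is a sketch with NumPy; the $3\times 2$ example matrix and the tolerance $10^{-10}$ are my own choices, not from the question:

```python
import numpy as np

# Hypothetical example: a 3x2 matrix whose second column is twice the first,
# so we expect rank 1 and exactly one nonzero eigenvalue of |A|.
A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])

# |A| is the positive-semidefinite square root of A* A. Since A* A is
# Hermitian, np.linalg.eigh returns an orthonormal eigenbasis.
w, V = np.linalg.eigh(A.conj().T @ A)
w = np.clip(w, 0.0, None)                 # guard against tiny negative round-off
mod_A = V @ np.diag(np.sqrt(w)) @ V.conj().T

r = int(np.sum(np.sqrt(w) > 1e-10))       # nonzero eigenvalues of |A|
print(r, np.linalg.matrix_rank(mod_A))    # both should equal 1
```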

From the definition of $|A|$, for any $x \in \mathbb{C}^n$:

$$\| |A|x \|^2 =(|A|x,\,|A|x)=(|A|^*|A|x,\,x)=(|A|^2x,\,x)=(A^*Ax,\,x)=(Ax,\,Ax)=\| Ax \|^2 \rightarrow$$
$$\rightarrow \| |A|x \| = \| Ax \|$$
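This norm identity holds for every matrix, so it can be checked on a random instance; a minimal NumPy sketch, where the $4\times 3$ complex matrix and the test vector are arbitrary choices of mine:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 4x3 complex matrix; the identity || |A|x || = ||Ax|| should
# hold for any A, so a random instance suffices as a sanity check.
A = rng.standard_normal((4, 3)) + 1j * rng.standard_normal((4, 3))

# Form |A| = (A* A)^{1/2} from the eigendecomposition of the Hermitian A* A.
w, V = np.linalg.eigh(A.conj().T @ A)
mod_A = V @ np.diag(np.sqrt(np.clip(w, 0.0, None))) @ V.conj().T

x = rng.standard_normal(3) + 1j * rng.standard_normal(3)
print(np.linalg.norm(mod_A @ x), np.linalg.norm(A @ x))  # agree to round-off
```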

This norm equality means $|A|x=0\Leftrightarrow Ax =0$ for all $x \in \mathbb{C}^n$, so $Ker A=Ker|A|$. Then:

$$\text{rank}A=n-\dim KerA=n-\dim Ker|A|=n-(n-\text{rank}|A|)=\text{rank}|A|$$

Therefore, as $\text{rank}|A|=\text{rank}A$, we have:

$$r=\text{rank}|A|=\text{rank}A$$

Therefore, the number of non-zero singular values of an arbitrary matrix $A_{m\times n}$ equals the rank of that matrix.
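The conclusion lends itself to a direct numerical check; a sketch with NumPy, where the rank-2 test matrix (a sum of two random rank-1 outer products) is my own construction:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical rank-2 matrix: sum of two random rank-1 outer products.
u1, u2 = rng.standard_normal(5), rng.standard_normal(5)
v1, v2 = rng.standard_normal(3), rng.standard_normal(3)
A = np.outer(u1, v1) + np.outer(u2, v2)

sigma = np.linalg.svd(A, compute_uv=False)   # singular values, descending
num_nonzero = int(np.sum(sigma > 1e-10))

print(num_nonzero, np.linalg.matrix_rank(A))  # both should equal 2
```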

Best Answer

As mentioned by @user8675309, this is another possible approach:

The matrix $A$ admits a singular value decomposition $A=W \Sigma V^*$, where $W$ is $m\times m$ unitary, $V$ is $n\times n$ unitary (so both $W$ and $V^*$ are square and invertible), and $\Sigma$ is an $m\times n$ rectangular diagonal matrix with the singular values of $A$ on its diagonal. Then, $\text{rank}(A)=\text{rank}(W \Sigma V^*)=\text{rank}( \Sigma )$.

To prove $\text{rank}(W \Sigma V^*)=\text{rank}( \Sigma )$, we consider two matrices $A$ and $B$ (generic matrices here, not the $A$ above), where $A$ is square and invertible. Since every column of $BA$ is a linear combination of the columns of $B$, we have $\text{rank}(BA)\leq \text{rank}(B)$. Also, $\text{rank}(B)=\text{rank}(BAA^{-1})\leq \text{rank}(BA)\leq \text{rank}(B)$. Therefore, we have the expression:

$$\text{rank}(B)\leq \text{rank}(BA)\leq \text{rank}(B)$$

This means $\text{rank}(B)= \text{rank}(BA)$, so right multiplication by a square invertible matrix preserves rank. For left multiplication by square invertible matrices, we can take the transpose, because column rank is equal to row rank. We have:

$$\text{rank}(AB)=\text{rank}((AB)^T)=\text{rank}(B^TA^T)\leq \text{rank}(B^T)=\text{rank}(B)$$

Also, $$\text{rank}(B)=\text{rank}(B^T)=\text{rank}(B^TA^T(A^T)^{-1})\leq \text{rank}(B^TA^T)=\text{rank}((B^TA^T)^T)=\text{rank}(AB)\leq \text{rank}(B)$$

Therefore:

$$\text{rank}(B)\leq \text{rank}(AB)\leq \text{rank}(B)$$

This means $\text{rank}(B)= \text{rank}(AB)$, so left multiplication by a square invertible matrix also preserves rank.
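Both invariance results can be illustrated numerically; a small NumPy sketch, where $B$, $M$, $N$ are hypothetical examples of mine (a random Gaussian matrix is invertible with probability 1):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical 4x3 matrix of rank 2, plus random square matrices M (4x4)
# and N (3x3), which are generically invertible.
B = (np.outer(rng.standard_normal(4), rng.standard_normal(3))
     + np.outer(rng.standard_normal(4), rng.standard_normal(3)))
M = rng.standard_normal((4, 4))
N = rng.standard_normal((3, 3))

print(np.linalg.matrix_rank(B),
      np.linalg.matrix_rank(M @ B),   # left multiplication preserves rank
      np.linalg.matrix_rank(B @ N))   # right multiplication preserves rank
```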

The two previous results combined prove $\text{rank}(A)=\text{rank}(W \Sigma V^*)=\text{rank}( \Sigma )$.

Because $\Sigma$ is rectangular diagonal, its rank is the number $r$ of its nonzero diagonal entries, and $r$ is exactly the number of nonzero singular values of $A$. Hence:

$$r=\text{rank}(A), \text{ where } r \text{ is the number of nonzero singular values of }A.$$