[Math] How to express the Frobenius norm of a Matrix as the squared norm of its singular values

linear-algebra, machine-learning, matrices, normed-spaces

Let the Frobenius norm of an $m \times n$ matrix $M$ be:

$$|| M ||_{F} = \sqrt{\sum_{i,j} M^2_{i,j}}$$

I was told it can be proved that, if $M$ is expressed as follows (which it always can be, via the SVD):

$$ M = \sum^{r}_{i=1} \sigma_i u_i v^T_i$$

then the Frobenius norm can equivalently be expressed as:

$$ || M ||_{F} = \sqrt{\sum^{r}_{i=1} \sigma_i^2} $$

I was a little stuck on how to do such a proof. This is what I had so far:

I was thinking that since $M$ is a linear combination of outer products scaled by the $\sigma_k$, one could express each entry of $M$ as follows (using $k$ for the SVD sum so it doesn't clash with the entry index $i$): $M_{i,j} = \sum^{r}_{k=1} \sigma_k (u_k v^T_k)_{i,j}$. Thus we can substitute:

$$|| M ||^2_{F} = \sum_{i,j} M^2_{i,j} = \sum^m_{i=1} \sum^n_{j=1} \Big(\sum^{r}_{k=1} \sigma_k (u_k v^T_k)_{i,j}\Big)^2 = \sum^m_{i=1} \sum^n_{j=1} \Big(\sum^{r}_{k=1} \sigma_k (u_k v^T_k)_{i,j}\Big) \Big(\sum^{r}_{l=1} \sigma_l (u_l v^T_l)_{i,j}\Big) $$

After that line I got kind of stuck. My intuition tells me that if I expand what I have, something magical is going to happen with the combination of outer products of orthonormal vectors and I'll get a bunch of zeros, probably by rearranging to form inner products that evaluate to zero (due to orthogonality). I'm just not sure how to expand that nasty little guy.
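For concreteness, writing $(u_k v^T_k)_{i,j} = u_{k,i}\, v_{k,j}$ (where $u_{k,i}$ denotes the $i$-th entry of $u_k$) and swapping the order of summation, I suspect the expansion should go something like:

$$\sum_{i,j}\Big(\sum_{k} \sigma_k u_{k,i} v_{k,j}\Big)\Big(\sum_{l} \sigma_l u_{l,i} v_{l,j}\Big) = \sum_{k,l} \sigma_k \sigma_l \Big(\sum_{i} u_{k,i} u_{l,i}\Big)\Big(\sum_{j} v_{k,j} v_{l,j}\Big) = \sum_{k,l} \sigma_k \sigma_l \, (u^T_k u_l)(v^T_k v_l),$$

and orthonormality ($u^T_k u_l = v^T_k v_l = \delta_{kl}$) would then kill every cross term, leaving $\sum_k \sigma_k^2$. But I'm not sure this is the cleanest route.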

Anyway, does anyone have a suggestion on how to move on, or is there maybe a better approach?

Best Answer

Write the SVD as $M=U\Lambda V^T$, where $\Lambda$ is the $m \times n$ diagonal matrix of singular values, so that $\sum_{i}\sigma_i^2=\operatorname{Trace}(\Lambda \Lambda^T)$. Then, using $\|M\|_F^2=\operatorname{Trace}(MM^T)$, the orthogonality relations $V^TV=I$ and $U^TU=I$, and the cyclic property of the trace, $$\|M\|_F^2=\operatorname{Trace}(MM^T)=\operatorname{Trace}(U\Lambda V^TV\Lambda^T U^T)=\operatorname{Trace}(U\Lambda \Lambda^TU^T)=\operatorname{Trace}(\Lambda\Lambda^T U^T U)=\operatorname{Trace}(\Lambda\Lambda^T)=\sum_i\sigma_i^2.$$
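A quick numerical sanity check of the identity (a sketch in NumPy; the matrix shape and seed are arbitrary choices):

```python
import numpy as np

# Random rectangular matrix; fixed seed for reproducibility.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 3))

# Frobenius norm computed entry-wise: sqrt of the sum of squared entries.
fro_entrywise = np.sqrt((M ** 2).sum())

# Frobenius norm from the singular values: sqrt of the sum of sigma_i^2.
sigma = np.linalg.svd(M, compute_uv=False)
fro_singular = np.sqrt((sigma ** 2).sum())

print(np.isclose(fro_entrywise, fro_singular))  # True
```

The two values agree up to floating-point error, which is exactly what the trace argument predicts.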