Symmetric low rank approximate factorization of symmetric matrix

approximation, matrices, matrix-decomposition, matrix-rank, svd

I am reading a paper in which there is a covariance matrix (symmetric and positive semi-definite by construction) $V \in \mathbb{R}^{d \times d}$, and we want to write $V \approx QQ^T$ with $$Q \in \mathbb{R}^{d \times a}, \quad a \ll d.$$
It is written that we can do this by truncated SVD. I am a little unsure whether I understand this correctly; can someone please verify the steps I think are required to make the approximation?

Firstly, I know we can write $V = B \Lambda B^T$, where $B$ is an orthogonal matrix whose columns are eigenvectors of $V$ (applying Gram–Schmidt within repeated eigenspaces if necessary) and $\Lambda$ is the diagonal matrix of eigenvalues of $V$. Rewrite this as $V = CC^T$ where $C = B\sqrt\Lambda$.
Next we compute an SVD, $C = U\Sigma W^T$, and make an approximation by truncating it to the first $a$ singular values, obtaining a matrix $D \approx C$. Finally, $V \approx D D^T$.
However, I think there is one problem with this approach: the matrix $D$ obtained is $d \times d$, not $d \times a$ as required. Am I doing something wrong?
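For concreteness, the steps above can be sketched in NumPy (on a random PSD matrix, not the one from the paper); the truncated reconstruction $D$ does indeed come out $d \times d$, although it only has rank $a$:

```python
import numpy as np

# Hypothetical small example: build a random symmetric PSD matrix V
rng = np.random.default_rng(1)
d, a = 5, 2
A = rng.standard_normal((d, d))
V = A @ A.T                               # symmetric positive semi-definite

# Step 1: eigendecomposition V = B Lambda B^T, then C = B sqrt(Lambda)
lam, B = np.linalg.eigh(V)
C = B * np.sqrt(np.clip(lam, 0, None))    # scales column j of B by sqrt(lam[j])

# Step 2: SVD of C, truncated to the first a singular values
U, s, Wt = np.linalg.svd(C)
D = U[:, :a] @ np.diag(s[:a]) @ Wt[:a, :]  # truncated reconstruction

print(D.shape)                    # (5, 5): still d x d, as noted above
print(np.linalg.matrix_rank(D))   # 2: but only rank a
```

So $D$ is square but rank $a$; the $d \times a$ factor is hiding inside it, since $DD^T = U_a \Sigma_a^2 U_a^T$ depends only on the first $a$ columns of $U\Sigma$.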

Note: the exact phrase used in the paper was "symmetric low rank approximation factorization".

Best Answer

Once you have $V=B\Lambda B^T$, this is already an SVD of $V$: since $V$ is positive semi-definite, its eigenvalues are its singular values.

We assume the diagonal entries of $\Lambda$ are sorted in non-increasing order. Let $B=[B_1, B_2]$, where $B_1$ consists of the first $a$ columns, and $\Lambda = \begin{bmatrix} \Lambda_1 & 0 \\ 0 & \Lambda_2\end{bmatrix}$ with $\Lambda_1 \in \mathbb{R}^{a \times a}$. Then take

$$Q=B_1\sqrt{\Lambda_1} \in \mathbb{R}^{d \times a}$$
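A minimal NumPy sketch of this construction (again on a random PSD matrix, not the paper's $V$); by the Eckart–Young theorem, $QQ^T = B_1\Lambda_1 B_1^T$ is a best rank-$a$ approximation of $V$, and the spectral-norm error equals the $(a+1)$-th eigenvalue:

```python
import numpy as np

# Hypothetical example: a random symmetric PSD matrix V
rng = np.random.default_rng(0)
d, a = 6, 2
A = rng.standard_normal((d, d))
V = A @ A.T                                  # symmetric positive semi-definite

# Eigendecomposition V = B Lambda B^T; eigh returns ascending eigenvalues,
# so re-sort into non-increasing order
lam, B = np.linalg.eigh(V)
order = np.argsort(lam)[::-1]
lam, B = lam[order], B[:, order]

# Q = B_1 sqrt(Lambda_1): keep the top-a eigenpairs
Q = B[:, :a] * np.sqrt(np.clip(lam[:a], 0, None))

print(Q.shape)                               # (6, 2): d x a, as required
err = np.linalg.norm(V - Q @ Q.T, 2)         # spectral-norm error
print(np.isclose(err, lam[a]))               # True: equals the (a+1)-th eigenvalue
```

Note that no second SVD of $C$ is needed: truncating the eigendecomposition of $V$ directly already gives the $d \times a$ factor.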
