[Math] Low-rank factorization of SPD matrix

linear algebra, matrices, numerical linear algebra

I have a symmetric positive definite (SPD) matrix $A$ that I need to factorize as $A = SS^{T}$. However, computing a Cholesky decomposition for this purpose is prohibitively expensive. Moreover, the matrix is dense and has a slowly decaying eigenspectrum. Can anything be suggested as a replacement for Cholesky?

The factorization need not be exact. An approximation will work, since what I am really trying to preserve is the quantity $y^{T}y$, where $y = Sz$ and $z$ is a standard normal vector.
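To make the requirement precise: if $A = SS^{T}$ exactly, then
$$y^{T}y = z^{T}S^{T}Sz, \qquad \mathbb{E}\left[y^{T}y\right] = \operatorname{tr}\left(S^{T}S\right) = \operatorname{tr}\left(SS^{T}\right) = \operatorname{tr}(A),$$
so an approximate factor $\tilde{S}$ with $\tilde{S}\tilde{S}^{T} \approx A$ preserves $\mathbb{E}\left[y^{T}y\right]$ up to the error $\operatorname{tr}\left(A - \tilde{S}\tilde{S}^{T}\right)$.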

Best Answer

Sarlos, and Clarkson and Woodruff, give efficient streaming algorithms for the more general problem of computing an approximate singular value decomposition of an arbitrary matrix. Because of the sublinear-space streaming constraints, they consider "tall" matrices, with far fewer columns than rows, but the algorithm should work and give efficiency gains for square matrices as well. The idea is to project the columns onto a much smaller dimension, similarly to the proof of the Johnson-Lindenstrauss lemma, and then work with this smaller matrix.
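A minimal in-memory sketch of this random-projection idea applied to an SPD matrix (not the streaming algorithms themselves; NumPy is assumed, and the function name and parameters below are illustrative):

```python
import numpy as np

def randomized_factor(A, k, oversample=10, rng=None):
    """Approximate factorization A ~= S S^T for an SPD matrix A,
    using a Johnson-Lindenstrauss-style random projection (sketch)
    followed by an eigendecomposition of a small matrix."""
    rng = np.random.default_rng(rng)
    n = A.shape[0]
    r = k + oversample
    # Sketch: project the columns of A onto a low-dimensional subspace.
    Omega = rng.standard_normal((n, r))
    Y = A @ Omega                       # n x r sketch of the range of A
    Q, _ = np.linalg.qr(Y)              # orthonormal basis for range(Y)
    # Work with the small r x r matrix B = Q^T A Q instead of A.
    B = Q.T @ A @ Q
    B = (B + B.T) / 2                   # symmetrize against round-off
    w, V = np.linalg.eigh(B)
    w = np.clip(w, 0.0, None)           # clip tiny negative eigenvalues
    # Keep the k largest eigenpairs and form S = Q V_k diag(sqrt(w_k)).
    idx = np.argsort(w)[::-1][:k]
    S = Q @ V[:, idx] * np.sqrt(w[idx])
    return S                            # n x k factor with S S^T ~= A

# Usage: draw y = S z for standard normal z; E[y^T y] ~= trace(A).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 500
    G = rng.standard_normal((n, n))
    A = G @ G.T / n + np.eye(n)         # a dense SPD test matrix
    S = randomized_factor(A, k=50)
    z = rng.standard_normal(S.shape[1])
    y = S @ z
    print(y @ y, np.trace(A))
```

How good the approximation is depends on how quickly the eigenvalues of $A$ decay: with a slowly decaying spectrum, the rank $k$ needed to capture most of $\operatorname{tr}(A)$ may be large, though the cost per sample $y = Sz$ is still only $O(nk)$ once $S$ is built.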
