[Math] Using SVD to approximate matrix-vector multiplication

approximation · linear-algebra · matrices · matrix-decomposition · svd

Given some matrix $A$, is it possible to use the singular value decomposition to approximate $Ax$ for some vector $x$ within some error bound? According to the answer to "Efficient low rank matrix-vector multiplication", it should be possible in $O(m+n)$ time (where $A$ is $m \times n$). I'm not really sure how to get that out of the answer, though.

Best Answer

Throwing away small singular values gives a low rank approximation to $A$; call it $L$. The singular values of $L-A$ are exactly the ones that were thrown away, and since the Euclidean operator norm of a matrix is its largest singular value, $\| L-A \|_2$ is the largest singular value that was thrown away; call it $\sigma_k$. Hence


$$\frac{\| Lx-Ax \|_2}{\| x \|_2} \leq \sigma_k.$$
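
For concreteness, here is a minimal NumPy sketch that checks this bound numerically. The matrix, its dimensions, and the rank $k$ are arbitrary choices for illustration, not anything from the question.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, k = 50, 40, 10                      # keep the k largest singular values

A = rng.standard_normal((m, n))
U, s, Vt = np.linalg.svd(A, full_matrices=False)   # s is sorted descending

# Rank-k truncation L: drop every singular value past the k-th.
L = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

x = rng.standard_normal(n)
ratio = np.linalg.norm(L @ x - A @ x) / np.linalg.norm(x)

# s[k] (0-indexed) is the largest singular value that was thrown away.
print(ratio, s[k])
assert ratio <= s[k] + 1e-12
```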

If you keep $\ell$ singular values, then calculating $Lx$ with this SVD approach requires you to multiply by an $\ell \times n$ (usually dense) matrix, then an $\ell \times \ell$ diagonal matrix, then an $m \times \ell$ (usually dense) matrix. Overall this multiplication step takes essentially $\ell n + \ell + \ell m$ operations. (Of course, computing the SVD in the first place takes much longer than that.) This is $O(m+n)$ if $\ell$ is fixed, but holding $\ell$ fixed is not necessarily what you want to do; the right choice depends on the matrices $A$ that you are considering in your application.
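
Here is a sketch of that multiplication step, again in NumPy; the helper name `truncated_matvec` and the sizes below are illustrative assumptions. The point is that $Lx$ is applied factor by factor, so $L$ itself is never formed:

```python
import numpy as np

def truncated_matvec(U_l, s_l, Vt_l, x):
    """Approximate A @ x as U_l @ (s_l * (Vt_l @ x)), never forming L."""
    y = Vt_l @ x       # ell x n matrix times a vector: ~ell*n work
    y = s_l * y        # ell x ell diagonal scaling: ~ell work
    return U_l @ y     # m x ell matrix times a vector: ~m*ell work

# Example usage: compute the SVD once, then each product costs O(ell*(m+n)).
rng = np.random.default_rng(1)
m, n, ell = 1000, 800, 20
A = rng.standard_normal((m, n))
U, s, Vt = np.linalg.svd(A, full_matrices=False)

x = rng.standard_normal(n)
approx = truncated_matvec(U[:, :ell], s[:ell], Vt[:ell, :], x)
print(np.linalg.norm(approx - A @ x) / np.linalg.norm(x))  # <= s[ell]
```

Forming $L$ explicitly and computing $Lx$ directly would cost $O(mn)$ per product, so keeping the three factors separate is what makes the $O(\ell(m+n))$ cost possible.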
