Low-rank approximation of Eigenvalue decomposition

Tags: eigenvalues-eigenvectors, linear algebra, svd

Suppose we have a diagonalizable (possibly positive semidefinite if that helps, but not symmetric) square matrix $A \in \mathbb{R}^{M\times M}$. We can perform an Eigenvalue decomposition $A = V \Lambda V^{-1}$, with the diagonal matrix of Eigenvalues $\Lambda = \text{diag}(\lambda_i)$ and the matrix of Eigenvectors $V$.

Let's say some Eigenvalues are zero, so that $\text{rank}(A)=N<M$. In a similar fashion to the Singular Value decomposition, we can order the Eigenvalues by $|\lambda_i|$ and keep only the nonzero ones, so that we obtain $\tilde \Lambda \in \mathbb{R}^{N\times N}$. We remove the corresponding columns of $V$ so that we end up with $\tilde V \in \mathbb{R}^{M\times N}$. Similarly, the corresponding rows of $V^{-1}$ are removed (after the inversion) to obtain $\tilde V^{-1} \in \mathbb{R}^{N\times M}$. Through this procedure, we get the exact result $A = \tilde V \tilde \Lambda \tilde V^{-1}$.
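The truncation procedure described above can be sketched in NumPy as follows (a minimal sketch; the function name `truncated_eig` and the test matrix are my own, not from the question):

```python
import numpy as np

def truncated_eig(A, N):
    """Keep the N eigenvalues of largest magnitude, the matching
    columns of V, and the matching rows of V^{-1} (computed first)."""
    lam, V = np.linalg.eig(A)
    W = np.linalg.inv(V)                 # invert the full V, then truncate rows
    idx = np.argsort(-np.abs(lam))[:N]   # indices of the N largest |lambda_i|
    return V[:, idx], np.diag(lam[idx]), W[idx, :]

# Rank-2 example: a 4x4 matrix built from a random (non-orthogonal) similarity
rng = np.random.default_rng(0)
V = rng.standard_normal((4, 4))
A = V @ np.diag([3.0, -2.0, 0.0, 0.0]) @ np.linalg.inv(V)

Vt, Lt, Wt = truncated_eig(A, 2)
# Reconstruction is exact (up to rounding) because the dropped eigenvalues are 0
assert np.allclose(Vt @ Lt @ Wt, A)
```

When the dropped Eigenvalues are exactly zero this reproduces $A$, matching the exact-rank claim in the question; the interesting case is what happens when they are merely small.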

Now my question is: Can this procedure also be used as a low-rank approximation method if $|\lambda_i|\approx 0, i>N$, similar to the SVD? I haven't found this in the literature, so are there any caveats which make this method unreliable?

Best Answer

The key issue is that the eigenvectors are not necessarily orthogonal, so $V$ can be arbitrarily ill-conditioned. Writing $w_i^T$ for the $i$-th row of $V^{-1}$, the error from dropping the $i$-th eigenpair is $\lambda_i\, v_i w_i^T$, and $\|v_i w_i^T\|$ can be enormous even when $|\lambda_i|$ is small. Consider $$ \begin{bmatrix} 1 & 0 \\ 0 & 1\\ \end{bmatrix} = \begin{bmatrix} 100 & 1 \\ 1 & 0\\ \end{bmatrix} \begin{bmatrix} 1 & 0 \\ 0 & 1\\ \end{bmatrix} ( \begin{bmatrix} 100 & 1 \\ 1 & 0\\ \end{bmatrix} )^{-1} $$ If you try your method and keep only one eigenpair, you will see entries on the order of 100, while the entries of the original matrix are all single digits.
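The blow-up in this example can be checked numerically (a short sketch; the variable names are mine):

```python
import numpy as np

# The answer's example: the identity with a deliberately ill-conditioned
# eigenvector matrix V.
V = np.array([[100.0, 1.0],
              [1.0,   0.0]])
W = np.linalg.inv(V)
A = V @ np.eye(2) @ W        # exactly the 2x2 identity

# "Truncate" by keeping only the first eigenpair: the rank-one term
# lambda_1 * v_1 w_1^T, which is what the low-rank method would return.
A1 = V[:, :1] @ W[:1, :]
print(A1)                    # [[  0. 100.]
                             #  [  0.   1.]]
```

The rank-one term has an entry of $100$ even though every entry of $A$ is $0$ or $1$: because $v_1$ and $v_2$ are nearly parallel, the individual terms $\lambda_i v_i w_i^T$ are huge and only cancel when summed, so dropping one of them, even scaled by a small $\lambda_i$, can change the matrix drastically.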