No low-rank update that encodes the spectrum

linear-algebra, matrix-decomposition, numerical-linear-algebra, svd

An important problem is the online computation of the eigenvalues of a covariance matrix $\Sigma_T=\sum_{t=1}^T x_tx_t^\top$. After some searching, as far as I can tell, it is not possible to efficiently compute the spectrum of $\Sigma_{T+1}$ after observing a new datapoint $x_{T+1}$, given any standard decomposition of $\Sigma_T$. In other words, there is no exact "incremental SVD" or Woodbury-type identity for the SVD — most algorithms that go by the name "incremental SVD" in the literature appear (in my search) to be approximate, or to update only a thin SVD.

What's strange is that there seems to be some barrier here: for instance, there is an efficient low-rank update for the Cholesky decomposition, but the spectrum of $\Sigma_{T+1}$ is not easy to read off from its Cholesky factor. The same goes for the QR decomposition. The spectrum would be easy to read off from the Schur decomposition, but there does not appear to be an efficient low-rank update for the Schur decomposition.
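To make the first half of this asymmetry concrete, here is a minimal numpy sketch of the classic $O(n^2)$ rank-one Cholesky update (the function name `chol_update` is mine, not from any library): it cheaply maintains the factor $L$ with $\Sigma = LL^\top$, yet nothing about the eigenvalues is directly visible in $L$.

```python
import numpy as np

def chol_update(L, x):
    """Rank-one Cholesky update: given lower-triangular L with A = L L^T,
    return L' with A + x x^T = L' L'^T, in O(n^2) time (vs. O(n^3) from
    scratch). Uses the standard sequence of Givens-like eliminations."""
    L = L.copy()
    x = x.copy()
    n = len(x)
    for k in range(n):
        r = np.hypot(L[k, k], x[k])        # new diagonal entry
        c, s = r / L[k, k], x[k] / L[k, k]  # rotation that zeroes x[k]
        L[k, k] = r
        if k + 1 < n:
            L[k+1:, k] = (L[k+1:, k] + s * x[k+1:]) / c
            x[k+1:] = c * x[k+1:] - s * L[k+1:, k]
    return L

# Demo on a random SPD matrix (illustrative data, not from the question).
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
A = A @ A.T + 4 * np.eye(4)
x = rng.standard_normal(4)
L_new = chol_update(np.linalg.cholesky(A), x)
```

Note that `L_new` gives the updated factor directly, but extracting the spectrum from it still requires an eigendecomposition.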

So I have two questions: first, is my impression accurate that there is no exact "incremental SVD"? Second, is there an intuition for why this problem should be so hard?

Best Answer

If the prior real symmetric matrix $S$ has been orthogonally diagonalized, then a method for updating that factorization $S = Q\Lambda Q^T$ has been studied in the literature.

Note that $S + xx^T = Q(\Lambda + Q^Txx^T Q)Q^T$, so (up to an orthogonal similarity transformation) the rank-one symmetric update eigenvalue problem is the same as finding the spectrum of a "diagonal-plus-rank-one" (DPR1) matrix. The issue has been addressed here at Math.SE and was posted (although closed as "duplicate") at MathOverflow.
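The reduction in the identity above is easy to verify numerically; the following sketch (with arbitrary random data) checks that the spectrum of the DPR1 matrix $\Lambda + zz^T$, with $z = Q^T x$, matches the spectrum of the full update $S + xx^T$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
X = rng.standard_normal((n, n))
S = X @ X.T                        # the prior real symmetric matrix
x = rng.standard_normal(n)

lam, Q = np.linalg.eigh(S)         # S = Q diag(lam) Q^T
z = Q.T @ x                        # rotate the update into the eigenbasis
D = np.diag(lam) + np.outer(z, z)  # diagonal-plus-rank-one (DPR1) matrix

# By orthogonal similarity, these two spectra coincide.
spec_dpr1 = np.linalg.eigvalsh(D)
spec_updated = np.linalg.eigvalsh(S + np.outer(x, x))
```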

The difficulty lies in the need to compute the new eigenvalues numerically; from these, the eigenvectors and (in principle) the updated orthogonal factor can be calculated efficiently. The 1992 paper by Gu and Eisenstat linked above discusses the backward stability of the algorithm in terms of which stopping criterion should be used for the root-finding method. They adopted the "rational interpolation strategy" used earlier by Bunch, Nielsen, and Sorensen (1978) for their numerical experiments, but opine that other methods should work: "What is most important is the stopping criterion...".
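The root-finding step solves the so-called secular equation: the eigenvalues of $D + \rho zz^T$ (with $D$ diagonal) are the roots of $f(\lambda) = 1 + \rho \sum_i z_i^2/(d_i - \lambda)$, and Cauchy interlacing brackets each root between consecutive diagonal entries. Here is a hedged sketch of that idea using plain bisection-style bracketing with `scipy.optimize.brentq` rather than the rational-interpolation iterations the papers use; the function name `secular_eigvals` and the demo data are mine.

```python
import numpy as np
from scipy.optimize import brentq

def secular_eigvals(d, z, rho=1.0):
    """Eigenvalues of diag(d) + rho * z z^T, found as roots of the secular
    equation f(lam) = 1 + rho * sum_i z_i^2 / (d_i - lam) = 0.
    Assumes d strictly increasing, all z_i nonzero, and rho > 0 (the
    generic, already-deflated case treated in the literature)."""
    d, z = np.asarray(d, float), np.asarray(z, float)
    f = lambda lam: 1.0 + rho * np.sum(z**2 / (d - lam))
    roots = []
    for i in range(len(d)):
        # Interlacing: the i-th root lies in (d_i, d_{i+1}); the largest
        # lies in (d_n, d_n + rho * ||z||^2).
        lo = d[i]
        hi = d[i + 1] if i + 1 < len(d) else d[-1] + rho * np.dot(z, z)
        eps = 1e-10 * (hi - lo)  # step inside the bracket, off the poles
        roots.append(brentq(f, lo + eps, hi - eps))
    return np.array(roots)

# Demo (hypothetical data): distinct diagonal, dense update vector.
d = np.array([0.5, 1.3, 2.7, 4.1])
z = np.array([0.3, -0.8, 0.5, 1.1])
vals = secular_eigvals(d, z)
```

A production solver replaces `brentq` with the carefully safeguarded rational iterations discussed by Gu and Eisenstat, precisely because the stopping criterion near the poles governs the stability of the recovered eigenvectors.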

More recent work by Stor, Slapničar, and Barlow (2015) proposes "Forward stable eigenvalue decomposition of rank-one modifications of diagonal matrices" with a complexity of $O(n)$ per eigenvalue.