Rank-1 update for a low-rank matrix

eigenvalues-eigenvectors, linear-algebra

Context. Suppose that we have the following eigendecomposition:
$$ A = Q D Q^T,$$
where $D$ is diagonal with non-negative, non-decreasing entries, and the last $r-1$ out of $p$ eigenvalues are nonzero. Here $Q$ is uniquely determined in its last $r-1$ columns, while the first $p-r+1$ columns of $Q$ can be any orthonormal set spanning the orthogonal complement of the last $r-1$ columns.

Consider a rank-1 update $A \rightarrow A + w w^T = U F U^T$, where $F$ is diagonal with non-negative, non-decreasing entries, at most $r$ of which are nonzero. Again, only the last $r$ columns of $U$ matter.

Note that $A + ww^T = Q(D + vv^T)Q^T$ with $v = Q^Tw$; writing $D + vv^T = V F V^T$ gives $U F U^T = (QV) F (QV)^T$, so $U = QV$. Here the last $r$ (and the only needed) columns of $V$ are given by the Bunch–Nielsen–Sorensen (BNS) formula.
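As a sanity check, the reduction above can be verified numerically. This is a minimal sketch on random data (all names are hypothetical; `np.linalg.eigh` plays the role of the BNS step for the diagonal-plus-rank-one matrix):

```python
import numpy as np

rng = np.random.default_rng(1)
p = 5
lam = np.sort(rng.uniform(0, 2, p))               # non-negative, non-decreasing
Q, _ = np.linalg.qr(rng.standard_normal((p, p)))  # random orthogonal matrix
A = Q @ np.diag(lam) @ Q.T
w = rng.standard_normal(p)

# Reduce the update to diagonal-plus-rank-one form: A + ww^T = Q (D + vv^T) Q^T.
v = Q.T @ w
F, V = np.linalg.eigh(np.diag(lam) + np.outer(v, v))
U = Q @ V                                          # eigenvectors of A + ww^T

B = A + np.outer(w, w)
print(np.allclose(U @ np.diag(F) @ U.T, B))        # True
```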

Question. How can we compute the last $r$ columns in $U$ using only the last $r-1$ columns in $Q$ and the last $r$ columns in $V$?

Solution complexity. We assume that $r \ll p$, and we're looking for a solution costing $O(rp^2)$ at most. In particular, computing an arbitrary orthogonal complement of the last $r-1$ columns of $Q$ is unacceptable, since that costs $O(p^3)$. An extra multiplicative $\log$ factor on top of $rp^2$ would be unpleasant, but manageable.

Clarification. I'm writing $Q$ as a matrix to keep the notation simple, but perhaps it should be viewed as a collection of $r-1$ columns together with the subspace that orthogonally complements them. We don't know much about that subspace, and we don't want to compute an orthonormal basis for it, because that is $O(p^3)$-expensive.

Thank you in advance!

Best Answer

Let $q_1,\dots,q_p$ denote the columns of $Q$. Let $\lambda_1,\dots,\lambda_p$ denote the diagonal elements of $D$, in order (these are the eigenvalues of $A$).

Why is it that the first $p-r+1$ columns in $Q$ make no difference?

Note that we have $$ A = QDQ^T = \sum_{i=1}^p \lambda_i \,q_iq_i^T. $$ Note that if $\lambda_i = 0$, then $q_i$ does not affect the resulting sum.
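This can be illustrated numerically: replacing the first $p-r+1$ columns of $Q$ by any other orthonormal basis of the same subspace leaves $A$ unchanged, since those columns only multiply zero eigenvalues. A small sketch (the setup is a hypothetical example, not from the original post):

```python
import numpy as np

rng = np.random.default_rng(0)
p, r = 6, 3

# First p-r+1 eigenvalues are zero, last r-1 are positive.
lam = np.concatenate([np.zeros(p - r + 1), np.sort(rng.uniform(1, 2, r - 1))])
Q, _ = np.linalg.qr(rng.standard_normal((p, p)))   # random orthogonal matrix
A = Q @ np.diag(lam) @ Q.T

# Rotate the first p-r+1 columns within their span by an orthogonal R.
R, _ = np.linalg.qr(rng.standard_normal((p - r + 1, p - r + 1)))
Q_alt = Q.copy()
Q_alt[:, : p - r + 1] = Q[:, : p - r + 1] @ R
A_alt = Q_alt @ np.diag(lam) @ Q_alt.T

print(np.allclose(A, A_alt))  # True
```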

How can we compute the last $r$ columns in $U$ from the last $r-1$ columns in $Q$ and the last $r$ columns in $V$?

Thoughts so far:

Partition the matrix $Q$ into $Q = [Q_1\ \ Q_2]$, where $Q_2$ has $r-1$ columns. Let $D_2$ denote the bottom-right $(r-1)\times(r-1)$ submatrix of $D$. We are interested in the eigenvectors of $A + ww^T$. Equivalently, it suffices to consider the eigendecomposition of $Q^T(A + ww^T)Q$; as you note, this can be written as $D + vv^T$ where $v = Q^Tw$. However, we want to proceed while computing as little of $Q_1$ as we can.

It should be sufficient to compute its last column.

Decompose $w$ into the sum $w = w_1 + w_2$, where $w_2$ is the projection of $w$ onto the column space of $A$. That is, $$ w_2 = Q_2Q_2^Tw. $$ Denote $v_1 = Q_1^Tw_1 = Q_1^Tw$ and $v_2 = Q_2^Tw_2 = Q_2^Tw$. Using $Q_1^TAQ_1 = 0$, $Q_2^TAQ_1 = 0$, and $Q_2^TAQ_2 = D_2$, we have $$ Q^T(A + ww^T)Q = \pmatrix{Q_1 & Q_2}^T(A + ww^T)\pmatrix{Q_1 & Q_2}\\ = \pmatrix{ Q_1^T(A + ww^T)Q_1 & Q_1^T(A + ww^T)Q_2\\ Q_2^T(A + ww^T)Q_1 & Q_2^T(A + ww^T)Q_2} \\ = \pmatrix{ v_1v_1^T & v_1v_2^T\\ v_2v_1^T & D_2 + v_2v_2^T}. $$ Note that the specific value of $v_1$ depends on the choice of $Q_1$; in fact, it can be taken to be any vector of length $\|w_1\|$. A particularly nice choice, then, is to take $v_1$ to be the vector whose last entry is $\|w_1\|$ and whose other entries are zero. In order to achieve this, set the final column of $Q_1$ equal to the unit vector $\hat w_1 = w_1/\|w_1\|$. Accordingly, repartition $Q$ as $$ Q = \pmatrix{Q_1' & \hat w_1 & Q_2}. $$ The resulting matrix $Q^T(A + ww^T)Q$ will then have all of its non-zero entries in the bottom-right $r \times r$ submatrix, and this submatrix is a rank-1 update of a diagonal matrix. It should suffice to apply the BNS formula to this submatrix.
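The whole procedure can be sketched in a few lines of numpy. This is a hypothetical illustration (random data; `np.linalg.eigh` on the $r \times r$ submatrix stands in for the BNS formula); every step uses only $Q_2$, $D_2$, and $w$, at $O(pr^2)$ cost:

```python
import numpy as np

rng = np.random.default_rng(2)
p, r = 8, 4

# Setup: A = Q2 D2 Q2^T with r-1 known positive eigenpairs.
Q_full, _ = np.linalg.qr(rng.standard_normal((p, p)))
Q2 = Q_full[:, -(r - 1):]                    # last r-1 eigenvectors
d2 = np.sort(rng.uniform(1, 2, r - 1))       # their eigenvalues
A = Q2 @ np.diag(d2) @ Q2.T
w = rng.standard_normal(p)

# Split w into components inside and outside the column space of A.
w2 = Q2 @ (Q2.T @ w)
w1 = w - w2
w1_hat = w1 / np.linalg.norm(w1)

# In the basis [w1_hat, Q2], the update is diagonal plus rank one:
#   M = diag(0, d2) + t t^T  with  t = (||w1||, Q2^T w).
t = np.concatenate([[np.linalg.norm(w1)], Q2.T @ w])
M = np.diag(np.concatenate([[0.0], d2])) + np.outer(t, t)
f, S = np.linalg.eigh(M)                     # BNS formula applies here
U_last = np.column_stack([w1_hat, Q2]) @ S   # last r eigenvectors of A + ww^T

B = A + np.outer(w, w)
print(np.allclose(U_last @ np.diag(f) @ U_last.T, B))  # True
```

The reconstruction is exact because $A + ww^T$ has rank at most $r$ and its column space lies inside $\operatorname{span}(\hat w_1, Q_2)$.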
