[Math] Eigenvalue of the sum of a symmetric matrix and the outer product of its eigenvector

determinant, eigenvalues-eigenvectors, linear algebra, numerical methods, spectral-theory

I have a symmetric matrix $A$ with eigenpairs $(\lambda_k, v_k)$ for $k \in \{1,\dots,n\}$. A new matrix $B$ is made from one eigenpair $(\lambda_i, v_i)$ like this:
$$B = A - \lambda_i v_i v_i^T$$
where $\lambda_i$ is the eigenvalue with the largest magnitude (from power method, but not significant for this question). I need to prove that $\lambda_k$ for $k \neq i$ are eigenvalues of $B$. Anyone know how to do that?

I figured I'd use a decomposition in the orthogonal eigenvectors of $A$ to find eigenvectors of $B$, but $v_i$ is the only eigenvector that obviously carries over (with eigenvalue $0$); the remaining eigenvectors of $B$ may differ from the $v_j$ (all I get is $B v_j = Av_j - \lambda_i v_i v_i^T v_j = Av_j$). For an eigenpair $(\mu, w)$ of $B$ decomposed as $w = \sum b_k v_k$ with $b_i = 0$, we would then have $B w = A\sum b_k v_k - \lambda_i b_i v_i = \sum \lambda_k b_k v_k = \mu \sum b_k v_k$. I'm not sure how to use this to prove that $\lambda_k$ for $k \neq i$ are eigenvalues of $B$, so this may be the wrong approach.

I also thought about proving $\det(B - \lambda_k I) = 0$ directly, but I'm not sure how I would do that.

For context, the point of all this is to then use the power method on $B$, so that I can find the $m$ largest eigenvalues of $A$ by constructing new matrices $B$ in which the dominant eigenvalue is replaced by $0$.
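For what it's worth, here is a minimal NumPy sketch of that deflation scheme, assuming a symmetric $A$ with a strictly dominant eigenvalue at each stage; the function names (`power_method`, `top_m_eigenpairs`), the fixed iteration count, and the random start vector are my own choices, not anything from the question:

```python
import numpy as np

def power_method(A, n_iter=1000, seed=0):
    # Plain power iteration: converges to the eigenvector of the
    # largest-magnitude eigenvalue, assuming it is strictly dominant.
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(A.shape[0])
    for _ in range(n_iter):
        x = A @ x
        x /= np.linalg.norm(x)
    lam = x @ A @ x  # Rayleigh quotient (x has unit norm)
    return lam, x

def top_m_eigenpairs(A, m):
    # Repeatedly deflate: B = A - lambda_i v_i v_i^T replaces the dominant
    # eigenvalue by 0, so the next power iteration picks up the
    # next-largest-magnitude eigenvalue of the original A.
    B = A.copy()
    pairs = []
    for _ in range(m):
        lam, v = power_method(B)
        pairs.append((lam, v))
        B = B - lam * np.outer(v, v)
    return pairs
```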

Best Answer

The action of $B$ on $v_j$ is $$ Bv_j = (A - \lambda_i v_iv_i^T)v_j = Av_j - \lambda_i v_i v_i^T v_j. $$ Since the eigenvectors of a symmetric matrix are orthogonal, and assuming they are suitably normalized, $v_i^Tv_j = \delta_{ij}$. For $j\ne i$, this gives $Bv_j = A v_j = \lambda_j v_j$. Therefore $\lambda_j$ is an eigenvalue of $B$ for every $j\ne i$, and the corresponding eigenvector is the same $v_j$.

When $j=i$, $$ Bv_i = Av_i - \lambda_i v_i = \lambda_i v_i - \lambda_i v_i = 0, $$ so $v_i$ is still an eigenvector of $B$, but its eigenvalue is now $0$.
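Both computations are easy to check numerically. A quick sketch, using a random symmetric matrix and the orthonormal eigenvectors returned by `np.linalg.eigh` (the matrix size and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
A = (A + A.T) / 2                      # make A symmetric
eigvals, V = np.linalg.eigh(A)         # columns of V are orthonormal eigenvectors

i = int(np.argmax(np.abs(eigvals)))    # deflate the largest-magnitude eigenpair
B = A - eigvals[i] * np.outer(V[:, i], V[:, i])

for j in range(A.shape[0]):
    if j == i:
        print(np.allclose(B @ V[:, j], 0))                       # B v_i = 0
    else:
        print(np.allclose(B @ V[:, j], eigvals[j] * V[:, j]))    # B v_j = lambda_j v_j
```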