[Math] Eigenvector decomposition with inverse matrix

eigenvalues-eigenvectors, inverse, linear algebra

Help! It's been almost 20 years since I did much linear algebra and I'm struggling to follow a derivation. In the following expression
$$\mathbf{y}' = A(A + \beta I)^{-1}\mathbf{y}$$
the real symmetric matrix $A$ has the eigenvector decomposition
$$A = \sum^n_{i=1} \lambda_i\mathbf{u}_i\mathbf{u}_i^T$$
Vector $\mathbf{y}$ can be represented as a linear combination of $A$'s eigenvectors:
$$\mathbf{y} = \sum^n_{i=1}\alpha_i \mathbf{u}_i$$
where $\alpha_i = \mathbf{u}_i^T \mathbf{y}$. This much I'm happy with, but what I don't understand is how one then gets to
$$\mathbf{y}' = \sum^n_{i=1} \frac{\lambda_i\alpha_i}{\lambda_i + \beta}\mathbf{u}_i$$
Can somebody offer a clear explanation for this step? Is there a way to show that
$$(A + \beta I)^{-1} = \sum^n_{i=1}\frac{1}{\lambda_i + \beta}\mathbf{u}_i\mathbf{u}_i^T$$
or is there something else going on?

Best Answer

I ended up finding out the answers for myself, so recording here for posterity.

Firstly, recall that one definition of an eigenvector $\mathbf{v}$ with eigenvalue $\lambda$ of a matrix $A$ is that $$(A-\lambda I)\mathbf{v} = 0$$ Because $A$ is real and symmetric, its eigenvectors form an orthonormal set, so starting from $$\mathbf{y} = \sum^n_{i=1}\alpha_i \mathbf{u}_i$$ we have

$$(A+\beta I)\mathbf{y} = A\mathbf{y}+\beta I\mathbf{y} = \sum_{i=1}^n\sum_{j=1}^n \alpha_j \lambda_i \mathbf{u}_i \mathbf{u}_i^T \mathbf{u}_j + \beta\sum_{i=1}^n\alpha_i \mathbf{u}_i$$

and since, by orthonormality, $\mathbf{u}_i^T \mathbf{u}_j = 1$ for $i = j$ and $0$ otherwise, this reduces to

$$(A+\beta I)\mathbf{y} = \sum_{i=1}^n \alpha_i(\lambda_i + \beta)\mathbf{u}_i$$

so $(A+\beta I)$ acts like a matrix $A'$ with the same eigenvectors as $A$, but with eigenvalues $\lambda_i' = \lambda_i + \beta$.
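This eigenvalue-shift step is easy to sanity-check numerically. Below is a small NumPy sketch (the matrix size, seed, and $\beta$ are illustrative choices, not from the original) comparing $(A+\beta I)\mathbf{y}$ with the spectral expression $\sum_i \alpha_i(\lambda_i+\beta)\mathbf{u}_i$:

```python
import numpy as np

rng = np.random.default_rng(0)
# Random real symmetric matrix A and an illustrative shift beta.
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2
beta = 0.5

lam, U = np.linalg.eigh(A)   # columns of U are orthonormal eigenvectors u_i
y = rng.standard_normal(4)
alpha = U.T @ y              # alpha_i = u_i^T y

# Direct matrix-vector product vs. sum_i alpha_i * (lambda_i + beta) * u_i
lhs = (A + beta * np.eye(4)) @ y
rhs = U @ (alpha * (lam + beta))
assert np.allclose(lhs, rhs)
```

Here `np.linalg.eigh` is used (rather than `eig`) because $A$ is symmetric, so it returns real eigenvalues and an orthonormal eigenvector basis.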

Secondly, as shown in this MSE question, the eigenvectors of a matrix $M$ are the same as those of its inverse $M^{-1}$, with reciprocal corresponding eigenvalues. So $$M^{-1} = \sum_{i=1}^n \frac{1}{\lambda_i} \mathbf{u}_i \mathbf{u}_i^T$$ where $\mathbf{u}_i$ and $\lambda_i$ are the $i$th eigenvector and eigenvalue, respectively, of $M$. Taking $M = A' = A+\beta I$ from above then gives $$(A+\beta I)^{-1} = \sum_{i=1}^n \frac{1}{\lambda_i + \beta} \mathbf{u}_i \mathbf{u}_i^T$$ Finally, applying $A$ to $(A+\beta I)^{-1}\mathbf{y}$ and using orthonormality once more yields $$\mathbf{y}' = A(A+\beta I)^{-1}\mathbf{y} = \sum_{i=1}^n \frac{\lambda_i\alpha_i}{\lambda_i + \beta}\mathbf{u}_i$$ as required.
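Both the spectral form of the inverse and the final expression for $\mathbf{y}'$ can also be checked numerically. A minimal NumPy sketch (again with illustrative dimensions, seed, and $\beta$; $\beta$ must be chosen so that no $\lambda_i + \beta$ is zero):

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2            # random real symmetric matrix
beta = 0.5                   # shift; assumes no lambda_i + beta == 0

lam, U = np.linalg.eigh(A)
y = rng.standard_normal(4)
alpha = U.T @ y              # alpha_i = u_i^T y

# Spectral form of the inverse: sum_i u_i u_i^T / (lambda_i + beta).
# Dividing U's columns by (lam + beta) before U.T implements the sum.
inv_spectral = (U / (lam + beta)) @ U.T
assert np.allclose(inv_spectral, np.linalg.inv(A + beta * np.eye(4)))

# Final result: y' = A (A + beta I)^{-1} y
#             = sum_i lambda_i * alpha_i / (lambda_i + beta) * u_i
y_prime = A @ np.linalg.inv(A + beta * np.eye(4)) @ y
assert np.allclose(y_prime, U @ (lam * alpha / (lam + beta)))
```

The column-wise division `U / (lam + beta)` relies on NumPy broadcasting, which scales each eigenvector $\mathbf{u}_i$ by $1/(\lambda_i+\beta)$ before the outer-product sum is assembled by the matrix product with `U.T`.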
