Decompose a 2 x 2 matrix into projection matrices from its eigenvalues, eigenvectors

linear-algebra, matrices, matrix-decomposition

Consider the matrix $B = \begin{bmatrix} 2 & 2 \\ 1 & 3 \end{bmatrix}$. Find projection matrices $P_1, P_2$ such that (1) $B = \lambda_1 P_1 + \lambda_2 P_2$, where $\lambda_1, \lambda_2$ are the eigenvalues of $B$, (2) $P_1 P_2 = 0$, and (3) $P_1 + P_2 = I_2$, the $2 \times 2$ identity. (Note: a projection matrix $P$ satisfies $P^2 = P$.)

The eigenvalues are $\lambda_1 = 1, \lambda_2 = 4$, with corresponding eigenvectors $\begin{bmatrix} 2 \\ -1 \end{bmatrix}$ and $\begin{bmatrix} 1 \\ 1 \end{bmatrix}$. The problem comes from this past QR exam – https://lsa.umich.edu/content/dam/math-assets/math-document/AIM/DELA/DELA_Sep18%20-%20Differential%20Eqns%20%26%20Linear%20Algebra%20Fall%202018.pdf – and I thought I could figure it out for practice, but I haven't been able to solve it. In particular, I'm not familiar with how to decompose a matrix into projection matrices using its eigenvalues. Any help or hints?
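For reference, these come from the characteristic polynomial of $B$: $$\det(B - \lambda I) = (2-\lambda)(3-\lambda) - 2 = \lambda^2 - 5\lambda + 4 = (\lambda - 1)(\lambda - 4).$$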

Best Answer

As I explain in the first case of this answer, $$P_1={A-\lambda_2 I\over\lambda_1-\lambda_2} \\ P_2 = {A-\lambda_1 I\over\lambda_2-\lambda_1},$$ where $A$ is your matrix $B$. One way to obtain this is to note that when expressed relative to the eigenbasis, the two projectors are simply $\operatorname{diag}(1,0)$ and $\operatorname{diag}(0,1)$; then perform a change of basis back to the standard basis. Another way to derive these is to note, for instance, that if $\mathbf v_1$ and $\mathbf v_4$ are eigenvectors with eigenvalues $1$ and $4$, then $(A-4I)\mathbf v_1 = (1-4)\mathbf v_1$ and by definition $(A-4I)\mathbf v_4=0$. We want $P_1\mathbf v_1=\mathbf v_1$ and $P_1\mathbf v_4=0$, which we almost have with $A-4I$: we just have to divide by $1-4=-3$ to make this the identity map on the span of $\mathbf v_1$.
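If it helps, here is a minimal numerical sketch (my own addition, using NumPy; the variable names are not from the exam or the linked answer) that builds $P_1$ and $P_2$ from this formula for the matrix in your question and checks the three required properties:

```python
import numpy as np

B = np.array([[2.0, 2.0],
              [1.0, 3.0]])
lam1, lam2 = 1.0, 4.0          # eigenvalues of B
I = np.eye(2)

# P_i = (B - lam_j * I) / (lam_i - lam_j), as in the formula above
P1 = (B - lam2 * I) / (lam1 - lam2)
P2 = (B - lam1 * I) / (lam2 - lam1)

# (1) B = lam1*P1 + lam2*P2, (2) P1 P2 = 0, (3) P1 + P2 = I, and each P_i is a projection
assert np.allclose(lam1 * P1 + lam2 * P2, B)
assert np.allclose(P1 @ P2, np.zeros((2, 2)))
assert np.allclose(P1 + P2, I)
assert np.allclose(P1 @ P1, P1) and np.allclose(P2 @ P2, P2)

print(P1)   # [[ 2/3, -2/3], [-1/3,  1/3]]
print(P2)   # [[ 1/3,  2/3], [ 1/3,  2/3]]
```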

Another approach that one doesn’t see as often comes up in this question: if $x$ and $y^*$ are right and left eigenvectors, respectively, of $A$ corresponding to the same simple eigenvalue, then the projector onto the right eigenspace (the span of $x$) is ${xy^*\over y^*x}$. (This looks a lot like the formula for orthogonal projection onto a vector.) You can find a derivation of this in the answer to that question. For example, a left eigenvector of $1$ for your matrix is $(-1,1)$ and a right eigenvector is $(-2,1)^T$, yielding $$P_1 = \frac13\begin{bmatrix}-2\\1\end{bmatrix}\begin{bmatrix}-1&1\end{bmatrix} = \frac13\begin{bmatrix}2&-2\\-1&1\end{bmatrix},$$ which matches the result of applying the formula at the top of this answer.
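As a quick check of that outer-product formula (again my own sketch, not part of the linked derivation), the same $P_1$ falls out of $xy^*/(y^*x)$ with the left and right eigenvectors given above:

```python
import numpy as np

x = np.array([[-2.0], [1.0]])   # right eigenvector of B for eigenvalue 1 (column)
y = np.array([[-1.0, 1.0]])     # left eigenvector of B for eigenvalue 1 (row)

P1 = (x @ y) / (y @ x)          # projector onto span(x): x y* / (y* x)
print(P1)                       # [[ 2/3, -2/3], [-1/3, 1/3]], matching the formula at the top
```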