Linear Algebra – Projection Matrix of Eigenvector Space

linear-algebra, matrices, projection-matrices

I need to solve a problem about projection matrices and eigenspaces.
Here is the problem:

We have matrix

$A=\begin{pmatrix}2&0&0\\0&0&2\\0&2&0\end{pmatrix}$


Find the projection matrix onto the eigenvector space of $A$ (the vector space spanned by a basis of eigenvectors).

I tried to figure out what the projection matrix is.
First I found the eigenvalues $-2$ and $2$.

Should I calculate
$$P = A(A^\top A)^{-1}A^\top$$
twice, once to get the projection matrix of the eigenspace for eigenvalue $-2$ and once for eigenvalue $2$?

I would be grateful if someone could help me find the projection matrix onto the eigenvector space of $A$.

Thank you!

Best Answer

The eigenspace associated with $-2$ is $1$-dimensional, and the eigenspace associated with $2$ is $2$-dimensional. The first of these projectors is easy to calculate; all we need is a single eigenvector. In this case, you should find that an eigenvector for $-2$ is given by $$ x = (0,1,-1)^\top. $$ Now, we just need the associated projection matrix, $\frac{xx^\top}{x^\top x}$.
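As a quick numerical sanity check (not part of the original answer), here is a minimal numpy sketch of that rank-1 projector, using the eigenvector $x = (0,1,-1)^\top$ from above:

```python
import numpy as np

# Eigenvector of A for eigenvalue -2 (taken from the answer above)
x = np.array([0.0, 1.0, -1.0])

# Rank-1 projector onto span{x}:  P = x x^T / (x^T x)
P_minus = np.outer(x, x) / np.dot(x, x)
print(P_minus)
# [[ 0.   0.   0. ]
#  [ 0.   0.5 -0.5]
#  [ 0.  -0.5  0.5]]
```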

For the second of these, we want a basis for the kernel of $$ A - 2I = \pmatrix{0&0&0\\0&-2&2\\0&2&-2}. $$ If you solve this via row reduction (or just "by inspection"), you end up with the eigenvectors $$ x_1 = (1,0,0)^\top, \quad x_2 = (0,1,1)^\top. $$ You could now take these to be the columns of a matrix $M$ and compute the projection $P = M(M^\top M)^{-1}M^\top$, but because this is already an orthogonal basis we can deduce that the projection matrix is simply the sum of the individual associated projection matrices, namely $$ P = \frac{x_1x_1^\top}{x_1^\top x_1} + \frac{x_2x_2^\top}{x_2^\top x_2} = \pmatrix{1&0&0\\0&0&0\\0&0&0} + \frac 12\pmatrix{0&0&0\\0&1&1\\0&1&1}. $$
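Again as a sketch rather than part of the answer itself, the following numpy snippet (using the eigenvectors $x_1, x_2$ above) confirms that the sum of rank-1 projectors agrees with the general formula $P = M(M^\top M)^{-1}M^\top$:

```python
import numpy as np

x1 = np.array([1.0, 0.0, 0.0])
x2 = np.array([0.0, 1.0, 1.0])

# Sum of rank-1 projectors (valid here because x1 and x2 are orthogonal)
P_plus = np.outer(x1, x1) / np.dot(x1, x1) + np.outer(x2, x2) / np.dot(x2, x2)

# Same projector via the general formula P = M (M^T M)^{-1} M^T
M = np.column_stack([x1, x2])
P_general = M @ np.linalg.inv(M.T @ M) @ M.T

print(np.allclose(P_plus, P_general))  # True
print(P_plus)
# [[1.  0.  0. ]
#  [0.  0.5 0.5]
#  [0.  0.5 0.5]]
```

Since $A$ is symmetric, the two eigenspaces are orthogonal, so this projector and the one for eigenvalue $-2$ sum to the identity.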
