[Math] Find an ordered basis of $V$ such that $[T]_\beta$ is a diagonal matrix.

linear-algebra, proof-verification

The entire problem statement is:

Let $V$ be a finite-dimensional vector space and $T:V\to V$ be the projection on $W$ along $W'$, where $W$ and $W'$ are subspaces of $V$. Find an ordered basis $\beta$ of $V$ such that $[T]_\beta$ is a diagonal matrix.

The book I took this problem from is Linear Algebra by Friedberg, Insel, & Spence.
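
For reference, the definition I am working with (paraphrased from the book) is the following: if $V=W\oplus W'$, then the projection on $W$ along $W'$ is the map
$$
T(v)=w,\qquad\text{where } v=w+w',\ w\in W,\ w'\in W',
$$
the decomposition $v=w+w'$ being unique precisely because the sum is direct.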

My attempt at a proof is as follows:

We have that $V=W\oplus W'$, so let $\beta_1=\{v_1,\dots,v_k\}$ and $\beta_2=\{v_{k+1},\dots,v_n\}$ be bases for $W$ and $W'$, respectively. Then $\beta_1\cap\beta_2=\varnothing$ and $\beta=\beta_1\cup\beta_2$ is a basis for $V$. Now, to determine the matrix representation of $T$ with respect to the ordered basis $\beta$, consider $T(v_j)$ for $1\leq j\leq k$:
\begin{align}
T(v_j)&=\sum_{i=1}^ka_{ij}v_i\nonumber\\
&=1\cdot v_j\nonumber
\end{align}
The last equality holds because $v_j\in W$, so $T(v_j)=v_j$, and since $\beta_1$ is a basis for $W$ and hence linearly independent, the coefficients are uniquely determined: $a_{jj}=1$ and $a_{ij}=0$ for $i\neq j$. Considering $[T]_\beta$ as a block matrix, it follows that the sub-matrix consisting of the first $k$ rows and $k$ columns is the identity matrix of size $k$, that is, $I_k$. Moreover, for $k+1\leq j\leq n$ we have $v_j\in W'$, and since $T$ is the projection on $W$ along $W'$, $T(v_j)=0$. Hence all entries outside of this $k\times k$ sub-matrix are zero. Thus we have that
$$
[T]_\beta=\left(
\begin{array}{c|c}
I_k & 0 \\ \hline
0& 0\\
\end{array}\right)
$$
where the off-diagonal zero blocks have sizes $(n-k)\times k$ and $k\times(n-k)$, and the zero block on the diagonal has size $(n-k)\times(n-k)$.
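
To sanity-check this block form, here is a small concrete example (my own choice of $V$, $W$, and $W'$, not from the book): take $V=\mathbb{R}^2$, $W=\operatorname{span}\{(1,0)\}$, and $W'=\operatorname{span}\{(1,1)\}$. Writing $(x,y)=(x-y,0)+y(1,1)$ gives $T(x,y)=(x-y,0)$, and with $\beta=\{(1,0),(1,1)\}$ we get
$$
T(1,0)=(1,0),\qquad T(1,1)=(0,0),\qquad\text{so}\qquad
[T]_\beta=\begin{pmatrix}1&0\\0&0\end{pmatrix},
$$
which matches the block form above with $k=1$ and $n=2$.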

My main concern is the step where I stated that $V=W\oplus W'$, which is what allowed me to conclude that $\beta=\beta_1\cup\beta_2$ is a basis for $V$. The reason I am assuming this is that, in the definition provided for the projection $T$ on $W$ along $W'$, it is assumed that $V$ is the direct sum of $W$ and $W'$.

Thanks for any help or feedback!

Best Answer

Well, you are correct that $V=W\oplus W'$, but a much simpler approach is to use the fact that for any projection we have $T^2=T$, so that $T(T-I)=0$. In general we then have that the minimal polynomial of $T$ is $m_T(x)=x(x-1)$ (the special "trivial" cases being $T=0$, so that $W=\{0\}$, and $T=I$, so that $W'=\{0\}$). Since this polynomial is a product of distinct linear factors, $T$ is diagonalizable with eigenvalues $1$ and $0$.
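
To spell out why $T(T-I)=0$ forces diagonalizability (the standard minimal-polynomial argument, just made explicit):
$$
T(T-I)=0 \;\Longrightarrow\; m_T(x)\mid x(x-1) \;\Longrightarrow\; m_T(x)\in\{x,\;x-1,\;x(x-1)\},
$$
and each of these possibilities is a product of distinct linear factors, which is exactly the criterion for $T$ to be diagonalizable.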

Now we have $T(w)=w$ if and only if $w \in W$, so the eigenspace associated with $1$ is $W$. Similarly, we have $T(w')=0$ if and only if $w' \in W'$, so the eigenspace associated with $0$ is $W'$. Hence any basis $\beta=\beta_W \cup \beta_{W'}$, where $\beta_W$ and $\beta_{W'}$ are bases for $W$ and $W'$ respectively, is such that $[T]_\beta$ is diagonal.
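
For a quick numerical illustration of this conclusion (an example of my own, not part of the argument above): in $\mathbb{R}^3$ take $W$ to be the $xy$-plane and $W'=\operatorname{span}\{(1,1,1)\}$, so the projection on $W$ along $W'$ is $T(x,y,z)=(x-z,\,y-z,\,0)$. The sketch below checks that $T^2=T$ and that $[T]_\beta$ is diagonal for $\beta=\{(1,0,0),(1,1,0),(1,1,1)\}$.

```python
import numpy as np

# Projection on W (the xy-plane) along W' = span{(1,1,1)} in R^3,
# written in the standard basis: T(x, y, z) = (x - z, y - z, 0).
T = np.array([[1.0, 0.0, -1.0],
              [0.0, 1.0, -1.0],
              [0.0, 0.0,  0.0]])

# Columns of B are the ordered basis beta = beta_W ∪ beta_W'.
B = np.array([[1.0, 1.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])

# T is idempotent, fixes W, and kills W'.
assert np.allclose(T @ T, T)
assert np.allclose(T @ B[:, 0], B[:, 0])      # first basis vector of W
assert np.allclose(T @ B[:, 1], B[:, 1])      # second basis vector of W
assert np.allclose(T @ B[:, 2], np.zeros(3))  # basis vector of W'

# Change of basis: [T]_beta = B^{-1} T B, which comes out as diag(1, 1, 0).
T_beta = np.linalg.inv(B) @ T @ B
print(T_beta)
```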