Do orthogonal projections play a role in diagonalizability?

linear-algebra, matrices, projection, projection-matrices

I'm studying Linear Algebra by myself, and the textbook I use is the fourth edition written by Friedberg, Insel, and Spence. For now, I'm trying to get through Section 6.6, which concerns orthogonal projections and the spectral theorem. The following claim from this section really confuses me, and I'm not sure what theorem the authors apply to guarantee the diagonalizability of $T$.

Let $V$ be a finite-dimensional inner product space, $W$ be a subspace of $V$, and $T$ be the orthogonal projection of $V$ on $W$. We may choose an orthonormal basis $\beta=\{v_1,\ldots,v_n\}$ for $V$ so that $\{v_1,\ldots,v_k\}$ is a basis for $W$. Then $[T]_\beta$ is a diagonal matrix with $1$'s as the first $k$ diagonal entries and $0$'s elsewhere.

I have no doubt about the existence of $\beta$; in fact, it is guaranteed by Theorem 6.7. However, I don't see why the authors can confidently say that $[T]_\beta$ is diagonal; they even give the explicit form of this matrix representation. Does anyone have an idea? Thank you so much.

Best Answer

$\{v_1,\ldots,v_k\}$ is an orthonormal basis of $W$, and $\{v_{k+1},\ldots,v_n\}$ is an orthonormal basis of $W^{\perp}$.

Furthermore, we have $Tv=v$ for all $v \in W$ and $Tv=0$ for all $v \in W^{\perp}$.

Hence

$Tv_j=v_j$ for $j=1,\ldots,k$ and $Tv_j=0$ for $j=k+1,\ldots,n$.

Since the $j$-th column of $[T]_\beta$ is the coordinate vector of $Tv_j$ relative to $\beta$, the first $k$ columns are the standard vectors $e_1,\ldots,e_k$ and the remaining columns are zero. This is exactly the diagonal matrix with $1$'s as the first $k$ diagonal entries and $0$'s elsewhere.
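If a concrete sanity check helps, here is a small NumPy sketch (not part of the book's argument) that builds an orthonormal basis of $\mathbb{R}^3$, forms the orthogonal projection onto the span of the first $k=2$ basis vectors, and confirms that its matrix with respect to that basis is $\operatorname{diag}(1,1,0)$. The particular setup (a random basis obtained via QR, $k=2$) is just an illustrative choice.

```python
import numpy as np

# Build an orthonormal basis beta = {v_1, v_2, v_3} of R^3: the columns of Q
# obtained from the QR decomposition of a random matrix are orthonormal.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
v = [Q[:, j] for j in range(3)]

# Orthogonal projection onto W = span{v_1, ..., v_k}, expressed in the
# standard basis: P = sum_{j=1}^{k} v_j v_j^T.
k = 2
P = sum(np.outer(v[j], v[j]) for j in range(k))

# Matrix of T with respect to beta: [T]_beta = Q^{-1} P Q, and Q^{-1} = Q^T
# because Q is orthogonal.
T_beta = Q.T @ P @ Q

print(np.round(T_beta, 10))
# Expected: ones in the first k diagonal entries, zeros elsewhere.
assert np.allclose(T_beta, np.diag([1, 1, 0]))
```

The change of basis $Q^T P Q$ is exactly the passage from the standard-basis matrix of $T$ to $[T]_\beta$, so the printed matrix realizes the claim in the question numerically.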