Let $V$ be an $n$-dimensional real inner product space and let $a=\lbrace v_1,v_2,\dots, v_n \rbrace$ be an orthonormal basis for $V$. Let $W$ be a subspace of $V$ with orthonormal basis $B = \lbrace w_1, w_2,\dots, w_k\rbrace$. Let $A$ be the $n \times k$ matrix whose columns are $[w_1]_a, [w_2]_a,\dots, [w_k]_a$, and let $P_W$ be the orthogonal projection onto $W$.
Show that $[P_W]_{aa} = AA^t$.
- What I have is: for $x \in V$, $P_W(x) = w$ for some $w \in W$
- $[P_W(x)]_a = [P_W]_{aa}\,[x]_a$
- And since the columns of $A$ are orthonormal, $A^tA = I_k$ (but $A$ is $n \times k$, so $AA^t$ need not be $I_n$)
I'm stuck on this step
Please help me out. Thanks
Best Answer
The matrix $A$ takes a vector in the subspace $W$, considered as a copy of $\mathbb R^k$, expressed in the basis $B$, and returns the corresponding vector expressed in the basis $a$, as a member of $\mathbb R^n$.
The transpose of $A$ does the opposite: it takes a vector in $\mathbb R^n$, expressed in the basis $a$, and returns a vector intrinsic to the subspace $W$, expressed in the basis $B$. This new vector has necessarily lost all information about components outside of $W$.
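To make this concrete (filling in the computation behind the claim): since $a$ is orthonormal, the $i$-th entry of $A^t[x]_a$ is
$$\left(A^t [x]_a\right)_i = [w_i]_a^{\,t}\,[x]_a = \langle w_i, x\rangle,$$
which is exactly the $i$-th $B$-coordinate of the orthogonal projection $P_W(x) = \sum_i \langle x, w_i\rangle\, w_i$.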
So the transpose takes a general vector and projects it onto $W$, but the result is expressed in terms of the basis $B$. The original matrix $A$ then converts any such vector back to the basis $a$.
What you should show in order to prove this result is that the out-of-subspace component of an arbitrary input vector is reduced to zero by this composition, while the component inside $W$ is left fixed. You will need the images of the coordinate vectors $[w_1]_a, [w_2]_a, \ldots$ under $A^t$ to be able to decompose an arbitrary vector of $\mathbb R^n$ this way.
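As a quick numerical sanity check (a NumPy sketch, not a substitute for the proof), one can take the ambient basis $a$ to be the standard basis of $\mathbb R^n$, build an orthonormal basis for a random $k$-dimensional subspace $W$, form $A$, and verify that $AA^t$ behaves as the orthogonal projection onto $W$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 5, 2

# Orthonormal basis for a random k-dimensional subspace W of R^n,
# via a reduced QR factorization. With the standard basis playing
# the role of `a`, the columns of A are just w_1, ..., w_k.
A = np.linalg.qr(rng.standard_normal((n, k)))[0]  # shape (n, k)

P = A @ A.T  # candidate matrix of the orthogonal projection onto W

# The columns of A are orthonormal: A^t A = I_k ...
assert np.allclose(A.T @ A, np.eye(k))
# ... but A A^t is not I_n (it has rank k < n).
assert not np.allclose(P, np.eye(n))

# P is symmetric and idempotent, as an orthogonal projection must be.
assert np.allclose(P, P.T)
assert np.allclose(P @ P, P)

# P fixes vectors lying in W ...
w = A @ rng.standard_normal(k)  # an arbitrary element of W
assert np.allclose(P @ w, w)

# ... and kills the component of any vector orthogonal to W.
x = rng.standard_normal(n)
x_perp = x - P @ x  # out-of-subspace component of x
assert np.allclose(P @ x_perp, 0)

print("AA^t passes all orthogonal-projection checks")
```

The asserts mirror exactly the strategy in the answer: $AA^t$ fixes the in-subspace component and annihilates the out-of-subspace component, which together characterize $P_W$.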