The theorem you have quoted is true but only tells part of the story. An improved version is as follows.
Let $U$ be a real $m\times n$ matrix with orthonormal columns, that is, its columns form an orthonormal basis of some subspace $W$ of ${\Bbb R}^m$. Then $UU^T$ is the matrix of the projection of ${\Bbb R}^m$ onto $W$.
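As a quick numerical check of the improved statement (a sketch using NumPy; the particular $U$ and $v$ below are made up for illustration), take a $3\times 2$ matrix whose orthonormal columns span the $xy$-plane and verify that $UU^T$ projects onto that plane:

```python
import numpy as np

# Hypothetical example: U has orthonormal columns spanning a 2-D subspace W of R^3.
U = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])  # columns are e1 and e2, so W is the xy-plane

P = U @ U.T  # candidate matrix of the projection onto W

v = np.array([3.0, -2.0, 5.0])
print(P @ v)        # the z-component is zeroed out, as projection onto W demands
print(P @ (P @ v))  # idempotence: projecting twice is the same as projecting once
```

Note that $P$ is symmetric and idempotent ($P^2 = P$), the two algebraic hallmarks of an orthogonal projection.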
Comments
- The restriction to real matrices is not actually necessary: any scalar field will do, and any vector space, as long as you know what "orthonormal" means in that vector space.
- A matrix with orthonormal columns is an orthogonal matrix if it is square. I think this is the situation you are envisaging in your question, but in that case the result is trivial: $W$ is all of ${\Bbb R}^m$, $UU^T=I$, and the projection transformation is simply $P({\bf x})={\bf x}$.
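The square case in the second comment is easy to see numerically (a minimal sketch; the rotation matrix below is just one convenient example of a square matrix with orthonormal columns):

```python
import numpy as np

# A square matrix with orthonormal columns is orthogonal; here, a 2-D rotation.
theta = 0.3
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# For a square orthogonal U, the "projection" UU^T is the identity map.
print(np.allclose(U @ U.T, np.eye(2)))
```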
You are right: $(1,1,1)$ is orthogonal to $V$. Therefore, $A(1,1,1)=(0,0,0)$. Now, consider the vectors $(1,-1,0)$ and $(1,0,-1)$. Since they both belong to $V$, you must have $A(1,-1,0)=(1,-1,0)$ and $A(1,0,-1)=(1,0,-1)$.
Now, since$$(1,0,0)=\frac13(1,1,1)+\frac13(1,-1,0)+\frac13(1,0,-1),$$you must have$$A(1,0,0)=\frac13(1,-1,0)+\frac13(1,0,-1)=\left(\frac23,-\frac13,-\frac13\right).$$So, the entries of the first column of the matrix of $\operatorname{proj}_V$ with respect to the standard basis will be $\frac23$, $-\frac13$ and $-\frac13$. Can you take it from here?
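The same answer can be checked numerically (a sketch, assuming $V$ is the plane through the origin with normal $(1,1,1)$, so that $A = I - \frac{nn^T}{n^Tn}$):

```python
import numpy as np

# V is the plane through the origin orthogonal to n = (1, 1, 1).
n = np.array([1.0, 1.0, 1.0])
A = np.eye(3) - np.outer(n, n) / (n @ n)  # projection matrix onto V

print(A @ n)                          # ~ (0, 0, 0): the normal is sent to zero
print(A @ np.array([1.0, -1.0, 0.0])) # a vector in V is left fixed
print(A[:, 0])                        # first column: (2/3, -1/3, -1/3)
```

The first column of $A$ matches the hand computation above, and the remaining columns follow by the same recipe applied to $(0,1,0)$ and $(0,0,1)$.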
Best Answer
The projection formula you already have is the answer. Since $u$ is a unit vector, $\left(u,u\right)=1$. So \begin{align*} \text{Proj}_{U}v & =\left(v,u\right)u=\left(u^{T}v\right)u=\left(uu^{T}\right)v. \end{align*}
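The chain of equalities above can be verified directly for the rank-one case (a minimal sketch; the unit vector $u$ and test vector $v$ here are arbitrary choices for illustration):

```python
import numpy as np

# Hypothetical unit vector u; the line U is spanned by u, and (u, u) = 1.
u = np.array([3.0, 4.0]) / 5.0   # norm 1 by construction
P = np.outer(u, u)               # the rank-one matrix u u^T

v = np.array([2.0, 1.0])
# (v, u) u computed as a scalar times u should equal (u u^T) v:
print(np.allclose((v @ u) * u, P @ v))
```

The point of the rewriting is associativity: $(u^T v)u$ is a scalar times a vector, while $(uu^T)v$ is a matrix acting on a vector, and they agree, which exhibits $uu^T$ as the matrix of the projection.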