Hint: your final vectors are not correct. The point of GS is to get an orthogonal set of vectors. Are yours orthogonal? You are starting off with two non-orthogonal vectors, namely
$v_1=( 1 , 1 , 1)$ and $v_2= ( 1 , 2 ,1)$
The GS algorithm proceeds as follows:
let $w_1=(1,1,1)$
then we define $$w_2= v_2- \frac{\langle v_2 , w_1 \rangle}{\langle w_1 , w_1 \rangle} w_1$$
$$w_2=(1,2,1)-(4/3,4/3,4/3)=(-1/3,2/3,-1/3)$$
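To double-check this arithmetic, here is a quick sketch in Python using exact fractions, so there is no rounding:

```python
from fractions import Fraction

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

v1 = [Fraction(1), Fraction(1), Fraction(1)]
v2 = [Fraction(1), Fraction(2), Fraction(1)]

# Gram-Schmidt: w1 = v1, then subtract from v2 its component along w1
w1 = v1
c = dot(v2, w1) / dot(w1, w1)   # = 4/3
w2 = [x - c * y for x, y in zip(v2, w1)]

print(w2)           # [Fraction(-1, 3), Fraction(2, 3), Fraction(-1, 3)]
print(dot(w1, w2))  # 0, so w1 and w2 are indeed orthogonal
```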
and one can check that the set
$$S=\{w_1,w_2\}$$ is orthogonal and spans the same subspace as the original vectors $v_1,v_2$.
Normalizing the vectors in $S$ (note $\|w_1\|=\sqrt3$) gives $$S_n=\left\{\left(\frac{1}{\sqrt3},\frac{1}{\sqrt3},\frac{1}{\sqrt3}\right),\left(\frac{-1}{\sqrt6},\sqrt{\frac{2}{3}},\frac{-1}{\sqrt6}\right)\right\}$$
In general, to find the projection matrix $P$, first form the matrix $A$ whose columns are the vectors of $S_n$, that is $$A=\begin{bmatrix} \frac{1}{\sqrt3} & \frac{-1}{\sqrt6} \\ \frac{1}{\sqrt3} & \sqrt{\frac{2}{3}} \\ \frac{1}{\sqrt3} & \frac{-1}{\sqrt6} \\ \end{bmatrix}$$
Then the orthogonal projection matrix is
$$P=A(A^{T}A)^{-1}A^{T}.$$
Since the columns of $A$ are orthonormal, $A^{T}A=I$, so this simplifies to $P=AA^{T}$.
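As a sanity check, here is a sketch in Python that builds $P$ from the orthonormal columns (note $\|w_1\|=\sqrt3$, so the first column has entries $1/\sqrt3$) and verifies that $P$ is symmetric and idempotent, as any orthogonal projection must be:

```python
import math

# orthonormal columns: w1/||w1|| and w2/||w2||
u1 = [1 / math.sqrt(3)] * 3
u2 = [-1 / math.sqrt(6), 2 / math.sqrt(6), -1 / math.sqrt(6)]

# since A^T A = I for orthonormal columns, P = A A^T = u1 u1^T + u2 u2^T
P = [[u1[i] * u1[j] + u2[i] * u2[j] for j in range(3)] for i in range(3)]

# a projection matrix satisfies P^2 = P
P2 = [[sum(P[i][k] * P[k][j] for k in range(3)) for j in range(3)]
      for i in range(3)]
ok = all(abs(P2[i][j] - P[i][j]) < 1e-12 for i in range(3) for j in range(3))
print(ok)  # True
```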
Your approach is fine. If you don't wish to use matrices, here is an alternative.
Suppose $ V $ is finite dimensional over $ F $ and $ W $ is a subspace of $ V $ such that $ \dim_F(V) = \dim_F(W) $. Then $ V=W $. Perhaps the quickest way to see this is by quotients: $$ \dim_F(V/W) = \dim_F(V) - \dim_F(W) = 0, $$ meaning $ V/W = (\bar{0}) $, i.e. $ V = W $.
You have $ \dim_{\mathbb{R}}(\mathbb{R}^n) = n $. $ W $ is spanned by an orthogonal set $ S $ of $ n $ vectors. This set is linearly independent. This means $ S $ is a basis for $ W $, so $ \dim_{\mathbb{R}} (W) = n $. By the above, $ V = W $.
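A quick numerical illustration of the key step, that an orthogonal set of nonzero vectors is linearly independent (the three vectors below are just a hypothetical example in $\mathbb{R}^3$):

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# a hypothetical orthogonal set of 3 nonzero vectors in R^3
S = [[1, 1, 1], [-1, 2, -1], [1, 0, -1]]

# pairwise orthogonality: the Gram matrix <s_i, s_j> is diagonal
G = [[dot(a, b) for b in S] for a in S]
assert all(G[i][j] == 0 for i in range(3) for j in range(3) if i != j)

# the diagonal entries <s_i, s_i> are nonzero, so if sum c_i s_i = 0,
# taking the inner product with s_j forces c_j = 0: S is independent
assert all(G[i][i] != 0 for i in range(3))
print("S is orthogonal, hence linearly independent")
```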
Best Answer
There's a quick formula that you can use. Some explanations to guide you to it. We're working on a finite-dimensional vector space. You know that $\mathbb{R}^3=W\oplus W^\perp$. Since $(u_1,u_2)$ is an orthogonal basis of $W$, if you take $u_3$ any nonzero element in $W^\perp$, then $(u_1,u_2,u_3)$ is an orthogonal basis of $\mathbb R^3$. So there are $\alpha_1,\alpha_2,\alpha_3\in\mathbb R$ such that $$v=\alpha_1u_1+\alpha_2u_2+\alpha_3u_3\tag{1}$$
What is the orthogonal projection of $v$ onto $W$? Well, $v$ can be uniquely written as $v=x+y$ where $x\in W$ and $y\in W^\perp$, right? The orthogonal projection of $v$ onto $W$ is simply $x$, which is the vector $\alpha_1u_1+\alpha_2u_2$ by $(1)$. Two things to motivate this: firstly, as $v$ can be written in a unique manner as a sum of two orthogonal vectors, one of which is in $W$, it is natural to give this definition. Secondly, you can actually show that $x$ is the unique vector that satisfies $$\|v-x\|=\inf_{w\in W}\|v-w\|\tag{2}$$ so $(2)$ gets us back to the orthogonal projection we used in high school ;)
So, using $(1)$, the orthogonal projection of $v$ is $\alpha_1u_1+\alpha_2u_2$. Using inner products, can you find what are $\alpha_1$ and $\alpha_2$? Don't forget that $u_1,u_2$ and $u_3$ are orthogonal ;)
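Once you have worked out the formula, you can check it numerically. Here is a sketch in Python with made-up example vectors $u_1,u_2,v$ (not the ones from your problem); taking inner products of $(1)$ with $u_i$ kills the other terms by orthogonality:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# hypothetical orthogonal basis of W and a vector v to project
u1 = [1, 1, 1]
u2 = [-1, 2, -1]   # orthogonal to u1
v = [3, 1, 2]

# from (1): alpha_i = <v, u_i> / <u_i, u_i>
a1 = dot(v, u1) / dot(u1, u1)
a2 = dot(v, u2) / dot(u2, u2)
proj = [a1 * x + a2 * y for x, y in zip(u1, u2)]

# the residual v - proj should lie in W^perp, i.e. be orthogonal to u1, u2
r = [x - y for x, y in zip(v, proj)]
print(dot(r, u1), dot(r, u2))  # both 0 (up to rounding)
```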