The Gram-Schmidt process does not change the span. Since the span of the two eigenvectors associated to $\lambda=1$ is precisely the eigenspace corresponding to $\lambda=1$, if you apply Gram-Schmidt to those two vectors you will obtain a pair of vectors that are orthonormal, and that span the eigenspace; in particular, they will also be eigenvectors associated to $\lambda=1$.
You can also see that the eigenvector corresponding to $\lambda=10$ is orthogonal to the other two eigenvectors, hence to the entire eigenspace they span. So the third eigenvector will already be orthogonal to the orthonormal basis you find for $E_{1}$. You'll just need to normalize it.
Note. You can always find an orthonormal basis for each eigenspace by using Gram-Schmidt on an arbitrary basis for the eigenspace (or for any subspace, for that matter). In general (that is, for arbitrary matrices that are diagonalizable) this will not produce an orthonormal basis of eigenvectors for the entire space; but since your matrix is symmetric, the eigenspaces are mutually orthogonal so the process will definitely work. In fact, a real matrix is orthogonally diagonalizable if and only if it is symmetric.
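The claim that a real symmetric matrix always admits an orthonormal basis of eigenvectors can be checked numerically. A minimal sketch with NumPy, using a hypothetical symmetric matrix (the matrix from the question is not shown here):

```python
import numpy as np

# Hypothetical symmetric matrix, just to illustrate the general fact.
A = np.array([[2.0, 1.0, 1.0],
              [1.0, 2.0, 1.0],
              [1.0, 1.0, 2.0]])

# np.linalg.eigh is for symmetric/Hermitian input and returns an
# ORTHONORMAL matrix Q of eigenvectors along with the eigenvalues.
vals, Q = np.linalg.eigh(A)

# Q is orthogonal (Q^T Q = I) and Q^T A Q is diagonal,
# i.e. A is orthogonally diagonalizable.
print(np.allclose(Q.T @ Q, np.eye(3)))          # True
print(np.allclose(Q.T @ A @ Q, np.diag(vals)))  # True
```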
Hint: your final vectors are not correct. The point of Gram-Schmidt is to produce an orthogonal set of vectors. Are yours orthogonal? You are starting with two non-orthogonal vectors, namely
$v_1=( 1 , 1 , 1)$ and $v_2= ( 1 , 2 ,1)$
The Gram-Schmidt algorithm proceeds as follows:
let $w_1=(1,1,1)$
then we define $$w_2= v_2- \frac{\langle v_2 , w_1 \rangle}{\langle w_1 , w_1 \rangle} w_1$$
$$w_2=(1,2,1)-(4/3,4/3,4/3)=(-1/3,2/3,-1/3)$$
and one can check that the set
$$S=\{w_1,w_2\}$$ is orthogonal and spans the same subspace as the original vectors $v_1$ and $v_2$.
Normalizing $S$ gives $$S_n=\left\{\left(\tfrac{1}{\sqrt3},\tfrac{1}{\sqrt3},\tfrac{1}{\sqrt3}\right),\left(\tfrac{-1}{\sqrt6},\sqrt{\tfrac{2}{3}},\tfrac{-1}{\sqrt6}\right)\right\}.$$
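The two Gram-Schmidt steps above can be reproduced numerically; a short NumPy sketch using the same $v_1$ and $v_2$:

```python
import numpy as np

v1 = np.array([1.0, 1.0, 1.0])
v2 = np.array([1.0, 2.0, 1.0])

w1 = v1
# Subtract from v2 its projection onto w1: <v2,w1>/<w1,w1> * w1.
w2 = v2 - (v2 @ w1) / (w1 @ w1) * w1   # (-1/3, 2/3, -1/3)

# Normalize to get the orthonormal pair S_n.
u1 = w1 / np.linalg.norm(w1)   # (1/sqrt(3), 1/sqrt(3), 1/sqrt(3))
u2 = w2 / np.linalg.norm(w2)   # (-1/sqrt(6), 2/sqrt(6), -1/sqrt(6))

print(np.allclose(w2, [-1/3, 2/3, -1/3]))  # True
print(np.isclose(u1 @ u2, 0.0))            # True: orthogonal
```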
In general, to find the projection matrix $P$, you first form the matrix $A$ with your vectors from $S_n$ as columns, that is $$A=\begin{bmatrix} \frac{1}{\sqrt3} & \frac{-1}{\sqrt6} \\ \frac{1}{\sqrt3} & \sqrt{\frac{2}{3}} \\ \frac{1}{\sqrt3} & \frac{-1}{\sqrt6} \\ \end{bmatrix}$$
and then the orthogonal projection matrix onto the subspace is
$$P=A(A^{T}A)^{-1}A^{T}.$$
(Since the columns of $A$ are orthonormal here, $A^{T}A=I$ and this simplifies to $P=AA^{T}$.)
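A quick numerical check of the projection formula, using the orthonormal columns found by Gram-Schmidt (a sketch, assuming NumPy):

```python
import numpy as np

# Columns of A are the orthonormal vectors u1, u2 from S_n.
s3, s6 = np.sqrt(3), np.sqrt(6)
A = np.array([[1/s3, -1/s6],
              [1/s3,  2/s6],
              [1/s3, -1/s6]])

# General formula for the orthogonal projection onto col(A).
P = A @ np.linalg.inv(A.T @ A) @ A.T

# With orthonormal columns, A^T A = I, so P reduces to A A^T.
print(np.allclose(P, A @ A.T))  # True
# P is an orthogonal projection: idempotent and symmetric.
print(np.allclose(P @ P, P))    # True
print(np.allclose(P, P.T))      # True
```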
Take the set of vectors (say there are $k$ of them) and augment it with the standard basis. Apply the Gram-Schmidt process, going through the vectors of your original set first and then continuing with the standard basis vectors. The original set, together with the nonzero vectors produced after the first $k$ steps, forms a basis for the whole space. It is orthogonal by construction: Gram-Schmidt makes the first $k$ vectors orthogonal, and the remaining ones are orthogonal as well. If a vector becomes $0$ during the process, drop it and continue until you have $4$ vectors (the number of elements in a basis).
That is, you start with $\{v_1, v_2, e_1, e_2, e_3, e_4\}$. Applying Gram-Schmidt to this set, and discarding the zero vectors, will give you $\{v_1, v_2, a, b\}$.
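The augment-and-discard procedure can be sketched in a few lines of NumPy. The starting vectors $v_1, v_2$ below are hypothetical (the question's vectors are not given); the loop orthogonalizes each candidate against the basis built so far and drops anything that collapses to zero:

```python
import numpy as np

# Hypothetical starting vectors in R^4 (placeholders for the question's v1, v2).
v1 = np.array([1.0, 0.0, 1.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0, 1.0])

# Augment with the standard basis e1..e4, then run Gram-Schmidt,
# dropping any vector whose residual is (numerically) zero.
candidates = [v1, v2] + [np.eye(4)[i] for i in range(4)]
basis = []
for v in candidates:
    w = v - sum((v @ u) * u for u in basis)  # remove components along basis
    if np.linalg.norm(w) > 1e-10:            # keep only nonzero residuals
        basis.append(w / np.linalg.norm(w))

# Exactly 4 vectors survive: an orthonormal basis of R^4 whose
# first two span the same subspace as v1, v2.
print(len(basis))  # 4
```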