The Gram-Schmidt process does not change the span. Since the span of the two eigenvectors associated to $\lambda=1$ is precisely the eigenspace corresponding to $\lambda=1$, if you apply Gram-Schmidt to those two vectors you will obtain a pair of vectors that are orthonormal, and that span the eigenspace; in particular, they will also be eigenvectors associated to $\lambda=1$.
You can also see that the eigenvector corresponding to $\lambda=10$ is orthogonal to the other two eigenvectors, hence to the entire eigenspace they span. So the third eigenvector will already be orthogonal to the orthonormal basis you find for $E_{1}$. You'll just need to normalize it.
Note. You can always find an orthonormal basis for each eigenspace by using Gram-Schmidt on an arbitrary basis for the eigenspace (or for any subspace, for that matter). In general (that is, for arbitrary matrices that are diagonalizable) this will not produce an orthonormal basis of eigenvectors for the entire space; but since your matrix is symmetric, the eigenspaces are mutually orthogonal so the process will definitely work. In fact, a real matrix is orthogonally diagonalizable if and only if it is symmetric.
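As an illustrative sketch of that last fact (the matrix `A` below is an arbitrary symmetric example chosen by us, not the matrix from the question), numpy's `eigh` routine for symmetric matrices returns exactly such an orthonormal basis of eigenvectors:

```python
import numpy as np

# An arbitrary symmetric matrix for illustration (not the one from the question).
A = np.array([[2., 1., 0.],
              [1., 2., 0.],
              [0., 0., 3.]])

# eigh is specialized to symmetric matrices: it returns the eigenvalues
# and an orthonormal basis of eigenvectors in the columns of Q.
vals, Q = np.linalg.eigh(A)

print(np.allclose(Q.T @ Q, np.eye(3)))          # Q is orthogonal
print(np.allclose(Q.T @ A @ Q, np.diag(vals)))  # Q^T A Q is diagonal
```

Both checks print `True`, confirming that a symmetric matrix is orthogonally diagonalizable.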
Your vectors are neither orthogonal ($x_2 \cdot x_3 = -1.5$) nor normalized.
The note to your question points to the freedom, given a normal vector $n$, of choosing either $n$ or $-n$ as the normal.
How to orthonormalize:
We can start with $x_1$ and keep its direction as the first vector of the new basis, $x_1' = x_1$.
We want to end up with unit vectors as well, so it is convenient to normalize right away.
The second vector is the second given vector with its component in the direction of the first new basis vector removed:
\begin{align}
x_1''
&= \frac{x_1'}{\sqrt{x_1' \cdot x_1'}} = \frac{x_1}{\sqrt{x_1 \cdot x_1}} \\
x_2' &= x_2 - (x_1'' \cdot x_2) x_1'' \\
x_2'' &= \frac{x_2'}{\sqrt{x_2' \cdot x_2'}} \\
x_3' &= x_3 - (x_1'' \cdot x_3) x_1''- (x_2'' \cdot x_3) x_2'' \\
x_3'' &= \frac{x_3'}{\sqrt{x_3' \cdot x_3'}}
\end{align}
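The steps above can be sketched as a small numpy routine (classical Gram-Schmidt; the name `gram_schmidt` and the implementation details are ours, not from the text):

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: normalize the first vector, then strip each
    later vector of its components along the already-built orthonormal
    vectors and normalize the remainder."""
    basis = []
    for v in vectors:
        w = np.asarray(v, dtype=float).copy()
        for b in basis:
            w = w - np.dot(b, v) * b  # remove the part in direction of b
        basis.append(w / np.linalg.norm(w))
    return basis
```

Stacking the returned vectors into a matrix $Q$, orthonormality is equivalent to $Q Q^T = I$, which gives a convenient numerical check.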
Thus, here
\begin{align}
x_1'' &= (0, 1/\sqrt{2}, 1/\sqrt{2}) \\
x_2' &= (3, -3/2, 3/2) \\
x_2'' &= \sqrt{\frac{2}{27}}(3,-3/2, 3/2) \\
x_3' &= (2/3, 2/3, -2/3) \\
x_3'' &= \frac{\sqrt{3}}{2} (2/3, 2/3,-2/3) = (1/\sqrt{3},1/\sqrt{3},-1/\sqrt{3})
\end{align}
For the third vector we removed its components in the directions of the new first and second basis vectors.
Only $x_3''$ has a negative third component, so we choose
$$
x_3''' = - x_3'' = (-1/\sqrt{3},-1/\sqrt{3},1/\sqrt{3})
$$
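As a quick numerical sanity check (a sketch; the vectors below are copied from the computation above), one can verify that $x_1''$, $x_2''$ and $x_3'''$ indeed form an orthonormal set:

```python
import numpy as np

# The three vectors computed above
x1 = np.array([0, 1, 1]) / np.sqrt(2)             # x_1''
x2 = np.sqrt(2 / 27) * np.array([3, -3/2, 3/2])   # x_2''
x3 = np.array([-1, -1, 1]) / np.sqrt(3)           # x_3'''

Q = np.array([x1, x2, x3])
# For an orthonormal set, the Gram matrix Q Q^T is the identity.
print(np.allclose(Q @ Q.T, np.eye(3)))  # True
```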
Best Answer
Let $u_1=\begin{bmatrix}1\\0\\-1\\\end{bmatrix} ,u_2=\begin{bmatrix}2\\-1\\0\\\end{bmatrix} ,u_3=\begin{bmatrix}1\\2\\1\\\end{bmatrix}$. To find the required orthonormal basis $\{w_1,w_2,w_3\}$, first we have $$w_1=\frac{u_1}{\|u_1\|}=\begin{bmatrix}\frac{1}{\sqrt{2}}\\0\\-\frac{1}{\sqrt{2}}\\\end{bmatrix}.$$
Second, find $u_2-(w_1\cdot u_2)w_1$ as follows: $$u_2-(w_1\cdot u_2)w_1=\begin{bmatrix}2\\-1\\0\\\end{bmatrix}-\sqrt{2}\begin{bmatrix}\frac{1}{\sqrt{2}}\\0\\-\frac{1}{\sqrt{2}}\\\end{bmatrix}=\begin{bmatrix}1\\-1\\1\\\end{bmatrix}.$$ By taking the dot product, you can see that $w_1$ is orthogonal to the above vector: $$w_1\cdot[u_2-(w_1\cdot u_2)w_1]=w_1\cdot u_2-(w_1\cdot u_2)w_1\cdot w_1=0$$ since $w_1$ is a unit vector. So we can take $$w_2=\frac{u_2-(w_1\cdot u_2)w_1}{\|u_2-(w_1\cdot u_2)w_1\|}=\begin{bmatrix}\frac{1}{\sqrt3}\\-\frac{1}{\sqrt3}\\\frac{1}{\sqrt3}\\\end{bmatrix}.$$
Finally, find $u_3-(w_1\cdot u_3)w_1-(w_2\cdot u_3)w_2$ as follows: $$u_3-(w_1\cdot u_3)w_1-(w_2\cdot u_3)w_2=\begin{bmatrix}1\\2\\1\\\end{bmatrix}-0\cdot\begin{bmatrix}\frac{1}{\sqrt{2}}\\0\\-\frac{1}{\sqrt{2}}\\\end{bmatrix}-0\cdot\begin{bmatrix}\frac{1}{\sqrt3}\\-\frac{1}{\sqrt3}\\\frac{1}{\sqrt3}\\\end{bmatrix}=\begin{bmatrix}1\\2\\1\\\end{bmatrix}.$$ By taking the dot product, you can again see that $w_1$ and $w_2$ are both orthogonal to the above vector. So we can take $$w_3=\frac{u_3-(w_1\cdot u_3)w_1-(w_2\cdot u_3)w_2}{\|u_3-(w_1\cdot u_3)w_1-(w_2\cdot u_3)w_2\|}=\begin{bmatrix}\frac{1}{\sqrt6}\\\frac{2}{\sqrt6}\\\frac{1}{\sqrt6}\\\end{bmatrix}.$$
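These three steps can be replayed numerically (a sketch using numpy; the intermediate names `v2` and `v3` are ours):

```python
import numpy as np

u1 = np.array([1., 0., -1.])
u2 = np.array([2., -1., 0.])
u3 = np.array([1., 2., 1.])

# Step 1: normalize u1.
w1 = u1 / np.linalg.norm(u1)
# Step 2: remove the w1-component from u2, then normalize.
v2 = u2 - np.dot(w1, u2) * w1
w2 = v2 / np.linalg.norm(v2)
# Step 3: remove the w1- and w2-components from u3, then normalize.
v3 = u3 - np.dot(w1, u3) * w1 - np.dot(w2, u3) * w2
w3 = v3 / np.linalg.norm(v3)

print(np.allclose(w2, np.array([1, -1, 1]) / np.sqrt(3)))  # True
print(np.allclose(w3, np.array([1, 2, 1]) / np.sqrt(6)))   # True
```

Both projections in step 3 vanish because $u_3$ happens to be orthogonal to $w_1$ and $w_2$ already, so $w_3$ is just $u_3$ normalized.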