The Gram-Schmidt process does not change the span. Since the span of the two eigenvectors associated to $\lambda=1$ is precisely the eigenspace corresponding to $\lambda=1$, if you apply Gram-Schmidt to those two vectors you will obtain a pair of vectors that are orthonormal, and that span the eigenspace; in particular, they will also be eigenvectors associated to $\lambda=1$.
You can also see that the eigenvector corresponding to $\lambda=10$ is orthogonal to the other two eigenvectors, hence to the entire eigenspace they span. So the third eigenvector will already be orthogonal to the orthonormal basis you find for $E_{1}$. You'll just need to normalize it.
Note. You can always find an orthonormal basis for each eigenspace by using Gram-Schmidt on an arbitrary basis for the eigenspace (or for any subspace, for that matter). In general (that is, for arbitrary matrices that are diagonalizable) this will not produce an orthonormal basis of eigenvectors for the entire space; but since your matrix is symmetric, the eigenspaces are mutually orthogonal so the process will definitely work. In fact, a real matrix is orthogonally diagonalizable if and only if it is symmetric.
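To see this numerically, here is a small NumPy sketch. The original matrix isn't shown, so the matrix below is a made-up symmetric example, $A = I + 9ww^T$ with unit $w=(1,2,2)/3$, which has eigenvalue $1$ with multiplicity $2$ and eigenvalue $10$ with eigenvector $w$:

```python
import numpy as np

# Hypothetical symmetric matrix: A = I + 9*w*w^T with unit w = (1,2,2)/3,
# so the eigenvalues are 1 (multiplicity 2) and 10 (eigenvector w).
w = np.array([1.0, 2.0, 2.0]) / 3.0
A = np.eye(3) + 9.0 * np.outer(w, w)

# An arbitrary (non-orthogonal) basis of the eigenspace E_1 = w^perp.
v1 = np.array([2.0, -1.0, 0.0])
v2 = np.array([2.0, 0.0, -1.0])

# Gram-Schmidt inside E_1: the span is unchanged, so the results
# remain eigenvectors for lambda = 1.
u1 = v1 / np.linalg.norm(v1)
u2 = v2 - (u1 @ v2) * u1
u2 /= np.linalg.norm(u2)

assert np.allclose(A @ u1, u1)   # still an eigenvector for lambda = 1
assert np.allclose(A @ u2, u2)
assert np.isclose(u1 @ u2, 0.0)  # orthonormal pair

# The lambda = 10 eigenvector w is automatically orthogonal to E_1,
# hence to u1 and u2; it only needs normalizing (w is already unit here).
assert np.isclose(w @ u1, 0.0) and np.isclose(w @ u2, 0.0)
```

The same check works for any symmetric matrix: eigenspaces for distinct eigenvalues are orthogonal, so orthonormalizing each eigenspace separately yields an orthonormal eigenbasis of the whole space.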
Your basic idea is right. However, you can easily verify that the vectors $u_1$ and $u_2$ you found are not orthogonal by calculating
$$\langle u_1,u_2\rangle = (0,0,2,2)\cdot \left( \begin{matrix} 2 \\ 0 \\ -6 \\ -8 \end{matrix} \right) = -12-16 = -28 \neq 0$$
So something is going wrong in your process.
I suppose you want to use the Gram-Schmidt algorithm to find the orthogonal basis. I think you skipped the normalization part of the algorithm because you only want an orthogonal basis, not an orthonormal one. However, even if you don't want an orthonormal basis, you still have to take care with the normalization in your projections. If you only subtract $u_i\langle u_i,v_j\rangle$ it will go wrong; instead you need to divide by the squared length and subtract $u_i\frac{\langle u_i,v_j\rangle}{\langle u_i,u_i\rangle}$. If you do the normalization step of the Gram-Schmidt algorithm, then $\langle u_i,u_i\rangle=1$, so the division is usually left out. The Wikipedia article should clear it up quite well.
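The failure is easy to reproduce numerically. A short NumPy sketch with the vectors from the calculation above: skipping the division by $\langle u,u\rangle$ reproduces the non-orthogonal $(2,0,-6,-8)$, while including it gives a vector orthogonal to $u$.

```python
import numpy as np

# Omitting the division by <u,u> only works when u is a unit vector.
# Here u = (0,0,2,2) is not normalized.
u = np.array([0.0, 0.0, 2.0, 2.0])
v = np.array([2.0, 0.0, 2.0, 0.0])

wrong = v - (u @ v) * u              # skips the division: gives (2, 0, -6, -8)
right = v - ((u @ v) / (u @ u)) * u  # correct projection subtracted

print(u @ wrong)  # -28.0, not orthogonal (the -28 computed above)
print(u @ right)  # 0.0, orthogonal as intended
```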
Update
Ok, you say that $v_1 = \left( \begin{matrix} 0 \\ 0 \\ 2 \\ 2 \end{matrix} \right), v_2 = \left( \begin{matrix} 2 \\ 0 \\ 2 \\ 0 \end{matrix} \right), v_3 = \left( \begin{matrix} 3 \\ 2 \\ -5 \\ -6 \end{matrix} \right)$ is the basis you start from.
As you did, you can take the first vector $v_1$ as it is, so your first basis vector is $u_1 = v_1$. Now you want to calculate a vector $u_2$ that is orthogonal to this $u_1$. Gram-Schmidt tells you that you obtain such a vector by
$$u_2 = v_2 - \text{proj}_{u_1}(v_2)$$
And then a third vector $u_3$ orthogonal to both of them by
$$u_3 = v_3 - \text{proj}_{u_1}(v_3) - \text{proj}_{u_2}(v_3)$$
You did follow this approach. What went wrong is your projection. You calculated it as
$$ \text{proj}_{u_1}(v_2) = u_1\langle u_1,v_2\rangle$$
but this is incorrect. The true projection is
$$ \text{proj}_{u_1}(v_2) = u_1\frac{\langle u_1,v_2\rangle}{\langle u_1,u_1\rangle}$$
As I tried to point out, some textbooks skip the division by $\langle u_1,u_1\rangle$ in their presentation of Gram-Schmidt, but that is because in most cases you want to construct an orthonormal basis. In that case you normalize every $u_i$ before proceeding to the next step, so $\langle u_i,u_i\rangle = 1$ and the division can be dropped.
So what you need to change is to divide by $\langle u_1,u_1\rangle = 8$ in your projection.
Best Answer
Sorry, I got mixed up earlier with the statement of the real spectral theorem. It tells you that there exists an orthonormal basis of $\Bbb{R}^3$ consisting of eigenvectors of your matrix $A$, with all eigenvalues real. So indeed, finding vectors in the eigenspace that are orthogonal to one another is not immediate from the outset.