The Gram-Schmidt process is for orthogonalization: given a set of linearly independent vectors, it constructs an orthonormal basis of their span.
If the vector $V$ you chose had been linearly dependent on the original three vectors, then subtracting from $V$ its projections onto the (already orthogonalized) original vectors would have produced $0$.
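To illustrate that failure mode concretely, here is a minimal Gram-Schmidt sketch over exact rationals (the function name and the choice of dependent test vector are mine), using the first two vectors from the question plus their sum:

```python
from fractions import Fraction

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(vectors):
    """Subtract from each vector its projections onto the previously
    orthogonalized ones; a dependent input comes out as the zero vector."""
    basis = []
    for v in vectors:
        w = [Fraction(x) for x in v]
        for u in basis:
            coeff = Fraction(dot(w, u), dot(u, u))
            w = [wi - coeff * ui for wi, ui in zip(w, u)]
        basis.append(w)
    return basis

v1, v2 = (1, 2, 0, 2), (1, 1, 1, 0)
dep = tuple(a + b for a, b in zip(v1, v2))  # v1 + v2, dependent on the first two
u1, u2, u3 = gram_schmidt([v1, v2, dep])
print(u3)  # all four entries are exactly 0
```

Because the arithmetic is exact (no floating point), the dependent vector collapses to exactly zero rather than to numerical noise.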
The easiest way to find a fourth vector is to try the standard basis vectors $e_i$ one at a time. Since the original set is linearly independent and its size is less than the dimension of the space, at least one of the $e_i$ is linearly independent of the original set.
You already found that $e_1$ is linearly independent of the original set, so you may use it as an answer.
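This trial-and-check can be done mechanically: a candidate $e_i$ extends the set to a basis exactly when the $4\times4$ determinant with the candidate as a fourth row is nonzero. A small sketch (the `det` helper is mine):

```python
def det(m):
    # Laplace expansion along the first row (fine for a small integer matrix)
    if len(m) == 1:
        return m[0][0]
    total = 0
    for j in range(len(m)):
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        total += (-1) ** j * m[0][j] * det(minor)
    return total

vs = [(1, 2, 0, 2), (1, 1, 1, 0), (2, 0, 1, 3)]  # the three given vectors
results = {}
for i in range(4):
    e = [1 if j == i else 0 for j in range(4)]
    results[i] = det([list(v) for v in vs] + [e])
print(results)  # every value is nonzero, so each e_i works for this problem
```

For these particular vectors all four determinants turn out nonzero, so any of the $e_i$ would do; in general you may need to try more than one.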
First off, your method does and does not work, depending on what you mean. What if the vector (0, 0, 0, 1) is already in the span of the 3 vectors you started with? Then row-reduction will lead to a zero column. Essentially, you guessed that (0, 0, 0, 1) was not already in the span, and then checked whether you were right. The method works exactly when your guess is not already in the span of the vectors you started with.

As lhf points out in the comments to the question, this will work a lot of the time. In fact, it would work for this problem with any of (1, 0, 0, 0), (0, 1, 0, 0), (0, 0, 1, 0), or (0, 0, 0, 1) (checked with Sage). But it won't always work.
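The span check itself is easy to mechanize: a vector $w$ lies in the span of the original vectors exactly when appending $w$ does not increase the rank. A sketch over exact rationals (the `rank` helper is mine):

```python
from fractions import Fraction

def rank(rows):
    """Rank via Gaussian elimination over the rationals."""
    m = [[Fraction(x) for x in r] for r in rows]
    r = 0
    for c in range(len(m[0])):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

vs = [(1, 2, 0, 2), (1, 1, 1, 0), (2, 0, 1, 3)]
w = (0, 0, 0, 1)
in_span = rank(vs + [w]) == rank(vs)
print(in_span)  # False: the guess was right, so w completes a basis
```

When the guess fails (`in_span` is `True`), you simply try another candidate.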
It is well known that the cross product of two vectors in 3 dimensions gives a new vector orthogonal to both of the starting vectors. Therefore, if the first two vectors are linearly independent, and the new third one is orthogonal to both of them, then the set of 3 vectors is definitely linearly independent. By the way, if the first two are not linearly independent, then the cross product is the zero vector.
There is a generalization of the cross product that applies here: in $n$ dimensions it takes $n-1$ vectors and produces a vector orthogonal to all of them. So, to answer your question, one way would be to compute the determinant:
$$\begin{vmatrix} i & j & k & l \\
1 & 2 & 0 & 2 \\
1 & 1 & 1 & 0 \\
2 & 0 & 1 & 3\end{vmatrix}$$
Assuming your first three vectors are linearly independent, the result will be a 4-dimensional vector (in the coordinates $i, j, k, l$) that is orthogonal to the 3 starting vectors.
I got $8i - j - 7k - 3l$; you can check directly that this vector is orthogonal to each of the three original vectors. And, I checked in Sage that the 4 vectors are linearly independent.
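The cofactor arithmetic is easy to double-check by script. Here is a sketch (function names are mine) that expands along the symbolic first row and verifies orthogonality:

```python
def det3(m):
    # Determinant of a 3x3 matrix by direct expansion
    (a, b, c), (d, e, f), (g, h, i) = m
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def cross4(v1, v2, v3):
    """Generalized cross product in R^4: signed cofactors of the
    symbolic first row (i, j, k, l) of the 4x4 determinant."""
    rows = [list(v1), list(v2), list(v3)]
    out = []
    for j in range(4):
        minor = [r[:j] + r[j + 1:] for r in rows]
        out.append((-1) ** j * det3(minor))
    return out

v1, v2, v3 = (1, 2, 0, 2), (1, 1, 1, 0), (2, 0, 1, 3)
n = cross4(v1, v2, v3)
print(n)  # [8, -1, -7, -3]
for v in (v1, v2, v3):
    assert sum(a * b for a, b in zip(n, v)) == 0  # orthogonal to all three
```

The orthogonality assertions are the same check as the dot products one would do by hand.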
Reference: I learned about this when I took a 4th semester of calculus in college, where we used Vector Calculus by Susan Jane Colley. It is introduced in the exercises for Section 1.6. One of the exercises is to prove that the new vector is orthogonal to the previous ones.
Best Answer
Your reduced matrix is not yet in echelon form; you need one more stage of row reduction, after which you'll see that the fourth row consists of all zeroes. Each of the three columns of the row-reduced form then contains a pivot, so those columns, and thus the three columns of the original matrix, are linearly independent.
To find the one vector $(a,b,c,d)$ that you need to complete a basis (the space has dimension four), you could start with the matrix $$\tag{1} \left[\matrix {1&1&2&a\cr2&1&0&b\cr 0&1&1&c\cr 2&0&3&d}\right]. $$ Row reduce this and in the end select $a,b,c,d$ so that the reduced form has independent columns (four pivots).
I did the row-reduction suggested above and obtained $$\tag{2} \left[\matrix {1&1&2&a\cr0&-1&-4&b-2a\cr 0&0&-3&b+c-2a\cr 0&0&0&\color{maroon}{-8a+b+7c+3d}}\right] $$ We want $\color{maroon}{-8a+b+7c+3d}\ne 0$, and taking $a=1$ and $b=c=d=0$ in $(2)$ gives us a matrix with independent columns (four pivots). So we can take $(1,0,0,0)$ to be the fourth vector needed to complete a basis.
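As a sanity check on the pivot expression: the determinant of matrix $(1)$ is a linear function of $a,b,c,d$, and $(a,b,c,d)$ completes a basis exactly when that determinant is nonzero. A sketch (helper names are mine) that reads off the coefficients by evaluating at the standard basis vectors:

```python
def det(m):
    # Laplace expansion along the first row
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j]
               * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(len(m)))

cols = [(1, 2, 0, 2), (1, 1, 1, 0), (2, 0, 1, 3)]  # the three given vectors

def full_det(abcd):
    # matrix (1): columns are v1, v2, v3, (a, b, c, d)
    rows = [[cols[0][i], cols[1][i], cols[2][i], abcd[i]] for i in range(4)]
    return det(rows)

# The determinant is linear in (a, b, c, d); read off its coefficients.
coeffs = [full_det([1 if j == i else 0 for j in range(4)]) for i in range(4)]
print(coeffs)                  # [-8, 1, 7, 3] -> pivot expression -8a + b + 7c + 3d
print(full_det([1, 0, 0, 0]))  # -8, nonzero, so (1, 0, 0, 0) completes a basis
```

Any scalar multiple of this linear form gives the same vanishing condition, which is why different row-reduction paths can produce differently scaled last rows.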