Notice that $v_{1} = v_{3} - v_{2}$, so the vectors are linearly dependent. Then $\{v_{1}, v_{4}\}$ is a basis, since $v_{2}$ and $v_{3}$ can be formed from linear combinations of $v_{1}$ and $v_{4}$.
Edit: I'll add a bit more on bases. A basis is a maximal linearly independent set of vectors that spans the space. In $\mathbb{R}^{3}$, we would have three basis vectors. For subspaces of a vector space, fewer vectors may suffice. A subspace is a vector space in its own right, so the number of basis vectors is the dimension of the subspace.
Now in general vector spaces, we can use the determinant test to see whether a set of vectors spans. The determinant test is nice because, for a square matrix $M$, $\det(M) = 0$ if and only if the columns of $M$ are linearly dependent. So if $\det(M) \neq 0$ and the number of vectors equals $\dim(V)$, for $V$ your vector space, then the columns of $M$ form a basis.
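As a quick numerical sketch of the determinant test (the three vectors below are illustrative, not from the question):

```python
import numpy as np

# Illustrative example: three vectors in R^3, stacked as the columns of M.
M = np.column_stack([
    [1.0, 0.0, 1.0],
    [0.0, 1.0, 1.0],
    [1.0, 1.0, 0.0],
])

d = np.linalg.det(M)
print(d)  # nonzero, so the columns are independent and form a basis of R^3
```

If the determinant came out (numerically close to) zero, the columns would be linearly dependent and could not form a basis.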
In a subspace, you generally have fewer than $\dim(V)$ vectors. I say generally, as a vector space is trivially a subspace of itself. Given a set of vectors, you have two options: row-reduce the matrix they form to count the independent vectors, or solve the corresponding systems of linear equations by hand. If you are choosing your own basis without such constraints, selecting a subset of the standard basis is the way to go.
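The row-reduction route can be sketched with SymPy's `rref`; the third vector below is deliberately chosen as the sum of the first two, purely for illustration:

```python
import sympy as sp

# Illustrative vectors as rows; rref reveals how many are independent.
A = sp.Matrix([
    [1, 0, 1, 1],
    [0, 1, 1, -1],
    [1, 1, 2, 0],   # = row 1 + row 2, so it is dependent
])

rref_form, pivots = A.rref()
print(len(pivots))  # number of pivot columns = number of independent vectors
```

The pivot count is the rank, i.e. the dimension of the subspace the rows span.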
Your work is correct but incomplete. You should show that those two vectors, $\langle 1, 0, 1, 1 \rangle$ and $\langle 0, 1, 1, -1 \rangle$, are linearly independent, since so far you have only shown that they span $W$. This is not too hard, as they have zeroes in different positions.
Indeed, let $a\langle 1, 0, 1, 1 \rangle + b\langle 0, 1, 1, -1 \rangle = 0$. Then
$$0 = \langle a, 0, a, a \rangle + \langle 0, b, b, -b \rangle = \langle a, b, a + b, a - b \rangle,$$
so $a = b = 0$ and they are linearly independent, hence $\dim W = 2$. Also, I'd like to point out that saying "the basis" is bad usage. There are many, many bases for $W$; for example, scale both vectors by $2$. It would be more proper to say "a basis".
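The independence argument above can be double-checked numerically: stacking the two vectors and computing the rank should give $2$.

```python
import numpy as np

v1 = np.array([1, 0, 1, 1])
v2 = np.array([0, 1, 1, -1])

# Rank 2 means the two rows are linearly independent, so dim W = 2.
rank = np.linalg.matrix_rank(np.vstack([v1, v2]))
print(rank)
```

This mirrors the pencil-and-paper argument: rank $2$ is exactly the statement that $a v_1 + b v_2 = 0$ forces $a = b = 0$.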
Since the third vector is linearly dependent on the other two, discard it and apply Gram–Schmidt only to $x_1$ and $x_2$.
If you don't want to calculate that much, you can use the fact that your subspace is also spanned by $$\begin{pmatrix} 0 \\ 1 \\ 0\\ \end{pmatrix} \qquad \begin{pmatrix} 1 \\ 0 \\ 1\\ \end{pmatrix}$$ These vectors are already orthogonal (with respect to the standard scalar product). Just normalise them and you will be fine.
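A short check of this shortcut: verify the two spanning vectors are orthogonal, then normalise each to obtain an orthonormal basis of the subspace.

```python
import numpy as np

u1 = np.array([0.0, 1.0, 0.0])
u2 = np.array([1.0, 0.0, 1.0])

# Orthogonal under the standard dot product, so Gram-Schmidt reduces
# to plain normalisation.
assert np.dot(u1, u2) == 0

e1 = u1 / np.linalg.norm(u1)
e2 = u2 / np.linalg.norm(u2)
print(e1, e2)  # e2 scales u2 by 1/sqrt(2)
```

Both resulting vectors have unit length and remain orthogonal, which is all an orthonormal basis requires.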