All you have to do is row reduce! Put the vectors into the rows of one matrix and eliminate as many rows as you can.
A basis can never contain the zero vector, and no vector in a basis can be a linear combination of the others. In your example, $X_4 = X_3 + X_2$, so you can leave out $X_4$. That leaves you with your basis: $X_2, X_3$.
To see how row reduction handles this more systematically, write the vectors as the rows of a matrix:
$\left[ \begin{array}{cccc} 0 & 0 & 0 & 0 \\ 0 & 0 & 1 & 1 \\ 1 & 1 & 0 & 0 \\ 1 & 1 & 1 & 1 \end{array} \right]$
Move the zero row to the bottom for convenience.
$\left[ \begin{array}{cccc} 0 & 0 & 1 & 1 \\ 1 & 1 & 0 & 0 \\ 1 & 1 & 1 & 1 \\ 0 & 0 & 0 & 0 \end{array} \right]$
Now subtract the sum of the first and second rows from the third row.
$\left[ \begin{array}{cccc} 0 & 0 & 1 & 1 \\ 1 & 1 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{array} \right]$
We cannot row reduce any further. The basis is given by the rows that are not all zeros: the first row, which corresponds to $X_2$, and the second row, which corresponds to $X_3$. Remember that for the basis you should use the original vectors, so the original $X_2$ and $X_3$ (which here happen to coincide with the reduced rows).
So your basis is $X_2, X_3 = \begin{bmatrix} 0 \\ 0 \\ 1 \\ 1\end{bmatrix},\begin{bmatrix} 1 \\ 1 \\ 0 \\ 0 \end{bmatrix}$
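The procedure above can be sketched in plain Python: row reduce the matrix whose rows are $X_1,\dots,X_4$ and keep the rows that are not all zeros. (The `row_reduce` helper is an illustrative implementation, not a library function.)

```python
# Sketch: Gaussian elimination on the rows X_1..X_4, keeping nonzero rows.
from fractions import Fraction

def row_reduce(rows):
    """Gaussian elimination; returns the reduced rows (zero rows sink)."""
    rows = [[Fraction(x) for x in r] for r in rows]
    pivot_row = 0
    for col in range(len(rows[0])):
        # find a row at or below pivot_row with a nonzero entry in this column
        for r in range(pivot_row, len(rows)):
            if rows[r][col] != 0:
                rows[pivot_row], rows[r] = rows[r], rows[pivot_row]
                break
        else:
            continue  # no pivot in this column
        # eliminate this column from every other row
        p = rows[pivot_row][col]
        for r in range(len(rows)):
            if r != pivot_row and rows[r][col] != 0:
                f = rows[r][col] / p
                rows[r] = [a - f * b for a, b in zip(rows[r], rows[pivot_row])]
        pivot_row += 1
    return rows

X = [[0, 0, 0, 0],   # X_1: the zero vector
     [0, 0, 1, 1],   # X_2
     [1, 1, 0, 0],   # X_3
     [1, 1, 1, 1]]   # X_4 = X_2 + X_3
reduced = row_reduce(X)
basis_rows = [r for r in reduced if any(r)]
print(len(basis_rows))  # 2: the row space is two-dimensional
```

Two nonzero rows survive, matching the hand computation: the basis has two vectors.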
You need $a_1u_1$ or else you won't get the right first coefficient. The only way to get $a_2$ in the second coefficient is by taking a multiple of $u_2$, but if you take $a_2u_2$ you'll get $a_1+a_2$ in the second coefficient, which is too much. Hence, you must take $(a_2-a_1)u_2$, which gives you exactly $a_2$ in the second coefficient when added to the $a_1$ already contributed by $a_1u_1$.
Continuing in this way you will get $$a_1u_1+(a_2-a_1)u_2+(a_3-a_2)u_3+(a_4-a_3)u_4$$
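A quick numeric sanity check of the telescoping combination. The reasoning above suggests (this is an assumption about the original problem) that $u_i$ is the vector with $1$s from position $i$ onward, i.e. $u_1=(1,1,1,1)$, $u_2=(0,1,1,1)$, $u_3=(0,0,1,1)$, $u_4=(0,0,0,1)$:

```python
# Assumed basis: u_i has 0s in the first i-1 positions and 1s afterward.
n = 4
u = [[0] * i + [1] * (n - i) for i in range(n)]  # u[0] = u_1, etc.

a = [3, -1, 7, 2]  # arbitrary target coefficients a_1..a_4
# telescoping coefficients: a_1, a_2-a_1, a_3-a_2, a_4-a_3
c = [a[0]] + [a[i] - a[i - 1] for i in range(1, n)]

combo = [sum(c[i] * u[i][j] for i in range(n)) for j in range(n)]
print(combo)  # [3, -1, 7, 2] -- the combination recovers the a_i exactly
```

The sums telescope componentwise: in position $j$, the contributions are $a_1 + (a_2-a_1) + \dots + (a_j - a_{j-1}) = a_j$.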
Hint: If the linear combination is not unique (so there exist two sets of coefficients $c_1,c_2,c_3$ and $d_1,d_2,d_3$), show that this implies the vectors in $S$ are linearly dependent. Then show that this is not the case.
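A sketch of the subtraction step the hint points at (taking $S=\{v_1,v_2,v_3\}$): if
$$c_1v_1+c_2v_2+c_3v_3 = d_1v_1+d_2v_2+d_3v_3,$$
then subtracting one side from the other gives
$$(c_1-d_1)v_1+(c_2-d_2)v_2+(c_3-d_3)v_3 = 0,$$
a nontrivial dependence relation whenever some $c_i \neq d_i$. So if the vectors in $S$ are linearly independent, every coefficient difference must vanish, forcing $c_i = d_i$ for each $i$.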