Your solution works. Here's a way to approach the question systematically:
Suppose you have two linearly independent vectors $v_1,v_2$. We know that if we include the vectors $e_1,e_2,e_3,e_4$ (that is, the standard basis vectors), then the set $\{v_1,v_2,e_1,e_2,e_3,e_4\}$ spans $\mathbb R^4$. We would like to extract a basis including $v_1$ and $v_2$ from this spanning set. In order to do so, put the matrix
$$
\begin{bmatrix}
| &| &| &| &| &| \\
v_1&v_2&e_1&e_2&e_3&e_4\\
| &| &| &| &| &|
\end{bmatrix}
$$
into reduced row echelon form.
For your example, this matrix is
$$
\left[
\begin{array}{cccccc}
0 & 1 & 1 & 0 & 0 & 0 \\
0 & 1 & 0 & 1 & 0 & 0 \\
1 & 1 & 0 & 0 & 1 & 0 \\
1 & 1 & 0 & 0 & 0 & 1 \\
\end{array}
\right]
$$
The resulting rref matrix is
$$
\left[
\begin{array}{rrrrrr}
1 & 0 & 0 & -1 & 0 & 1 \\
0 & 1 & 0 & 1 & 0 & 0 \\
0 & 0 & 1 & -1 & 0 & 0 \\
0 & 0 & 0 & 0 & 1 & -1 \\
\end{array}
\right]
$$
Now, we select only the vectors whose columns contain pivots in the rref. Here the pivots lie in columns $1,2,3,5$, so we select $\{v_1,v_2,e_1,e_3\}$. We now have a basis of $\mathbb R^4$ including $v_1$ and $v_2$.
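If you want to check this extraction computationally, the pivot-column selection can be sketched in a few lines of Python (this assumes the `sympy` library is available; it is not part of the original argument):

```python
from sympy import Matrix

# Columns in order: v1, v2, e1, e2, e3, e4,
# with v1 = (0,0,1,1) and v2 = (1,1,1,1)
A = Matrix([
    [0, 1, 1, 0, 0, 0],
    [0, 1, 0, 1, 0, 0],
    [1, 1, 0, 0, 1, 0],
    [1, 1, 0, 0, 0, 1],
])

# rref() returns the reduced row echelon form and the pivot column indices
rref_form, pivot_cols = A.rref()
print(pivot_cols)  # (0, 1, 2, 4) -> columns v1, v2, e1, e3
```

The pivot indices $(0,1,2,4)$ are exactly the columns $v_1,v_2,e_1,e_3$ selected above.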
Note that the basis you obtain depends on the order in which you place the vectors in the matrix. However, as long as you place your desired linearly independent subset first, this process is guaranteed to give you a valid basis containing that subset.
Note also that systematic approaches are mainly useful when you don't have a stroke of insight. If you have a set of vectors whose linear independence you would like to check, place the vectors as columns in a matrix. Then, take the determinant or row reduce. The vectors are independent iff the resulting rref is the identity matrix, which (for a square matrix) is true iff the determinant is non-zero.
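As a small illustration of the determinant criterion (a hypothetical example, again assuming `sympy`):

```python
from sympy import Matrix

# Independent columns: the determinant is non-zero
ind = Matrix([[1, 0],
              [1, 1]])
print(ind.det())  # 1

# Dependent columns (the second is twice the first): the determinant is zero
dep = Matrix([[1, 2],
              [3, 6]])
print(dep.det())  # 0
```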
Verification of asker's solution:
First of all, you could check that the solution $\{v_1,v_2,e_1,e_4\}$ is obtained by row reducing the matrix
$$
\begin{bmatrix}
| &| &| &| &| &| \\
v_1&v_2&e_1&e_2&e_4&e_3\\
| &| &| &| &| &|
\end{bmatrix}
$$
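That check can also be sketched computationally (assuming `sympy`; the column order here is $v_1,v_2,e_1,e_2,e_4,e_3$):

```python
from sympy import Matrix

# Columns in order: v1, v2, e1, e2, e4, e3
B = Matrix([
    [0, 1, 1, 0, 0, 0],
    [0, 1, 0, 1, 0, 0],
    [1, 1, 0, 0, 0, 1],
    [1, 1, 0, 0, 1, 0],
])

_, pivots = B.rref()
print(pivots)  # (0, 1, 2, 4) -> columns v1, v2, e1, e4
```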
Setting that aside, there are two good methods of checking that a set of vectors is linearly independent, using (in this case) the matrix
$$
M = \begin{bmatrix}
| &| &| &| \\
v_1&v_2&e_1&e_4\\
| &| &| &|
\end{bmatrix}
=
\left[
\begin{array}{cccc}
0 & 1 & 1 & 0 \\
0 & 1 & 0 & 0 \\
1 & 1 & 0 & 0 \\
1 & 1 & 0 & 1 \\
\end{array}
\right]
$$
Method 1: row reduce
Take the matrix $M$, and find its reduced row echelon form. In this case, that comes out to
$$
\begin{bmatrix}
1&0&0&0\\
0&1&0&0\\
0&0&1&0\\
0&0&0&1
\end{bmatrix}
$$
This verifies that the matrix has full rank, which means that the set is indeed linearly independent.
Along the same lines, you could instead row reduce only until you have an upper triangular matrix: if the diagonal entries are all non-zero, then your matrix is invertible.
Method 2: take the determinant
Now, there are several methods for computing the determinant of a matrix, and I'm not going to go too far into that. Suffice it to say that in this case, the determinant comes out to $-1$. Since this result is non-zero, we conclude that the matrix has full rank, which means that its column vectors are linearly independent.
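Both checks can be run on $M$ directly (a sketch assuming `sympy`):

```python
from sympy import Matrix, eye

M = Matrix([
    [0, 1, 1, 0],
    [0, 1, 0, 0],
    [1, 1, 0, 0],
    [1, 1, 0, 1],
])

# Method 1: the rref of an invertible matrix is the identity
rref_M, _ = M.rref()
print(rref_M == eye(4))  # True

# Method 2: a non-zero determinant means full rank
print(M.det())           # -1
```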
Best Answer
You know that $\mathbb{R}^4/\langle v\rangle$ has dimension $3$, don't you?
So, since $\{v,e_1,e_2,e_3\}$ is clearly a basis of $\mathbb{R}^4$, the cosets $[e_1]$, $[e_2]$ and $[e_3]$ form a spanning set of the quotient space. Since three vectors that span a three-dimensional space must form a basis, they are linearly independent.
For the second part, suppose $$ \alpha[a]+\beta[b]+\gamma[c]=[0]. $$ This means $$ \alpha a+\beta b+\gamma c=\delta v $$ for some $\delta$. Can you go on?
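For completeness, here is one way the argument can be finished (a sketch, not part of the original hint): the displayed equation rearranges to
$$
\alpha a+\beta b+\gamma c-\delta v=0,
$$
so if $\{v,a,b,c\}$ is linearly independent in $\mathbb{R}^4$, then $\alpha=\beta=\gamma=\delta=0$. In particular $\alpha=\beta=\gamma=0$, which shows that $[a],[b],[c]$ are linearly independent in the quotient.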