Just reduce it to row echelon form, keeping track of what you do. Start with $$
\begin{pmatrix}
1 & 2 & 3 & 1 &R_1\\
2 & -1 & 1 & -3 &R_2\\
1 & 3 & 4 & 1 &R_3\\
3 & -1 & 2 & -2 &R_4
\end{pmatrix}$$
Then $$
\begin{pmatrix}
1 & 2 & 3 & 1 &R_1\\
0 & -5 & -5 & -5 &R_2-2R_1\\
0 & 1 & 1 & 0 &R_3 -R_1\\
0 & -7 & -7 & -5 &R_4-3R_1
\end{pmatrix}$$
and continue in this fashion.
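If you want to check your hand computation at the end, a computer algebra system can carry the elimination all the way to reduced row echelon form. Here is a sketch using SymPy's `Matrix.rref`, starting from the matrix above:

```python
from sympy import Matrix

# The 4x4 matrix from the start of the elimination above
A = Matrix([
    [1,  2, 3,  1],
    [2, -1, 1, -3],
    [1,  3, 4,  1],
    [3, -1, 2, -2],
])

# rref() returns the reduced row echelon form together with the
# (0-indexed) pivot columns
rref, pivots = A.rref()
print(rref)
print(pivots)
```

Continuing the elimination by hand gives pivots in columns 1, 2, and 4, so the matrix has rank 3; the output of `rref()` lets you confirm each intermediate row operation.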
If $a_1,\dots, a_n$ are linearly dependent and $x$ is a linear combination of these vectors, then there are infinitely many such linear combinations. To see this, note that since the vectors are linearly dependent, there are coefficients $\alpha_1,\dots,\alpha_n$, not all zero, such that
$$\alpha_1a_1 + \dots + \alpha_n a_n = 0.$$
This equation remains true if we multiply all the coefficients by any scalar $\alpha\in\mathbb{R}$ (this just multiplies both sides by $\alpha$), and since some $\alpha_i\neq 0$, different values of $\alpha$ give different coefficient lists. Hence there are infinitely many ways to represent the zero vector as a linear combination of those vectors.
Now take some linear combination of $a_1,\dots,a_n$ equal to $x$ (one exists by assumption) and add to it any of these representations of the zero vector; the result is again a linear combination equal to $x$. Since there are infinitely many ways to represent the zero vector as a linear combination of those vectors, there are also infinitely many ways to represent the vector $x$ as a linear combination of those vectors.
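The argument can be checked numerically on a small hypothetical example (these vectors are not from the question, just an illustration): take dependent vectors with $a_3 = a_1 + a_2$, so that $a_1 + a_2 - a_3 = 0$, and shift any one representation of $x$ by multiples of that zero combination.

```python
import numpy as np

# Hypothetical dependent vectors: a3 = a1 + a2, so a1 + a2 - a3 = 0
a1, a2, a3 = np.array([1, 0]), np.array([0, 1]), np.array([1, 1])
x = 2 * a1 + 3 * a2                      # one combination equal to x

# Adding any multiple t of the zero combination a1 + a2 - a3 gives a
# different coefficient list with the same value
for t in [0.0, 1.0, -4.5]:
    combo = (2 + t) * a1 + (3 + t) * a2 + (-t) * a3
    assert np.allclose(combo, x)
```

Every choice of $t$ produces a distinct triple of coefficients, which is exactly the "infinitely many representations" in the argument above.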
This question is actually the same as asking how many solutions the system $Ax=y$ has, where $A$ is the matrix with columns $a_1,\dots,a_n$. The only possibilities are that there is no solution, there is a unique solution, or there are infinitely many solutions.
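The three possibilities can be distinguished by comparing the rank of $A$ with the rank of the augmented matrix $[A\mid y]$ (the Rouché–Capelli theorem). A small sketch, with hypothetical example matrices:

```python
import numpy as np

def solution_count(A, y):
    """Classify the solutions of Ax = y (Rouche-Capelli): no solution if
    rank(A) < rank([A|y]); a unique solution if both ranks equal the
    number of columns; infinitely many solutions otherwise."""
    r = np.linalg.matrix_rank(A)
    r_aug = np.linalg.matrix_rank(np.column_stack([A, y]))
    if r < r_aug:
        return "none"
    return "unique" if r == A.shape[1] else "infinitely many"

# Columns [1,0], [0,1], [1,1] are linearly dependent, and [2,3] lies in
# their span, so the system has infinitely many solutions
A = np.array([[1., 0., 1.],
              [0., 1., 1.]])
print(solution_count(A, np.array([2., 3.])))   # infinitely many
```

This mirrors the argument above: dependent columns plus a consistent right-hand side give infinitely many solutions.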
The easiest way to be sure is to understand the proof of the theorem in question (and no, I’m not being facetious). However, perhaps this will help.
Suppose that three of your four vectors, say $v_1,v_2$, and $v_3$, are linearly independent. Intuitively this says that $v_1$ and $v_2$ determine a plane $\Pi$, and $v_3$ is not in that plane. Now let $v$ be any vector in $\Bbb R^3$, and let $P$ be the point at its ‘head’. The line $\ell$ through $P$ parallel to $v_3$ must hit the plane $\Pi$, since it can’t be parallel to $\Pi$. (If it were, $v_3$ would be in that plane, and it’s not.) Let $Q$ be the point where $\ell$ intersects $\Pi$. $Q$ is the ‘head’ of some vector $u$ that must be a linear combination of $v_1$ and $v_2$, since it lies in the plane that they determine. And $P-Q$ is a multiple of $v_3$, since $\ell$ is parallel to $v_3$, so $v$ is a linear combination of $v_1,v_2$, and $v_3$. Thus, every vector $v\in\Bbb R^3$ is a linear combination of $v_1,v_2$, and $v_3$, and therefore no $v\in\Bbb R^3$ is independent of $v_1,v_2$, and $v_3$. (Of course this just says that $\{v_1,v_2,v_3\}$ is a basis for $\Bbb R^3$.)
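The spanning claim can also be checked numerically: pick three independent vectors (the ones below are hypothetical, chosen only for illustration) and solve for the coefficients of an arbitrary $v\in\Bbb R^3$.

```python
import numpy as np

# Hypothetical linearly independent vectors in R^3
v1, v2, v3 = np.array([1., 0, 0]), np.array([1., 1, 0]), np.array([1., 1, 1])
V = np.column_stack([v1, v2, v3])
assert abs(np.linalg.det(V)) > 1e-12     # nonzero det <=> independence

# Any v is then a combination c1*v1 + c2*v2 + c3*v3: solve V c = v
v = np.array([2., -3, 5])
c = np.linalg.solve(V, v)
assert np.allclose(c[0] * v1 + c[1] * v2 + c[2] * v3, v)
```

Because the determinant is nonzero, `np.linalg.solve` succeeds for every right-hand side $v$, which is the computational counterpart of the geometric argument that $\{v_1,v_2,v_3\}$ is a basis for $\Bbb R^3$.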