Given $A\in\mathbb{R}^{m\times n}$, $m\geq n$, compute the (economy) QR factorisation. This gives
$$
A = QR, \quad R\in\mathbb{R}^{n\times n}.
$$
Now if $\mathrm{rank}(A)<n$, the upper triangular matrix $R$ has a staircase profile, with some of the "steps" of the staircase spanning more than one column. Select column indices $j_1,\ldots,j_k$ such that removing these columns from $R$ leaves a nonsingular upper triangular matrix (you can think of this as shortening each step of the staircase to length 1). The columns $j_1,\ldots,j_k$ can then be expressed as linear combinations of the remaining columns.
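As a sketch of how this column selection can be automated, one can use QR with column pivoting (a rank-revealing variant), which moves dependent columns to the end; the matrix below is an assumed example, not from the question:

```python
import numpy as np
from scipy.linalg import qr

# Assumed example: third column = first + second, so rank(A) = 2.
A = np.array([[1., 2., 3.],
              [4., 5., 9.],
              [7., 8., 15.]])

# Column pivoting orders columns so the diagonal of R is decreasing in
# magnitude; trailing (near-)zero diagonal entries reveal the rank.
Q, R, piv = qr(A, pivoting=True)
tol = 1e-10 * abs(R[0, 0])
r = int(np.sum(np.abs(np.diag(R)) > tol))  # numerical rank
print(r)         # 2
print(piv[r:])   # indices of columns expressible via the others
```

Here `piv` is the permutation applied to the columns of `A`; the leading `r` pivoted columns are linearly independent and the trailing ones depend on them.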
Example: The red columns indicate the columns which are linear combinations of the others.
$$
\begin{bmatrix}
\times & \times & \color{red}\times & \times & \color{red}\times & \color{red}\times \\
0 & \times & \color{red}\times & \times & \color{red}\times & \color{red}\times \\
0 & 0 & \color{red}0 & \times & \color{red}\times & \color{red}\times
\end{bmatrix}
$$
Example: For the given matrix from the question, the QR factorisation is:
$$
Q = \begin{bmatrix}
0 & -0.4472 & -0.8944 \\
0 & -0.8944 & 0.4472 \\
-1.0000 & 0 & 0
\end{bmatrix},
\qquad
R = \begin{bmatrix}
-1.0000 & 2.0000 & -1.0000 \\
0 & 4.4721 & -2.2361 \\
0 & 0 & 0
\end{bmatrix}
$$
So one can remove either column 2 or column 3 to make $R$ nonsingular and upper triangular (hence either column 2 or column 3 is a linear combination of the other columns).
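For completeness, here is a quick numerical check (a sketch using NumPy; the matrix $A$ below is reconstructed as $QR$ from the factors above):

```python
import numpy as np

# A reconstructed from the Q and R shown above (A = Q @ R); its third
# column is -1/2 times its second, so rank(A) = 2 < 3.
A = np.array([[0., -2., 1.],
              [0., -4., 2.],
              [1., -2., 1.]])

Q, R = np.linalg.qr(A)  # economy-size QR
# A (near-)zero diagonal entry of R flags a dependent column here;
# in general, QR with column pivoting gives a more reliable test.
tol = 1e-10 * np.abs(R).max()
dependent = [j for j in range(R.shape[1]) if abs(R[j, j]) < tol]
print(dependent)  # [2] -> the third column depends on the others
```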
Here's an answer that completely avoids determinants. Determinants are heinously overrated.
Let $\vec v_1,\vec v_2,\dotsc,\vec v_n$ be the columns of a matrix $A$. That is,
$$
A=
\begin{bmatrix}
\vec v_1 & \vec v_2 & \dotsb & \vec v_n
\end{bmatrix}
$$
Now, suppose the columns of $A$ are not linearly independent. Then there exist scalars $\lambda_1,\lambda_2,\dotsc,\lambda_n$ not all zero such that
$$
\lambda_1\vec v_1+\lambda_2\vec v_2+\dotsb+\lambda_n\vec v_n=\vec 0\tag{1}
$$
But (1) may be re-written in matrix form as
$$
\begin{bmatrix}
\vec v_1 & \vec v_2 & \dotsb & \vec v_n
\end{bmatrix}
\begin{bmatrix}
\lambda_1\\ \lambda_2 \\ \vdots\\ \lambda_n
\end{bmatrix}
=\vec 0
$$
Putting
$$
\vec \lambda=
\begin{bmatrix}
\lambda_1\\ \lambda_2 \\ \vdots\\ \lambda_n
\end{bmatrix}
$$
then gives $A\vec\lambda=\vec 0$ where $\vec \lambda\neq\vec 0$. Hence $A$ has a nontrivial nullspace and is thus not invertible.
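A small numerical illustration of this argument (the matrix and coefficients below are a hypothetical example, not from the original question):

```python
import numpy as np

# Hypothetical example: the third column equals 2*v1 + v2, so
# lambda = (2, 1, -1) gives a nontrivial solution of A @ lam = 0.
A = np.array([[1., 0., 2.],
              [0., 1., 1.],
              [1., 1., 3.]])
lam = np.array([2., 1., -1.])

print(A @ lam)                   # [0. 0. 0.] -- a nontrivial null vector
print(np.linalg.matrix_rank(A))  # 2, so the 3x3 matrix A is not invertible
```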
Best Answer
Hint: Let $A \in \Bbb R ^{n \times m}$ and suppose $A$'s columns are linearly dependent. Show that there is $0 \neq x \in \Bbb R^m$ such that $Ax = 0$. What does this tell you about $(A^TA)x$?
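Sketching the hint numerically (the matrix below is an assumed example with $n=3$, $m=2$):

```python
import numpy as np

# Assumed example: second column = 2 * first, so x = (2, -1) satisfies
# A @ x = 0, hence (A^T A) @ x = A^T (A @ x) = 0 and A^T A is singular.
A = np.array([[1., 2.],
              [2., 4.],
              [3., 6.]])
x = np.array([2., -1.])

print(A @ x)                           # [0. 0. 0.]
print(A.T @ A @ x)                     # [0. 0.]
print(np.linalg.matrix_rank(A.T @ A))  # 1 < 2, so A^T A is not invertible
```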