[Math] Checking linear dependence of two vectors in $\mathbb{R}^4$

linear algebra

If we're given a set of two vectors from $\mathbb{R}^4$, for example:
$$S=\{(3,-1,1,1),(1,3,-1,1)\}$$
and we want to check if they're linearly dependent or independent, is the following procedure a correct way to do this?

We form a matrix whose rows are those two vectors:
$$\begin{bmatrix}
3 & -1 & 1 & 1 \\
1 & 3 & -1 & 1
\end{bmatrix}$$
Then we transform that matrix to row echelon form by the $II-\frac{1}{3}\cdot I$ transformation:
$$\begin{bmatrix}
3 & -1 & 1 & 1 \\
0 & \frac{10}{3} & -\frac{4}{3} & \frac{2}{3}
\end{bmatrix}$$
We can see that the rank of that matrix is $2$, so it has two linearly independent rows. Does that answer the question of linear independence of those two vectors? The reason I'm asking this is because my professor checked their linear independence by forming a matrix of this form:
$$\begin{bmatrix}
3 & 1 \\
-1 & 3 \\
1 & -1 \\
1 & 1
\end{bmatrix}$$
Then reduced it to row echelon form:
$$\begin{bmatrix}
3 & 1 \\
0 & \frac{10}{3} \\
0 & 0 \\
0 & 0
\end{bmatrix}$$
which corresponds to the homogeneous system
$$3x+y=0$$
$$\frac{10}{3}y=0$$
so $x=0$ and $y=0$, which means those two vectors are linearly independent.

Best Answer

Both methods are correct. To check whether two vectors $a$ and $b$ are linearly dependent, it suffices to check whether one is a scalar multiple of the other. Here the 4th components of both vectors equal $1$, so the only possible scalar is $1$, which would force all components to be equal. Since the first components already differ ($3 \neq 1$), the two given vectors are linearly independent.
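The scalar-multiple test above is easy to automate. Here is a minimal sketch in Python (not part of the original answer; the helper name `is_scalar_multiple` is my own), using exact rational arithmetic so no floating-point tolerance is needed:

```python
from fractions import Fraction

def is_scalar_multiple(a, b):
    """Return True if b = lam * a for some scalar lam (a assumed nonzero)."""
    # The first nonzero component of a fixes the only candidate scalar.
    i = next(k for k, x in enumerate(a) if x != 0)
    lam = Fraction(b[i], a[i])
    # b is a scalar multiple of a iff every component matches lam * a.
    return all(Fraction(y) == lam * x for x, y in zip(a, b))

a = (3, -1, 1, 1)
b = (1, 3, -1, 1)
print(is_scalar_multiple(a, b))  # False, so a and b are linearly independent
```

Here the candidate scalar is $\frac{1}{3}$ (from the first components), but $3 \neq \frac{1}{3}\cdot(-1)$ already fails, confirming independence.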

Alternatively, we can form a $2 \times 4$ matrix whose rows are $a$ and $b$. If the rank of this matrix is 2, then the two vectors are linearly independent. The rank of this matrix can be obtained by doing row operations, like you did. But the rank can also be obtained by doing column operations (which is equivalent to doing row operations on the transpose matrix, which is what your professor did). Both row and column operations preserve the rank of a matrix, and the maximum number of linearly independent rows of a matrix is equal to the maximum number of linearly independent columns of a matrix. Whichever method you find simpler for the problem at hand should be fine.
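As a quick numerical sanity check of the rank argument (an illustration I'm adding, not part of the original answer), NumPy's `matrix_rank` applied to the $2 \times 4$ matrix and to its transpose gives the same answer, reflecting that row rank equals column rank:

```python
import numpy as np

# Rows of A are the two vectors from the question.
A = np.array([[3, -1, 1, 1],
              [1, 3, -1, 1]])

# Rank computed from the 2x4 matrix (your arrangement) ...
print(np.linalg.matrix_rank(A))    # 2

# ... equals the rank of the 4x2 transpose (the professor's arrangement).
print(np.linalg.matrix_rank(A.T))  # 2
```

Rank $2$ from either orientation means the two vectors are linearly independent.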