A bit of confusion about linearly independent vectors

linear-algebra, linear-independence, matrices

Suppose I have these vectors

$$v = (1, 0, 1) \qquad u = (2, -1, 0) \qquad w = (0, 0, 1) \qquad s = (2, 1, 1)$$

Now those are vectors in $\mathbb{R}^3$.
If I want to study their linear independence I can write down the associated matrix, that is

$$M= \begin{pmatrix}
1 & 0 & 1 \\
2 & -1 & 0 \\
0 & 0 & 1 \\
2 & 1 & 1
\end{pmatrix}
$$

Through Gaussian elimination or the criterion of minors, I can state that the rank is three.
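For reference, one possible row reduction (subtracting multiples of earlier rows) is

$$\begin{pmatrix}
1 & 0 & 1 \\
2 & -1 & 0 \\
0 & 0 & 1 \\
2 & 1 & 1
\end{pmatrix}
\;\longrightarrow\;
\begin{pmatrix}
1 & 0 & 1 \\
0 & -1 & -2 \\
0 & 0 & 1 \\
0 & 1 & -1
\end{pmatrix}
\;\longrightarrow\;
\begin{pmatrix}
1 & 0 & 1 \\
0 & -1 & -2 \\
0 & 0 & 1 \\
0 & 0 & 0
\end{pmatrix},$$

which has three nonzero rows, so $\operatorname{rank}(M) = 3$.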

What confuses me now is that the notes say "vectors are linearly independent if the rank of the associated matrix is equal to the number of the unknown".

But again then it says "four or more vectors in $\mathbb{R}^3$ are linearly dependent."

So now I am confused: the rank of the matrix is three, which matches the number of unknowns, yet I have four vectors.
Can somebody please explain this to me clearly?

Also, another question: suppose I build the associated matrix with the vectors as the columns instead of the rows. Here the rank of the matrix is three, the maximum possible. But if the rank is maximal, then they should be linearly independent.

What is the difference between building the row matrix and the column matrix in this sense?

Thank you!

Best Answer

Doesn't your book tell you what it calls "the associated matrix"?

Its sentence "vectors are linearly independent if the rank of the associated matrix is equal to the number of the unknown" is correct with the usual notion of a matrix associated to $n$ vectors: it has $n$ columns (in your example, each of your 4 vectors should be written vertically, and the associated matrix is the transpose of your $M$).
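Concretely, with your four vectors written as columns, the associated matrix is

$$M^T = \begin{pmatrix}
1 & 2 & 0 & 2 \\
0 & -1 & 0 & 1 \\
1 & 0 & 1 & 1
\end{pmatrix},$$

a $3 \times 4$ matrix with one column per vector.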

I don't follow your "if the rank is maximum then they are linearly independent". The rank of the matrix $M^T$ (of 4 column vectors) is equal to the rank of this family of 4 vectors, i.e. the dimension of the subspace they span. Since this rank of $M^T$ (equal to the rank of $M$) is $3$, strictly less than the number of vectors, $4$, they are dependent.
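In fact, you can exhibit an explicit dependence relation among your vectors: one checks directly that

$$4v - u - 3w - s = 4(1,0,1) - (2,-1,0) - 3(0,0,1) - (2,1,1) = (0,0,0),$$

a nontrivial linear combination equal to the zero vector, which is exactly what it means for the family to be linearly dependent.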
