Linear Algebra – Length of Linearly Independent vs Spanning Lists

linear algebra

I am referring to Theorem 2.23 of the book Linear Algebra Done Right by Axler. It states:

Theorem: In a finite-dimensional vector space, the length of every linearly independent list of vectors is less than or equal to the length of every spanning list of vectors.
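For concreteness (my own example, not from the book): in $\mathbb{R}^2$ the list $(1,0), (0,1)$ spans, so the theorem says that no linearly independent list in $\mathbb{R}^2$ can have length greater than $2$.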

Proof

Suppose $u_1, u_2, \dots, u_m$ is linearly independent in V. Suppose also that
$w_1, w_2, \dots, w_n$ spans V. We need to prove that $m \leq n$. We do so through the
multi-step process described below; note that in each step we add one of the $u$'s and remove one of the $w$'s.
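(For reference, the Linear Dependence Lemma (2.21) used repeatedly below says, roughly: if $v_1, \dots, v_m$ is a linearly dependent list in V, then there is some index $k$ such that $v_k \in \operatorname{span}(v_1, \dots, v_{k-1})$, and removing $v_k$ from the list leaves the span unchanged. I'm paraphrasing from memory, so the exact wording in the book may differ.)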

Step 1

Let B be the list $w_1, w_2, \dots, w_n$, which spans V. Thus adjoining any vector
in V to this list produces a linearly dependent list (because the newly
adjoined vector can be written as a linear combination of the other
vectors). In particular, the list
$u_1, w_1, \dots, w_n$ is linearly dependent. Thus by the Linear Dependence Lemma (2.21),
we can remove one of the $w$'s so that the new list B (of length $n$)
consisting of $u_1$ and the remaining $w$'s spans V.
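To see the mechanics of Step 1, here is a small example I traced through (my own numbers, not from the book): take $V = \mathbb{R}^2$, $u_1 = (1,1)$, and the spanning list $w_1 = (1,0)$, $w_2 = (0,1)$. Adjoining $u_1$ at the front gives the linearly dependent list $(1,1), (1,0), (0,1)$; the first vector lying in the span of those before it is $w_2$, since $(0,1) = (1,1) - (1,0)$, so the lemma removes $w_2$ and the new list B is $(1,1), (1,0)$, which still spans $\mathbb{R}^2$.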

Step j

The list B (of length $n$) from step $j-1$ spans V. Thus adjoining any
vector to this list produces a linearly dependent list. In particular, the
list of length $n+1$ obtained by adjoining $u_j$ to B, placing it just after
$u_1, u_2, \dots, u_{j-1}$, is linearly dependent. By the Linear Dependence Lemma
(2.21), one of the vectors in this list is in the span of the previous ones,
and because $u_1, u_2, \dots, u_j$ is linearly independent, this vector is one of
the $w$'s, not one of the $u$'s. We can remove that $w$ from B so that the
new list B (of length $n$) consisting of $u_1, u_2, \dots, u_j$ and the remaining $w$'s
spans V.
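Continuing the same small example (again my own numbers): with $u_2 = (1,-1)$, Step 2 adjoins $u_2$ just after $u_1$, giving the dependent list $(1,1), (1,-1), (1,0)$ of length $3$. Here $u_2$ is not a multiple of $u_1$, so the first vector in the span of the ones before it is the remaining $w$, namely $(1,0) = \tfrac{1}{2}(1,1) + \tfrac{1}{2}(1,-1)$. Removing it leaves B $= (1,1), (1,-1)$, which spans $\mathbb{R}^2$, and indeed $m = 2 \leq n = 2$.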

I have a problem with the part that states

By the Linear Dependence Lemma (2.21), one of the vectors in this list is in the span of the previous ones, and because $u_1, u_2, \dots, u_j$ is linearly independent, this vector is one of the $w$'s, not one of the $u$'s.

Why does linear independence of the list $u_1, u_2, \dots, u_j$ imply that one
of the $u$'s cannot be written as a linear combination of the rest of the $u$'s and $w$'s in the list?

What I can understand is this: if the author had said that it must be possible to select one of the $w$'s, because otherwise the $u$'s would end up being linearly dependent, then he'd be right. If you cannot choose any of the $w$'s and the list is known to be linearly dependent, then one of the $u$'s must be in the span of the rest of the $u$'s. That is not what he states, though. He states that it has to be one of the $w$'s. I think that statement is wrong.

If I am making a mistake in the way I have understood the proof, please help me understand it correctly.

Best Answer

By the Linear Dependence Lemma (2.21), one of the vectors in this list is in the span of *the previous ones*

(emphasis added)

The author is not commenting on whether one of the $u$s can be written as a linear combination of the $w$s which follow it in the list. The Linear Dependence Lemma gives that one of the vectors in this list is in the span of the ones preceding it in the list, i.e., one of the following things is true:

  • $u_2$ is in the span of $\{u_1\}$
  • $u_3$ is in the span of $\{u_1,u_2\}$
  • ...
  • $u_j$ is in the span of $\{u_1,\dots,u_{j-1}\}$
  • $w_1$ is in the span of $\{u_1,\dots,u_j\}$
  • $w_2$ is in the span of $\{u_1,\dots,u_j,w_1\}$
  • ...

Since the $u$s are linearly independent, it is impossible for any $u$ to be in the span of previous vectors on the list. So it must be a $w$ which is in the span of previous vectors on the list.
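To spell that last step out a bit more (my own elaboration, not part of the original answer): if some $u_k$ with $k \leq j$ were in the span of the vectors before it in the list, those vectors are exactly $u_1, \dots, u_{k-1}$, so we would have
$$u_k = a_1 u_1 + \dots + a_{k-1} u_{k-1}, \quad \text{i.e.} \quad a_1 u_1 + \dots + a_{k-1} u_{k-1} - u_k = 0,$$
a nontrivial linear relation among $u_1, \dots, u_j$ (the coefficient of $u_k$ is $-1 \neq 0$), contradicting their linear independence. (For $k = 1$ the condition would mean $u_1 = 0$, which an independent list also rules out.) So the vector the Linear Dependence Lemma singles out must be one of the $w$s.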