Length of linearly independent list is less than or equal to length of spanning list

linear-algebra, vector-spaces, vectors

This is the proof of the theorem given in Sheldon Axler's Linear Algebra Done Right:

[image: proof of the theorem, from the book]

The Linear Dependence lemma is:

[image: statement of the Linear Dependence Lemma, from the book]

In the proof you can see that at each step, one $u$ is added to the spanning list at the expense of one $w$, which is removed according to the Linear Dependence Lemma.

At the end, the proof reads:

After step $m$, we have removed all the $u$'s and the process stops. At each step, as
we add a $u$ to $B$, the Linear Dependence Lemma implies that there is some $w$
to remove. Thus there are at least as many $w$’s as $u$’s.

Here is my question: how can you guarantee that some $w$'s will remain in the spanning list $B$, waiting to be removed by the Linear Dependence Lemma (note that the lemma can only remove $w$'s, not $u$'s), until you are done with the $m^{th}$ step? Doesn't the proof already assume that the number of linearly independent vectors is less than the number of spanning vectors?

Best Answer

Here is a slight modification of the argument that might help you understand Axler's.

Since the span of a list does not depend on the ordering, we can reorder lists in any way we like.

I’ll show that if we have replaced $k$ vectors in the spanning list with the first $k$ vectors in the linearly independent list and $k<m$, we’re able to replace one more vector.

The case $m=0$ is trivial, so we can assume $m>0$. Up to a reordering of the spanning list, we have a spanning list $$ (u_1,\dots,u_k,w_{k+1},\dots,w_n) $$ and therefore $u_{k+1}=a_1u_1+\dots+a_ku_k+a_{k+1}w_{k+1}+\dots+a_nw_n$ for some scalars $a_1,\dots,a_n$. Note that at least one of $a_{k+1},\dots,a_n$ is nonzero: otherwise $u_{k+1}$ would be a linear combination of $u_1,\dots,u_k$, contradicting the linear independence of $(u_1,\dots,u_m)$. Up to a further reordering, we may assume $a_{k+1}\ne0$, so $w_{k+1}$ lies in the span of the other vectors of the spanning list $$ (u_1,\dots,u_k,u_{k+1},w_{k+1},\dots,w_n) $$ and by the Linear Dependence Lemma we can remove $w_{k+1}$ to get the spanning list $$ (u_1,\dots,u_k,u_{k+1},w_{k+2},\dots,w_n) $$
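If it helps to see the step computed, here is a minimal numerical sketch of one replacement step. It is my own illustration, not part of the argument above: the function name `exchange_step` and the use of NumPy's least-squares solver are assumptions of the sketch. It expands $u_{k+1}$ over the current spanning list, picks a $w$ whose coefficient is nonzero, and removes it.

```python
import numpy as np

def exchange_step(us, ws, k):
    """One step of the replacement argument, assuming
    (us[0], ..., us[k-1], ws[0], ..., ws[-1]) spans R^d and the us are
    linearly independent.  Adjoin us[k] and drop one w whose coefficient
    in the expansion of us[k] is nonzero; return the remaining w's."""
    current = us[:k] + ws                         # spanning list (u_1,...,u_k, w_{k+1},...,w_n)
    A = np.column_stack(current)                  # columns are the spanning vectors
    coeffs, *_ = np.linalg.lstsq(A, us[k], rcond=None)  # a_1,...,a_n with us[k] = A @ coeffs
    w_coeffs = coeffs[k:]                         # coefficients a_{k+1},...,a_n of the w's
    j = int(np.argmax(np.abs(w_coeffs)))          # some w-coefficient must be nonzero ...
    assert abs(w_coeffs[j]) > 1e-9, "u_{k+1} would lie in span(u_1,...,u_k)"  # ... else contradiction
    return ws[:j] + ws[j + 1:]                    # the shortened list, with u_{k+1} kept, still spans
```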

Objection: how do we know that we still have a $w$-vector available? Let's see. Could it happen that $k<m$ and the $w$-vectors are already exhausted? No: this would mean that $(u_1,\dots,u_k)$ is spanning, so $u_{k+1}$ would be a linear combination of $u_1,\dots,u_k$, again contradicting linear independence.
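To make the counting concrete, here is a hedged end-to-end run of the sketch above, reusing the hypothetical `exchange_step` and a small example of my own in $\mathbb{R}^3$. The assertion inside the step never fires while $k<m$, so a $w$ is always available to remove, and after $m$ steps we have consumed $m$ of the $n$ w's, giving $m\le n$.

```python
import numpy as np

# Tiny example in R^3: m = 2 linearly independent vectors, n = 3 spanning vectors.
us = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 1.0])]
ws = [np.array([1.0, 1.0, 0.0]), np.array([0.0, 1.0, 0.0]), np.array([0.0, 0.0, 2.0])]
m, n = len(us), len(ws)

for k in range(m):                     # one replacement per step, as in the argument above
    ws = exchange_step(us, ws, k)      # removes exactly one w
    # the Linear Dependence Lemma guarantees the shortened list still spans R^3
    assert np.linalg.matrix_rank(np.column_stack(us[:k + 1] + ws)) == 3
    print(f"after step {k + 1}: {len(ws)} w's remain")

assert m <= n                          # m steps each consumed one w, so m <= n
```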