[Math] Prove that “Length of linearly independent list $\leq $ length of spanning list”

linear-algebra, proof-verification, vector-spaces

On page 35 of Sheldon Axler's Linear Algebra Done Right, he gives a proof of "Length of linearly independent list $\leq$ length of spanning list" (see references below) by a process that adds a vector from the independent list to the spanning list and then removes a vector from the spanning list, forming a new spanning list.

My question concerns the Linear Dependence Lemma part. The Linear Dependence Lemma only tells us that we can remove some $v_j$ from the dependent list, but the proof he gives in the first picture says we can remove one of the $w$'s. Why not one of the $u$'s? How can he be so sure about that?

My current understanding:

Step 1: By the definition of a spanning list, $u_1$ can be written as a linear combination of the spanning list $w_1,\ldots,w_n$, say $u_1=a_1w_1+\cdots+a_nw_n$. The coefficients $a_1,\ldots,a_n$ cannot all equal $0$, or else $u_1=0$, contradicting the assumption that $u_1,\ldots,u_n$ is linearly independent. Pick $j$ with $a_j\neq 0$; then we can rewrite $u_1=a_1w_1+\cdots+a_nw_n$ as
$$w_j=-\frac{a_1}{a_j}w_1-\cdots-\frac{a_{j-1}}{a_j}w_{j-1}-\frac{a_{j+1}}{a_j}w_{j+1}-\cdots-\frac{a_n}{a_j}w_n+\frac{1}{a_j}u_1,$$
implying $w_j\in\operatorname{span}\{w_1,\ldots,w_{j-1},w_{j+1},\ldots,w_n,u_1\}$, which is why we can remove one of the $w$'s.
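Step 1 can be checked numerically. The sketch below uses a hypothetical example of my own (the vectors `u1`, `w1`, `w2` are not from Axler's book): it solves for the coefficients $a_1, a_2$, rewrites $w_1$ in terms of $u_1$ and $w_2$ exactly as in the displayed equation, and confirms that the new list still spans $\mathbb{R}^2$.

```python
import numpy as np

# Hypothetical example in R^2: spanning list (w1, w2) and one independent vector u1.
w1 = np.array([1.0, 0.0])
w2 = np.array([0.0, 1.0])
u1 = np.array([2.0, 3.0])

# Solve u1 = a1*w1 + a2*w2 for the coefficients a = (a1, a2).
a = np.linalg.solve(np.column_stack([w1, w2]), u1)
assert not np.allclose(a, 0)  # the coefficients cannot all be zero

# Pick j with a_j != 0 (here j = 1) and solve for w1:
# w1 = (1/a1)*u1 - (a2/a1)*w2, matching the rearranged equation above.
w1_rebuilt = u1 / a[0] - (a[1] / a[0]) * w2
assert np.allclose(w1_rebuilt, w1)

# The new list (u1, w2) still spans R^2: its matrix has full rank.
rank = np.linalg.matrix_rank(np.column_stack([u1, w2]))
print(rank)  # 2
```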

Step $j$ (for $j\geq 2$): similarly, the coefficients $a_1,\ldots,a_{n-j+1}$ in
$$u_j=a_1w_1+\cdots+a_{n-j+1}w_{n-j+1}+b_1u_1+\cdots+b_{j-1}u_{j-1}$$
cannot all equal zero, or else $u_j$ would be a linear combination of $u_1,\ldots,u_{j-1}$, contradicting the assumption that the $u$'s are linearly independent. This is why we can always remove one of the $w$'s at each step.
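The whole exchange process can be sketched as a loop: at step $j$, expand $u_j$ over the current list, find a $w$ with nonzero coefficient, and swap it out. This is my own illustrative implementation (the example vectors are hypothetical, not Axler's), but it follows the steps above.

```python
import numpy as np

def exchange(us, ws):
    """Sketch of the exchange argument: at each step, add the next u to the
    list and remove one w whose coefficient in the expansion of u over the
    current list is nonzero. The returned list spans the same space."""
    current = list(ws)   # the w's still remaining in the spanning list
    kept_us = []         # the u's added so far
    for u in us:
        basis = np.column_stack(kept_us + current)
        coeffs, *_ = np.linalg.lstsq(basis, u, rcond=None)
        # Coefficients of the remaining w's are the tail of `coeffs`.
        w_coeffs = coeffs[len(kept_us):]
        # Linear independence of the u's guarantees a nonzero w-coefficient.
        j = int(np.argmax(np.abs(w_coeffs)))
        assert abs(w_coeffs[j]) > 1e-9
        current.pop(j)       # remove that w ...
        kept_us.append(u)    # ... and keep u in the list
    return kept_us + current

# Hypothetical example: 2 independent vectors vs. a 3-vector spanning list of R^3.
us = [np.array([1.0, 1.0, 0.0]), np.array([0.0, 1.0, 1.0])]
ws = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0]), np.array([0.0, 0.0, 1.0])]
final = exchange(us, ws)
print(np.linalg.matrix_rank(np.column_stack(final)))  # 3: still spans R^3
```

Note that the loop consumes one $w$ per $u$, which is exactly why the process forces the independent list to be no longer than the spanning list.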

Please verify my understanding; this did not seem obvious to me and it took me some time to figure out. I'm not sure that I'm right, so please point out any mistakes and, if possible, replace my understanding with a better explanation. Thanks in advance.

Reference 1: Proof of "Length of linearly independent list $\leq$ length of spanning list"
Reference 2: Linear Dependence Lemma

Best Answer

The main thing to notice is the order of writing the list.

The claim that we could just as well remove $u_1$ in Step 1 is false. Note that the Linear Dependence Lemma says there exists a vector $v_j$ in the list that belongs to the span of all the previous vectors $v_1, \ldots, v_{j-1}$. Notice that $v_1, \ldots, v_{j-1}$ are all to the left of $v_j$ in the list $(v_1, \ldots, v_m)$.

Hence, if we write the list as $$u_1,w_1, \ldots, w_n$$ in this exact order, we cannot remove $u_1$, because all the $w_i$ are to the right of $u_1$ in the list. Even though $u_1$ is a linear combination of $w_1, \ldots, w_n$, $u_1$ is still not the vector $v_j$ described by the Linear Dependence Lemma.

Why can't we remove $u_i$?

All the steps give a general list $$u_1, \ldots, u_k,w_1, \ldots, w_l$$ that is linearly dependent. The Linear Dependence Lemma says that in this list there exists a vector $v$ that belongs to the span of all the previous vectors in the list. If $v=u_i$ for some $i$, then $v \in \operatorname{span}(u_1, \ldots, u_{i-1})$, which contradicts the condition that $(u_1, \ldots, u_k)$ is linearly independent. Thus $v$ must be different from every $u_i$, so $v=w_i$ for some $i$. Combined with condition (b) of the Linear Dependence Lemma, this shows that we can remove $v=w_i$. Hence we can never remove a $u_i$.
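This can also be checked numerically: in a list ordered as $u_1,\ldots,u_k,w_1,\ldots,w_l$, the *first* vector lying in the span of its predecessors is always one of the $w$'s. The sketch below (my own hypothetical example, not from the book) finds that first dependent index by comparing ranks.

```python
import numpy as np

def first_dependent_index(vectors):
    """Return the smallest j (0-based) with v_j in span(v_0, ..., v_{j-1}),
    or None if the whole list is linearly independent."""
    for j in range(len(vectors)):
        prefix_rank = np.linalg.matrix_rank(np.column_stack(vectors[:j])) if j else 0
        rank_with_vj = np.linalg.matrix_rank(np.column_stack(vectors[:j + 1]))
        if rank_with_vj == prefix_rank:  # adding v_j did not grow the span
            return j
    return None

# Hypothetical list u1, u2, w1, w2, w3 in R^3: the u's are linearly
# independent, so the first dependent vector must be one of the w's.
us = [np.array([1.0, 1.0, 0.0]), np.array([0.0, 1.0, 1.0])]
ws = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0]), np.array([0.0, 0.0, 1.0])]
j = first_dependent_index(us + ws)
print(j)           # 3, an index pointing at one of the w's
assert j >= len(us)
```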