Alternative Solution to Theorem 5 Corollary 2, Section 2.3 of Hoffman’s Linear Algebra

alternative-proof · linear-algebra · proof-explanation · vector-spaces

In this video lecture, at time stamp 19:20 – 24:30, the professor presents a proof of the following claim: (Extension to a basis) Every linearly independent list of vectors in a finite-dimensional vector space can be extended to a basis of the vector space.

I know one approach to proving this claim. Here is my attempt, but I still feel confused about my proof. I would really appreciate it if you could give some feedback.

Apparently the professor proved this claim using a different approach (from the one I know). Professor's proof: Let $\{v_1,\dots,v_m\}$ be linearly independent, and let $\{w_1,\dots,w_n\}$ be a basis of $V$. If $w_1\in \mathrm{span}(v_1,\dots,v_m)$, set $B_1=\{v_1,\dots,v_m\}$; if $w_1\notin \mathrm{span}(v_1,\dots,v_m)$, set $B_1=\{v_1,\dots,v_m,w_1\}$. At step $j$: if $w_j\in \mathrm{span}(B_{j-1})$, set $B_j=B_{j-1}$ (do nothing); if $w_j\notin \mathrm{span}(B_{j-1})$, set $B_j=B_{j-1}\cup \{w_j\}$. At step $n$, $V=\mathrm{span}(B_n)$ and $B_n$ is linearly independent, so it is a basis.
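The construction above is effectively an algorithm: scan the known basis vectors $w_1,\dots,w_n$ and append each one that is not yet in the span built so far. For concreteness, here is a minimal sketch of that procedure for vectors in $\mathbb{R}^n$ represented as NumPy arrays, with span membership tested by a rank comparison (the function names `in_span` and `extend_to_basis` are my own, not from the video):

```python
import numpy as np

def in_span(vectors, w, tol=1e-10):
    """Test whether w lies in span(vectors): appending w to the matrix
    of column vectors does not increase the rank iff w is in the span."""
    if not vectors:
        return np.linalg.norm(w) < tol  # span of the empty list is {0}
    A = np.column_stack(vectors)
    return (np.linalg.matrix_rank(np.column_stack([A, w]), tol=tol)
            == np.linalg.matrix_rank(A, tol=tol))

def extend_to_basis(independent, basis):
    """Extend a linearly independent list to a basis, following the
    professor's construction: for j = 1, ..., n, append w_j exactly
    when it is not already in Span(B_{j-1})."""
    B = list(independent)       # B_0 = [v_1, ..., v_m]
    for w in basis:             # step j handles w_j
        if not in_span(B, w):   # w_j outside Span(B_{j-1}): append it
            B.append(w)         # B_j = B_{j-1} + [w_j]
    return B                    # B_n: linearly independent, spans V
```

For example, starting from the single vector $(1,1,0)$ in $\mathbb{R}^3$ and scanning the standard basis $e_1,e_2,e_3$, the procedure appends $e_1$, skips $e_2$ (since $e_2=(1,1,0)-e_1$ is already in the span), and appends $e_3$, ending with a three-element basis.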

I don’t understand what the professor did. You can check the video (I have given the time stamp) for the complete context surrounding the proof. The proof seems extremely handwavy. In my proof, I constructed a basis without the help of any other (existing) basis such as $\{w_1,\dots,w_n\}$. Of course, I proved it for a subspace, but the proof is essentially the same for any vector space, so no modification is required. Please help me complete the details of the professor’s proof. Then I will have two approaches in my arsenal for proving this claim (extension to a basis).

Best Answer

An explicit construction of a list of sequences $B_0,B_1,\ldots,B_n$ is given (although it is not written, $B_0=[v_1,\ldots,v_m]$ is the initial sequence). (Also, the sequences are written as sets, which they are not; this is just sloppy.) The crux of the proof is to show by induction on $j$ that two properties hold: (1) $B_j$ is a linearly independent sequence of vectors, and (2) $\def\Sp{\operatorname{Span}} \forall i\leq j:w_i\in\Sp(B_j)$. The passage from $j-1$ to $j$ is quite easy when $w_j\in\Sp(B_{j-1})$, and therefore (by construction) $B_j=B_{j-1}$: all you need to do is distinguish, for (2), the cases $i<j$ and $i=j$. The other case ($w_j\notin\Sp(B_{j-1})$, so $B_j=\text{append}(B_{j-1},w_j)$) is only harder for statement (1), where one needs the fact that appending a vector not in their span to a linearly independent sequence of vectors always gives another linearly independent sequence; but that is an important fact that can be proved directly from the definition.
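That fact about appending a vector outside the span is a standard lemma, and its proof from the definition takes only a few lines (this write-up is mine, not quoted from the video):

```latex
\textbf{Lemma.} If $(u_1,\dots,u_k)$ is linearly independent and
$w \notin \operatorname{Span}(u_1,\dots,u_k)$, then $(u_1,\dots,u_k,w)$
is linearly independent.

\textbf{Proof.} Suppose $a_1 u_1 + \cdots + a_k u_k + b w = 0$.
If $b \neq 0$, then $w = -\tfrac{1}{b}(a_1 u_1 + \cdots + a_k u_k)
\in \operatorname{Span}(u_1,\dots,u_k)$, a contradiction; hence $b = 0$.
But then $a_1 u_1 + \cdots + a_k u_k = 0$, and the independence of
$(u_1,\dots,u_k)$ forces $a_1 = \cdots = a_k = 0$. $\square$
```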

So if one accepts that these properties hold, one obtains for $B_n$ that it is a linearly independent sequence of vectors whose span contains each of the $w_i$ as an element. But since the span of those $w_i$ is the whole vector space$~V$, the span of $B_n$ must be at least as large, and therefore is also all of$~V$. Being both linearly independent and spanning$~V$, the sequence $B_n$ is a basis of$~V$.
