[Math] Linear Independence and Subset Relations

linear algebra

I've been reading the wikibook on Linear Algebra and in the section 'Linear Independence and Subset Relations' it defines the following lemma:

Lemma 1.14:
Any subset of a linearly independent set is also linearly independent. Any superset of a linearly dependent set is also linearly dependent.
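
To make sure I am reading the statement correctly, here is a small example of my own (not from the book) in $\mathbb{R}^3$: the set $\{(1,0,0),(0,1,0),(0,0,1)\}$ is linearly independent, and the lemma says its subset $\{(1,0,0),(0,1,0)\}$ must be independent as well. In the other direction, $\{(1,0,0),(2,0,0)\}$ is linearly dependent because $2\cdot(1,0,0)-1\cdot(2,0,0)=0$, and the superset $\{(1,0,0),(2,0,0),(0,1,0)\}$ stays dependent since
$$2\cdot(1,0,0)-1\cdot(2,0,0)+0\cdot(0,1,0)=0$$
is still a non-trivial relation.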

The following is my proof of the statement "Any subset of a linearly independent set is also linearly independent":
$$S= \{v_1, \dots, v_n\} \mid a_1\cdot v_1 + \dots + a_n\cdot v_n = 0$$
Let $$S_i=S-\{v_i\}$$
$S_i$ is linearly dependent if $$\sum_{j=1,\ j\neq i}^n (b_j\cdot v_j)$$

That is:

$$\sum_{j=1}^n (b_j\cdot v_j) - 0\cdot v_i$$
Letting $a_i = b_i$ and using the constraint in the definition of $S$, we conclude that $S_i$ is linearly independent.


I'm uncertain about the validity/notation of my assignment $a_i = b_i$ in the proof. Is there a more correct approach to this proof? I often find myself needing to make a set of 'placeholder' (for lack of a better word) variables map to an equivalent set of variables, and I'm concerned I'm doing it incorrectly. The book's proof is simply 'This is clear'.


Second Attempt

Let $S$ be a linearly independent set of distinct vectors $v_1,\dots,v_n$ such that $n\in \mathbb{Z}$ and $n\geq1$. Without loss of generality, let $U$ be a linearly dependent subset of $S$ such that $U= \{v_1,\dots,v_i \}$ for some $i<n$. Because $U$ is a linearly dependent set, the element $v_1$ of $U$ can be written as a linear combination of the other elements of $U$ with coefficient $b_1\neq0$, and there must be coefficients which satisfy the equation other than the trivial case $b_2=\dots =b_i=0$.

$$ v_1=(-b_2/b_1 )v_2+\dots +(-b_i/b_1 )v_i $$

Given the independence of $S$, $v_1$ cannot be written as a non-trivial linear combination of the vectors $v_2,\dots ,v_n$ with coefficient $a_1\neq 0$ and not all coefficients $a_2,\dots,a_n$ zero. That is:

$$v_1\neq(-a_2/a_1 )v_2+\dots +(-a_n/a_1 )v_n$$

Expanding out this equation we have:

$$v_1\neq(-a_2/a_1 )v_2+\dots +(-a_i/a_1 )v_i+(-a_n/a_1 )v_n$$

Using the dependence of $U$, which states that $v_1$ can be written as a linear combination of the other elements of $U$ (i.e. $\{v_2,\dots,v_i\}$), where the coefficients are not all zero, we get:

$$v_1\neq(v_1)+(-a_n/a_1 )v_n$$

Now, given that not all the coefficients of $v_2,\dots,v_i$ are zero, we can take $a_n=0$ and still satisfy both conditions of the equation, namely that $a_1\neq 0$ and that not all the coefficients are $0$. This yields
$$v_1\neq v_1$$
This cannot be true, so the assumption that $U$ is dependent must be wrong.

Best Answer

Your first attempt is completely wrong, I'm afraid. The second attempt contains a good idea that can be fixed to provide a proof.

Suppose $\{v_1,v_2,\dots,v_n\}$ is a linearly independent set and that (without loss of generality) $\{v_1,\dots,v_i\}$ is a linearly dependent subset. Then $i\ge1$, because the empty set is linearly independent. Thus, again without loss of generality, we can write $$ v_1=b_2v_2+\dots+b_iv_i $$ which can also be written $$ v_1=b_2v_2+\dots+b_iv_i+0v_{i+1}+\dots+0v_{n} $$ This is a contradiction: moving everything to one side gives a linear relation among $v_1,\dots,v_n$ in which the coefficient of $v_1$ is $1\neq0$, so the relation is non-trivial, contradicting the independence of $\{v_1,\dots,v_n\}$.

However, it's better to use a different definition/characterization of linearly dependent sets: saying that $\{v_1,\dots,v_i\}$ is linearly dependent means that there exist scalars $a_1,\dots,a_i$ not all zero such that $$ a_1v_1+\dots+a_iv_i=0 $$ Now we can rewrite this as $$ a_1v_1+\dots+a_iv_i+a_{i+1}v_{i+1}+\dots+a_nv_n=0 $$ where $a_{i+1}=\dots=a_n=0$. The scalars $a_1,\dots,a_i,a_{i+1},\dots,a_n$ are not all zero, so the set $\{v_1,\dots,v_n\}$ is linearly dependent.
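
For a concrete instance of this padding step (the specific vectors are just an illustration, not part of the wikibook's proof), take $v_1=(1,2)$, $v_2=(2,4)$, $v_3=(0,1)$ in $\mathbb{R}^2$, so $n=3$ and $i=2$. The subset $\{v_1,v_2\}$ is linearly dependent because $2v_1-v_2=0$ with coefficients $2,-1$ not both zero. Setting $a_3=0$ gives
$$2v_1-v_2+0\,v_3=0,$$
and the coefficients $2,-1,0$ are still not all zero, so $\{v_1,v_2,v_3\}$ is linearly dependent, exactly as the general argument says.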
