[Math] Linearly independent

linear algebra

Let $S$ be a linearly independent subset of a vector space $V$, and let $v$ be a vector in $V$ that is not in $S$. Then $S\cup \{v\}$ is linearly dependent if and only if $v\in \operatorname{span}(S)$.

proof)
If $S\cup \{v\}$ is linearly dependent, then there are distinct vectors $u_1,u_2,\dots,u_n$ in $S\cup \{v\}$ such that $a_1u_1+a_2u_2+\dots+a_nu_n=0$ for some nonzero scalars $a_1,a_2,\dots,a_n$.
Because $S$ is linearly independent, one of the $u_i$'s, say $u_1$, equals $v$.

[ADDITION]
The last part of the proof is this: Because S is linearly independent, one of the $u_i$'s, say $u_1$ equals $v$. Thus $a_1v+a_2u_2+\dots+a_nu_n=0$ and so $v$ can be written as a linear combination of $u_2,\dots,u_n$ which are in $S$. By definition of span, we have $v\in span(S)$.

I can't understand the last sentence. I think that since $S$ is linearly independent and $S\cup \{v\}$ is linearly dependent, $v$ can consequently be written as a linear combination of $u_1,u_2,\dots,u_n$. But does this have any relation to that sentence?

(+) I also want to ask a simple question here.
Any subset of a vector space that contains the zero vector is linearly dependent, because $0=1\cdot 0$. But that only shows the claim holds when the set contains a single vector, the zero vector, with coefficient $a_1=1$.
Does it still hold when the set also contains other, nonzero vectors?

Best Answer

Since $S\cup\{v\}$ is linearly dependent and $S$ is linearly independent, then you are correct that $v$ can be written as a linear combination of vectors in $S$. That's what we're trying to show, though, so we can't use that as a fact. Instead, we'll need to use the definition of linear dependence: that there is a collection of distinct vectors $u_1,u_2,...,u_n\in S\cup\{v\}$ and a collection of non-zero scalars $a_1,a_2,...,a_n$ such that $$a_1u_1+a_2u_2+\cdots+a_nu_n=0.\tag{1}$$ Since the $a_i$s are non-zero and $S$ is linearly independent, then we can't have all the $u_i$s in $S,$ so one of them is $v$. Without loss of generality (we can always reindex if we need to), say $u_1=v$, and so the rest are in $S$. Since $a_1\neq 0,$ we can solve $(1)$ to get $$v=u_1=-\frac1{a_1}\left(a_2u_2+\cdots+a_nu_n\right)\in\text{span}\{u_2,...,u_n\}\subseteq\text{span }S.$$
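To make the argument concrete, here is a small worked instance in $\mathbb{R}^2$ (my own example, not from the original answer), following the same steps:

```latex
% Take S = {(1,0), (0,1)}, which is linearly independent, and v = (2,3),
% which is not in S. The set S u {v} satisfies a dependence relation of
% the form (1) with u_1 = v, u_2 = (1,0), u_3 = (0,1) and nonzero scalars
% a_1 = -1, a_2 = 2, a_3 = 3:
\[
(-1)\,v + 2\,(1,0) + 3\,(0,1) = 0.
\]
% Since a_1 = -1 is nonzero, solving for v exactly as in the answer gives
\[
v = -\frac{1}{-1}\bigl(2\,(1,0) + 3\,(0,1)\bigr) = (2,3)
  \in \operatorname{span}\{(1,0),(0,1)\} = \operatorname{span} S.
\]
```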


Note that the definition of linear dependence I used doesn't require that we use all of the vectors in $S\cup\{v\}$ to form our dependence relation $(1)$. A set $S'$ of vectors is linearly dependent if there is some non-empty subset $\{v_1,...,v_k\}$ of $S'$ and some set of non-zero scalars $\{b_1,...,b_k\}$ such that $$b_1v_1+\cdots+b_kv_k=0.$$ We may well do this with $k=1$--for example, if the zero vector is in $S'$, we can always do this, and that's all that was meant. The set containing the zero vector could be infinite, and still be linearly dependent--the fact that the zero vector is in the set is enough for linear dependence all by itself.
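As a sanity check on both points (dependence of $S\cup\{v\}$ when $v\in\operatorname{span} S$, and dependence of any set containing the zero vector), one can test finite sets of vectors numerically: a finite set is linearly dependent exactly when the matrix with those vectors as columns has rank strictly less than the number of vectors. A minimal sketch, assuming NumPy is available (the function name `is_linearly_dependent` is my own):

```python
import numpy as np

def is_linearly_dependent(vectors):
    # A finite list of vectors is linearly dependent iff the matrix having
    # them as columns has rank strictly less than the number of vectors.
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) < len(vectors)

S = [np.array([1.0, 0.0])]                                # independent
print(is_linearly_dependent(S))                           # False
print(is_linearly_dependent(S + [np.zeros(2)]))           # True: contains 0
print(is_linearly_dependent(S + [np.array([2.0, 0.0])]))  # True: v in span(S)
print(is_linearly_dependent(S + [np.array([0.0, 1.0])]))  # False: v not in span(S)
```

Note this numerical rank test is only an illustration for concrete vectors in $\mathbb{R}^n$; the proof above works in any vector space.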