[Math] Is the zero vector in the definition of linear dependence arbitrary?


The definition of linear dependence according to Wikipedia is:

The vectors in a subset $S = \{ v_1, v_2, \ldots, v_k \}$ of a vector space $V$ are said to be linearly dependent if there exist a finite number of distinct vectors $v_1, v_2, \ldots, v_k$ in $S$ and scalars $a_1, a_2, \ldots, a_k$, not all zero, such that
$ a_1 v_1 + a_2 v_2 + \cdots + a_k v_k = 0, $
where zero denotes the zero vector.

I was wondering whether the zero vector in the definition of linear dependence is arbitrary, i.e. whether it could be replaced by some other fixed vector without changing which sets count as dependent?

Thanks,
Jackson

Best Answer

Well, if it were, then we would have a very curious situation. Try replacing $0$ with some fixed vector $v_0 \ne 0$. Then the set $\{ 0 \}$ is independent, but the set $\{ v_0 \}$ isn't!
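To spell that out: under the modified definition, a set is dependent when some combination with scalars not all zero sums to $v_0$. For $\{ v_0 \}$ we have
$ 1 \cdot v_0 = v_0, $
so it is dependent, while for $\{ 0 \}$ every combination gives
$ a \cdot 0 = 0 \ne v_0, $
so it is independent.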

To make things worse: if you had two vectors $a$ and $b$ such that $\{ a, b, v_0 \}$ was independent in the standard sense, then the set $\{ \lambda a + \mu b \mid \lambda, \mu \in \mathbb{R} \}$ is independent, despite being a whole subspace!
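To see why: if some combination from that subspace hit $v_0$, say
$ \alpha a + \beta b = v_0, $
then rearranging gives
$ \alpha a + \beta b - 1 \cdot v_0 = 0, $
a dependence among $\{ a, b, v_0 \}$ in the standard sense, contradicting our assumption. So no combination from the span ever reaches $v_0$, and the whole subspace counts as "independent" under the modified definition.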

EDIT: Maybe a specific example will help. Say we define "dependent" to mean "there is a linear combination, with scalars not all zero, that sums to $\langle 1,1,1 \rangle$", so that "independent" means no such combination exists. Then the set $\{ \langle 1, 0, 0 \rangle, \langle 0, 1, 0 \rangle, \langle 1, 1, 0 \rangle \}$ is independent (no matter what combination you take, the $z$-component is zero, not one, so you can never get $ \langle 1, 1, 1 \rangle $). But this is clearly silly, because one vector is the sum of the other two, and so whatever our definition is describing, it doesn't capture the notion of independence.
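Indeed, the hidden (standard) dependence is
$ 1 \cdot \langle 1, 0, 0 \rangle + 1 \cdot \langle 0, 1, 0 \rangle - 1 \cdot \langle 1, 1, 0 \rangle = \langle 0, 0, 0 \rangle. $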
