[Math] Why doesn’t the definition of dependence require that one can express each vector in terms of the others?

intuition, linear-algebra, linear-transformations

I was reviewing my foundations in linear algebra and realized that I am confused about independence and dependence. I understand that by definition independence means:

A set of vectors $\{x_1,\ldots,x_k\}$ is independent if the only linear combination that gives the zero vector is the trivial one, i.e. $[x_1, \ldots, x_k]c = Xc = 0$ iff $c=0$.
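For instance (an illustrative choice of vectors), taking $x_1 = (1,0)$ and $x_2 = (0,1)$ in $\mathbb{R}^2$:

$$c_1 \begin{bmatrix} 1 \\ 0 \end{bmatrix} + c_2 \begin{bmatrix} 0 \\ 1 \end{bmatrix} = \begin{bmatrix} c_1 \\ c_2 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix} \iff c_1 = c_2 = 0,$$

so this set is independent under the standard definition.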

I understand what the definition says, but it sort of goes against my intuition of what the definition of dependence should be (and hence its negation, independence). In my head, dependence intuitively means that the vectors in a set depend on each other. In other words, one should always be able to express each vector as a linear combination of the others. Something like:

$$ \forall x_i \in \{x_1,\ldots,x_k\}, \exists c \neq 0 : \sum_{j \neq i} c_j x_j = x_i$$

However, my definition above (which I know is wrong and is not the standard definition; I am trying to come to terms with why it's wrong) implies that a set of independent vectors with the zero vector tacked on is not dependent (i.e. is independent), which is the opposite of what it should be: tack on the zero vector and the set remains independent. This should be wrong, because $[0,\ldots,0,1]$ is not the zero vector, and under the standard definition only the zero coefficient vector should give $0$.

As a simple example, consider $\{x_1, x_2, \mathbf{0}\}$, where $x_1, x_2$ give zero only with the zero coefficient vector (the standard definition of independence). In reality these vectors should be dependent, because $[0,0,1]$ is now in the nullspace, and a set is independent only if the nullspace contains just the zero vector. But with my definition the vectors are independent, because there is no way to express any of them in terms of the others. For example:

  1. $a x_1 + b x_2 = \mathbf{0}$
  2. $c x_1 + d\, \mathbf{0} = x_2$
  3. $e x_2 + f\, \mathbf{0} = x_1$

None of the above can be made true with non-zero (non-trivial) linear combinations. Thus the vectors are not dependent, so they are independent. I know it's sort of an "edge case" of the definition, but it sort of flipped my world to find out that I've been thinking about such a fundamental concept as independence and dependence wrongly in linear algebra, and I'm trying to come to terms with it.
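Just to double-check that claim numerically, here is a minimal sketch (assuming NumPy; the particular vectors are an illustrative choice):

```python
import numpy as np

# Illustrative choice: x1, x2 are independent vectors in R^3
x1 = np.array([1.0, 0.0, 0.0])
x2 = np.array([0.0, 1.0, 0.0])
zero = np.zeros(3)

# Matrix whose columns are x1, x2, and the zero vector
X = np.column_stack([x1, x2, zero])

# A set of columns is independent iff the rank equals the column count
print(np.linalg.matrix_rank(np.column_stack([x1, x2])))  # 2 of 2 -> independent
print(np.linalg.matrix_rank(X))                          # 2 of 3 -> dependent

# c = [0, 0, 1] is a nontrivial vector in the nullspace: X @ c = 0
c = np.array([0.0, 0.0, 1.0])
print(X @ c)  # [0. 0. 0.]
```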

Why is my intuition incorrect? Why was the standard definition of independence, $Xc = 0 \iff c=0$, the accepted one? What's wrong with my definition? Are they essentially the same definition except for this weird edge case?


A last footnote about what the word "dependence" means with respect to the number zero and the zero vector. I think my remaining confusion boils down to why $0x = \mathbf{0}$ is considered $\mathbf{0}$ depending on $x$. In my head, saying that we don't need any of $x$ to express $\mathbf{0}$ seems to mean that $\mathbf{0}$ doesn't need $x$ (or any other vector). But the convention, according to everything pointed out in this set of answers, is the opposite. I don't understand why. Is it just that having an equation linking terms means dependence, even if we specify with a zero coefficient that we don't actually need the term?

Best Answer

Your intuition for linear (in)dependence is very close. Based on your intuition, the definition you're looking for is:

$\{v_1, \ldots, v_k\}$ is linearly dependent if there exists an index $i$ and scalars $c_j$ for $j \ne i$ such that $v_i = \sum_{j \ne i} c_j v_j.$

You can prove that this is equivalent to the standard definition.
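A sketch of half of that proof, to make the link explicit: if $\sum_j c_j v_j = 0$ with some $c_i \neq 0$, then dividing by $c_i$ and rearranging gives

$$v_i = \sum_{j \ne i} \left( -\frac{c_j}{c_i} \right) v_j,$$

so any nontrivial combination summing to zero lets you solve for at least one of the vectors. The converse rearrangement appears under point (2) below.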

Notice how this differs from your proposed definition:

(1) It says there exists a $v_i$, not for all $v_i$.

(2) There is no restriction that the $c_j$ be nonzero.

(1) is important because all it takes is a single redundancy to get linear dependence. Not all vectors have to be expressible in terms of the others. To see why, think about the case where a set $\{v_1, \ldots, v_k\}$ is already dependent and I then add a $v_{k+1}$ which cannot be expressed as a linear combination of $v_1, \ldots, v_k$. Adding a vector to a dependent set shouldn't turn it into an independent set.
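A concrete instance (the specific vectors are illustrative): in $\mathbb{R}^2$, the set $\{(1,0), (2,0)\}$ is dependent since $(2,0) = 2 \cdot (1,0)$. Adding $(0,1)$, which is not a combination of the first two, leaves the set dependent: the redundancy $(2,0) = 2\cdot(1,0) + 0\cdot(0,1)$ is still there, even though $(0,1)$ itself cannot be expressed in terms of the others. Under the "for all" version, this set would wrongly count as independent.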

As for (2), the standard definition needs to say that the $c$'s can't all be $0$ because you don't want $\sum_i 0\, v_i = 0$ to imply dependence. But with the above definition, you've already singled out a vector to have a coefficient of $1$ (which is not $0$), so you don't need any condition on the $c$'s anymore.
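To spell that out, moving $v_i$ to the other side turns the expression into a combination that is automatically nontrivial:

$$v_i = \sum_{j \ne i} c_j v_j \iff 1 \cdot v_i - \sum_{j \ne i} c_j v_j = 0,$$

where the $i$-th coefficient is $1 \neq 0$ even if every other $c_j$ is $0$. This is exactly what happens with $v_i = \mathbf{0}$: taking every $c_j = 0$ expresses $\mathbf{0}$ in terms of the others, and the rearranged combination $1 \cdot \mathbf{0} = \mathbf{0}$ is nontrivial even though it "uses" nothing else, which is why tacking $\mathbf{0}$ onto any set makes it dependent.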