Why is the span of the empty set defined to be $\{0\}$? It is known that the span of any nonempty set of vectors in a vector space $V$ is a subspace of $V$, and in “Linear Algebra Done Right” Axler states that, to be consistent with this, the span of the empty set is defined to be $\{0\}$. Is this the only reason, or does this definition prove useful in other ways later on?
[Math] Linear span of the empty set
linear algebra
Related Solutions
Remember that a subspace by definition is closed under vector addition and scalar multiplication. That means that every subspace which contains $S$ necessarily contains every linear combination of elements of $S$. In turn, then, the intersection of all such subspaces is exactly the set of all linear combinations of vectors in $S$.
One way to think about a linearly independent list of vectors is:
A list of vectors is linearly independent if and only if removing any vector from the list will result in a list whose span is strictly smaller than that of the original list.
Intuitively, the list is minimal for its span: remove any vector and you get a strictly smaller span. The list has no linear redundancies.
Another, more intrinsic way of thinking about linearly independent lists is:
A list of vectors is linearly independent if and only if no vector in the list is a linear combination of the other vectors in the list.
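For vectors in $\mathbb{R}^n$, both characterizations can be checked numerically: a list is linearly independent exactly when the matrix whose rows are the vectors has rank equal to the length of the list. A minimal sketch using numpy (the helper name is my own):

```python
import numpy as np

def is_linearly_independent(vectors):
    """A list of vectors (given as rows) is linearly independent iff
    the matrix they form has rank equal to the number of vectors."""
    A = np.array(vectors, dtype=float)
    return int(np.linalg.matrix_rank(A)) == len(vectors)

# [1,0] and [0,1] are independent; adding [1,1] = [1,0] + [0,1]
# introduces a linear redundancy.
print(is_linearly_independent([[1, 0], [0, 1]]))          # True
print(is_linearly_independent([[1, 0], [0, 1], [1, 1]]))  # False
```

Removing the redundant vector $[1,1]$ restores independence without shrinking the span, matching the "minimality" intuition above.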
One way to think about a spanning set for a vector space is:
A list of vectors in $V$ is a spanning set if every vector of $V$ is in the span of the list.
Intuitively, the list is "sufficient" to get you all vectors in $V$ (via linear combinations).
Note that "linearly independent" is intrinsic: it depends on the vectors (and the vector space operations), and only on them. Whereas "spanning set" is extrinsic: whether a set of vectors spans depends on which vector space you are working on. (E.g., $\{1,x,x^2\}$ is a spanning set for the vector space of real polynomials of degree at most $2$, but not for the vector space of all real polynomials.)
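The extrinsic nature of spanning shows up concretely when testing whether a list spans $\mathbb{R}^n$: the test compares a rank against the ambient dimension $n$, so the same vectors can span one space but not a larger one. A small sketch (the helper name is mine):

```python
import numpy as np

def spans_Rn(vectors, n):
    """Vectors (rows, each of length n) span R^n iff the rank of the
    matrix they form equals the ambient dimension n."""
    A = np.array(vectors, dtype=float)
    return int(np.linalg.matrix_rank(A)) == n

# The standard basis vectors span R^2...
print(spans_Rn([[1, 0], [0, 1]], 2))        # True
# ...but the "same" vectors embedded in R^3 do not span R^3.
print(spans_Rn([[1, 0, 0], [0, 1, 0]], 3))  # False
```

Note that the rank computation itself is intrinsic; only the comparison against $n$ depends on the ambient space, which mirrors the intrinsic/extrinsic distinction above.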
What the Lemma says is that spanning sets have to be at least as large as linearly independent sets.
This is not trivial, and in fact turns on the fact that your scalars come from a field. To see that the assertion is not trivial, imagine that, instead of a vector space where you can multiply by any element of the field, we only allow linear combinations with integer coefficients, and consider the "vector space" (in fact, it is called a module, or a $\mathbb{Z}$-module) of all integers. Here, the list consisting of $2$ and $3$ is minimal: you can get any integer with an (integral) linear combination of $2$ and $3$; but if you drop either of them, you can't get them all. You will get either just multiples of $2$, or just multiples of $3$.
On the other hand, the set consisting only of $1$ is a spanning set: every integer is an integer linear combination of $1$. So in this situation, we have a "linearly independent" set (a minimal set with respect to linear combinations) that has more elements than a spanning set.
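The integer example above can be verified with Bezout's identity: integer combinations $x a + y b$ reach exactly the multiples of $\gcd(a, b)$, so a list of integers generates all of $\mathbb{Z}$ precisely when its gcd is $1$. A quick sketch (the helper name is hypothetical):

```python
from math import gcd

def generates_Z(*generators):
    """Integer combinations of the generators reach exactly the
    multiples of their gcd (Bezout), so they generate all of Z
    iff that gcd is 1."""
    g = 0
    for a in generators:
        g = gcd(g, a)
    return g == 1

print(generates_Z(2, 3))  # True: e.g. 1 = (-1)*2 + 1*3
print(generates_Z(2))     # False: only even numbers are reachable
print(generates_Z(3))     # False: only multiples of 3
print(generates_Z(1))     # True: {1} generates, with fewer elements

# Concretely, every n is an integer combination of 2 and 3: n = n*3 + (-n)*2.
assert all(n == n * 3 + (-n) * 2 for n in range(-10, 11))
```

So $\{2, 3\}$ is a minimal generating set with more elements than the generating set $\{1\}$, exactly the pathology that cannot happen over a field.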
(Caveat: There are multiple ways of defining "linearly independent", which are equivalent in a vector space; in this setting, they wouldn't be. For example, the definition that says that a list of vectors $v_1,\ldots,v_k$ is linearly independent if and only if whenever we have a linear combination equal to $0$, $\alpha_1 v_1+\cdots + \alpha_k v_k = \mathbf{0}$, all scalars must be zero: $\alpha_1=\cdots=\alpha_k=0$. Under this definition, the list I gave would not be "integrally linearly independent" because we can get $0$ as $3(2) -2(3) = 0$.)
In a vector space, the most basic relationship between linearly independent sets and spanning sets is that of the Lemma. In fact, the lemma can be refined to say that every linearly independent set can be extended to a set that is both linearly independent and spans; and every spanning set contains a spanning set that is linearly independent. From this you will show that any two linearly independent spanning sets for the same vector space have the same number of elements. That number is called the "dimension" of the vector space, and it is an invariant of fundamental importance in Linear Algebra.
Best Answer
If you want to stay consistent, you almost never have a choice for such "empty" definitions.
Here, the span of $X$ is the set of linear combinations $\sum_{x\in X} \lambda_x x$. So the question boils down to what an empty sum is. It has to be $0$, because when you add an empty sum to $s$, you want to get $s$. An empty operation is always the neutral element for that operation, just as an empty product is $1$.
So here, $\operatorname{Span}(\emptyset)$ is the set of all possible empty sums, which is $\{0\}$. It is also worth noting that this is consistent with the fact that for any set $S$, $\operatorname{Span}(S)$ is a vector space.
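Python happens to follow the same convention for empty operations, which makes a handy illustration: the empty sum is the additive identity and the empty product is the multiplicative identity.

```python
import math

# An empty operation yields the neutral element of that operation:
# the empty sum is 0, the empty product is 1.  This is the same
# convention that forces span(empty set) = {0}: the only "empty
# linear combination" of no vectors is the zero vector.
print(sum([]))        # 0
print(math.prod([]))  # 1
```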