[Math] Definition of basis in infinite-dimensional vector space

linear algebra

I am struggling to understand the definition of a basis in an infinite-dimensional vector space. Specifically, the definition I know says: A subset $B$ of a vector space $V$ is a basis for $V$ if every element of $V$ can be written in a unique way as a finite linear combination of elements from $B$.

However, for any subset $X$ of a vector space $V$ containing at least two elements, the zero element of the space can be written in more than one way as a finite linear combination of elements from $X$. For example, $0 = 0v = 0w$, where $v \neq w$ are from $X$. Therefore, no such subset $X$ of a vector space $V$ could be a basis for $V$.

What am I missing? What exactly does the definition mean?

Best Answer

$\newcommand{\Reals}{\mathbf{R}}$Let $(V, +, \cdot)$ be a real vector space. One commonly says a family $S = (v_{i})_{i \in I}$ of vectors indexed by a set $I$ is a basis of $V$ if the following hold:

  • $S$ spans $V$, i.e., for every $v$ in $V$, there exists a function $c:I \to \Reals$, non-zero for only finitely many $i$, such that $v = \sum_{i} c(i) v_{i}$.

  • $S$ is linearly independent, i.e., if $c:I \to \Reals$ is a function, non-zero for only finitely many $i$, and if $0 = \sum_{i} c(i) v_{i}$, then $c(i) = 0$ for all $i$.
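The two conditions above can be made concrete with a small sketch (my own illustration, not part of the answer): take $V = \Reals[x]$, the space of polynomials, with basis $S = (x^{i})_{i \in \mathbf{N}}$. A coefficient function $c : \mathbf{N} \to \Reals$ that is non-zero at only finitely many indices is naturally modeled as a dict holding just those indices.

```python
# Sketch: V = R[x] with basis S = (x^i)_{i in N}.  A coefficient function
# c : N -> R with finite support is a dict storing only the finitely many
# indices i where c(i) != 0.

def evaluate(c, x):
    """Evaluate the combination sum_i c(i) * x^i at the point x.
    Although the sum formally ranges over all of S, only the finitely
    many stored indices contribute, so the sum is finite."""
    return sum(coeff * x**i for i, coeff in c.items())

# v = 3 + 2x^5: the coefficient function is non-zero at exactly two
# indices, namely c(0) = 3 and c(5) = 2.
c = {0: 3.0, 5: 2.0}
assert evaluate(c, 2.0) == 3.0 + 2.0 * 2.0**5  # = 67.0
```

The dict representation makes "finite support" automatic: an index with coefficient zero is simply not stored, so every vector determines one coefficient dict.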

In this framework, "uniqueness of representation" is a (simple) theorem: If $v \in V$, there exists a unique function $c:I \to \Reals$, non-zero for at most finitely many $i$, such that $v = \sum_{i} c(i) v_{i}$. (Existence is spanning; for uniqueness, if $c$ and $c'$ both represent $v$, then $0 = \sum_{i} \bigl(c(i) - c'(i)\bigr) v_{i}$ with $c - c'$ non-zero at only finitely many $i$, so linear independence forces $c = c'$.)

The point is, you can (and for uniqueness, should) view a linear combination from $S$ as a sum over all elements of $S$, but in such a way that at most finitely many summands have a non-zero coefficient.
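This viewpoint dissolves the question's worry directly. A sketch (my own illustration, with hypothetical basis indices $1$ and $2$ standing for $v$ and $w$): once a "linear combination" is identified with its coefficient function, the two expressions $0 = 0v$ and $0 = 0w$ are not two representations at all, because dropping zero coefficients reduces both to the same zero function.

```python
# Sketch: a linear combination from S *is* its coefficient function.
# Dropping zero coefficients gives each combination one canonical form,
# since a summand 0 * v_i contributes nothing to the sum.

def canonical(c):
    """Return the coefficient dict with zero coefficients removed."""
    return {i: a for i, a in c.items() if a != 0}

# The question's "two representations" of 0, namely 0*v and 0*w (here
# index 1 stands for v and index 2 for w), are the SAME coefficient
# function, the one that is zero everywhere:
assert canonical({1: 0}) == canonical({2: 0}) == {}
```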