[Math] Showing that every subspace $S\subset \mathbb{R}^n$ has a basis

linear algebra

We will show that every subspace $S\subset \mathbb{R}^n$ has a basis. We note that the empty set $\{\}$ is a basis for the zero subspace $\left\{0\right\}$, since a linear combination (sum) of an empty set of vectors is defined to be the zero vector (similarly, the empty product is defined to be $1$).
1. Let $B_0 := \{\}$. Show that if $S\neq \text{span}(B_0)$, then there is a vector $v_1\in S \setminus \text{span}(B_0)$, for which $B_1 := \{v_1\}$ is a linearly independent subset of $S$.
2. Show that if $S\neq \text{span}(B_1)$, then there is a vector $v_2\in S\setminus\text{span}(B_1)$ for which $B_2 = \{v_1,v_2\}$ is a linearly independent subset of $S$.
3. Continuing in this way, show there must be an $m$ for which $B_m = \{v_1,v_2,\ldots,v_m\}$ is a linearly independent subset of $S$ which spans $S$, i.e., a basis for $S$.
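
For concreteness, steps 1-3 can be run as a small numerical sketch, assuming $S$ is handed to us as the span of a finite list of vectors (always possible for a subspace of $\mathbb{R}^n$); the helpers in_span and greedy_basis below are ad hoc names, not library functions:

    import numpy as np

    def in_span(v, B, tol=1e-10):
        # v lies in span(B) iff stacking v onto B does not raise the rank
        if not B:
            return bool(np.allclose(v, 0.0, atol=tol))
        M = np.array(B)
        return np.linalg.matrix_rank(np.vstack([M, v])) == np.linalg.matrix_rank(M)

    def greedy_basis(spanning_set):
        # Steps 1-3: keep adjoining a vector of S that lies outside span(B_m)
        B = []                       # B_0 = {}
        for v in spanning_set:       # candidate vectors drawn from S
            if not in_span(v, B):    # v in S \ span(B_m)
                B.append(v)          # B_{m+1} = B_m with v adjoined stays independent
        return B                     # now span(B) = S

    # Example: S = span{(1,1,0), (2,2,0), (0,0,1)}, a plane in R^3
    S_gens = [np.array([1., 1., 0.]), np.array([2., 2., 0.]), np.array([0., 0., 1.])]
    print(greedy_basis(S_gens))      # keeps (1,1,0) and (0,0,1), so dim S = 2

Each skipped vector was already in the current span, and the span only grows, so the loop ends with a linearly independent $B$ that spans $S$.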

Any help would be appreciated!!!

Best Answer

You may use the fact that $S$ is a subspace of $\mathbb{R}^A$, where $A$ is a finite set and $\mathbb{R}^A$ is the set of all functions $x:A\to\mathbb{R}$ (for the problem at hand, $A=\{1,\ldots,n\}$, so that $\mathbb{R}^A$ is just $\mathbb{R}^n$).

This is not the place to say much more, but the construction below has several important implications which go beyond the scope of the given question.

=====================================

The case $A=\emptyset$ is trivial, so assume $A\ne\emptyset$.

Define (construct by induction) a decreasing sequence of linear subspaces

$$ S=S_1 \supseteq S_2 \supseteq S_3 \supseteq \cdots $$ and two sequences of elements $b_k\in S_k$ and $a_k\in A$ $(k=1,2,\ldots)$

-- both sequences being a priori finite or infinite -- subject to the following four conditions:

  • if $S_n=\{0\}$, the construction stops;
  • if $S_n\ne\{0\}$, then let $b_n\in S_n\setminus\{0\}$ be arbitrary; and then
  • choose $a_n\in A$ with $b_n(a_n)\ne 0$ (such an $a_n$ exists because $b_n\ne 0$; automatically $a_n\notin\{a_k: k<n\}$, since $b_n\in S_n$ vanishes at $a_1,\ldots,a_{n-1}$);
  • $S_{n+1}\ :=\ \{\,x\in S_n:\ x(a_n)=0\,\}\ =\ \{\,x\in S:\ x(a_k)=0\ \text{ for } k=1,\ldots,n\,\}.$

The above sequences are finite, since the $a_k$ are distinct elements of the finite set $A$ -- at some point we reach $S_{n+1}=\{0\}$. Then

$$ \{b_1,\ldots,b_n\} $$

is a basis of $S$: it is linearly independent because of the triangular pattern $b_k(a_j)=0$ for $j<k$ while $b_k(a_k)\ne 0$, and it spans $S$ because from any $x\in S$ one can successively subtract suitable multiples of $b_1,\ldots,b_n$ so that the difference lands in $S_{n+1}=\{0\}$. Obviously that last index satisfies $n\le|A|$, i.e.

$$ \dim S\ \le\ \dim(\mathbb R^A)\ =\ |A| $$
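
A minimal sketch of this construction, under the same assumption that $S$ is presented as the span of finitely many vectors, with $A=\{0,\ldots,n-1\}$; pivot_basis is an ad hoc name. The stepping rule is that replacing each remaining spanning vector $x$ by $x-\frac{x(a_n)}{b_n(a_n)}\,b_n$ yields a spanning set of $S_{n+1}$:

    import numpy as np

    def pivot_basis(spanning, tol=1e-10):
        # 'current' always holds a finite list of vectors spanning S_n
        current = [np.asarray(v, dtype=float) for v in spanning]
        basis, pivots = [], []                 # the b_k and the a_k
        while True:
            current = [x for x in current if np.linalg.norm(x) > tol]
            if not current:                    # S_n = {0}: construction stops
                break
            b = current.pop()                  # arbitrary b_n in S_n \ {0}
            a = int(np.argmax(np.abs(b)))      # a coordinate a_n with b_n(a_n) != 0
            basis.append(b)
            pivots.append(a)
            # each updated vector lies in S_n and vanishes at a_n,
            # and together the updated vectors span S_{n+1}
            current = [x - (x[a] / b[a]) * b for x in current]
        return basis, pivots

    # Example: S = span{(1,1,0), (2,2,0), (0,1,1)} in R^3
    B, piv = pivot_basis([[1, 1, 0], [2, 2, 0], [0, 1, 1]])
    print(len(B), piv)                         # 2 basis vectors, 2 distinct pivots

The returned pivots are distinct coordinates, mirroring $a_n\notin\{a_k:k<n\}$, which is what bounds the number of steps by $|A|$.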

=====================================

REMARK 1. The theorem about bases holds for arbitrary linear spaces (over arbitrary fields), including infinite-dimensional spaces -- you don't even need to talk about subspaces (every linear subspace is itself a linear space, and that's all you need to know).

However, the above proof for the finite-dimensional case (which works over arbitrary fields, of course) is especially useful in linear algebra.

REMARK 2. In general, once infinite-dimensional spaces are included, the theorem about bases requires transfinite induction or some other set-theoretic principle equivalent to the axiom of choice. The axiom of choice is genuinely necessary: ignoring it completely, i.e., working in ZF alone, one cannot prove the theorem about bases -- in fact, the statement that every vector space has a basis implies the axiom of choice.
