Every vector is a linear combination of $n$ linearly independent vectors

linear-algebra, vectors

In $\mathbb R^n$, let $\mathbf{v}_{1}$, $\mathbf{v}_{2}$, $\dots$, and $\mathbf{v}_{n}$ be linearly independent vectors (i.e. if $\alpha_{1}\mathbf{v}_{1}+\alpha_{2}\mathbf{v}_{2}+\dots+\alpha_{n}\mathbf{v}_{n}=\mathbf{0}$ and $\mathbf{v}_{1}$, $\mathbf{v}_{2}$, $\dots$, and $\mathbf{v}_{n}$ are distinct, then $\alpha_{1}=\alpha_{2}=\dots=\alpha_{n}=0$). Let $\mathbf{u}$ be a vector in $\mathbb R^n$.

Show that $\mathbf{u}$ is a linear combination of $\mathbf{v}_{1}$, $\mathbf{v}_{2}$, $\dots$, and $\mathbf{v}_{n}$ (i.e. $\mathbf{u}=\beta_{1}\mathbf{v}_{1}+\beta_{2}\mathbf{v}_{2}+\dots+\beta_{n}\mathbf{v}_{n}$ for some $\beta_{1}, \beta_{2},\dots,\beta_{n}\in \mathbb R$).

(We have already proven that if $\mathbf{v}_{1}$, $\mathbf{v}_{2}$, $\dots$, and $\mathbf{v}_{n}$ are linearly independent, then each vector cannot be written as a linear combination of the other vectors.)


Context: High school

Goal (not sure if this is possible): Prove using only rules of arithmetic with real numbers and vectors (e.g. $(1,2)+(3,4)=(4,6)$, $3(1,2)=(3,6)$, $-(1,2)=(-1,-2)$) and without introducing such additional concepts as span, basis, dimension, vector space, matrix, row reduction, inverse of matrix, etc.

Best Answer

Here's an induction proof on $n$.

If $v_1$ is linearly independent in $\Bbb{R}$, then $v_1 \neq 0$ (otherwise $1 \cdot v_1 = 0$ with coefficient $1 \neq 0$). We can then write any $x \in \Bbb{R}$ as: $$x = \frac{x}{v_1} v_1.$$ This establishes the base case.

Now, suppose that, for any $x \in \Bbb{R}^n$ and any linearly independent $w_1, \ldots, w_n \in \Bbb{R}^n$, there exist $b_1, \ldots, b_n \in \Bbb{R}$ such that $x = b_1w_1 + \ldots + b_n w_n$.

Consider a linearly independent list of vectors $v_1, \ldots, v_{n+1} \in \Bbb{R}^{n+1}$. Note, if it is the case that every one of the standard basis vectors $e_1, \ldots, e_{n+1}$ can be formed as a linear combination of $v_1, \ldots, v_{n+1}$, then we are done, since linear combinations of linear combinations are linear combinations.

Otherwise, let us suppose $e_i$ cannot be formed as a linear combination of $v_1, \ldots, v_{n+1}$. Let: $$w_k = v_k - (v_k \cdot e_i) e_i,$$ where $\cdot$ is the dot product, and hence $w_k$ is simply $v_k$ with its $i$th entry set to $0$. I claim that $w_1, \ldots, w_{n+1}$ are linearly independent.
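As a concrete illustration (a small Python sketch, not part of the proof), the map $w_k = v_k - (v_k \cdot e_i)e_i$ really does just zero out the $i$th entry; vectors are plain lists, and `zero_ith` is a name introduced here for illustration:

```python
# Illustration: w = v - (v . e_i) e_i zeroes the i-th entry of v.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def e(n, i):
    # Standard basis vector e_i of R^n (i is 0-indexed here for convenience).
    return [1.0 if j == i else 0.0 for j in range(n)]

def zero_ith(v, i):
    # w = v - (v . e_i) e_i
    ei = e(len(v), i)
    c = dot(v, ei)  # equals v[i]
    return [vj - c * eij for vj, eij in zip(v, ei)]

print(zero_ith([3.0, -2.0, 5.0], 1))  # [3.0, 0.0, 5.0]
```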

Suppose $a_1, \ldots, a_{n+1}$ are such that $$0 = \sum_{k=1}^{n+1} a_k\big(v_k - (v_k \cdot e_i)e_i\big) = \sum_{k=1}^{n+1} a_kv_k - \left(\sum_{k=1}^{n+1} a_k (v_k \cdot e_i)\right)e_i.$$ Note that, if $\sum_{k=1}^{n+1} a_k (v_k \cdot e_i) = 0$, then $\sum_{k=1}^{n+1} a_kv_k = 0$, so by linear independence of $v_1, \ldots, v_{n+1}$, we have $a_1 = \ldots = a_{n+1} = 0$, and we are done. Otherwise, dividing by this nonzero scalar gives $$e_i = \left(\sum_{j=1}^{n+1} a_j (v_j \cdot e_i)\right)^{-1} \sum_{k=1}^{n+1} a_k v_k,$$ which contradicts our assumption about $e_i$. So, the vectors $w_1, \ldots, w_{n+1}$ are linearly independent.

But, since all the vectors $w_1, \ldots, w_{n+1}$ have $0$ in their $i$th coordinate, we can simply embed them in $\Bbb{R}^n$, by removing this coordinate. This gives us $n+1$ linearly independent vectors in $\Bbb{R}^n$.
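The "embedding" step is nothing more than deleting the (now zero) $i$th coordinate; in code (again purely illustrative, with a hypothetical helper name `drop`):

```python
# Illustration: since w has 0 in its i-th coordinate, deleting that
# coordinate identifies w with a vector in R^n without losing information.

def drop(w, i):
    return w[:i] + w[i + 1:]

print(drop([3.0, 0.0, 5.0], 1))  # [3.0, 5.0]
```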

If we just look at the first $n$ of these vectors, which are automatically linearly independent, then the $(n+1)$th vector is a linear combination of the first $n$, by the induction hypothesis. Re-inserting the zero $i$th coordinate carries this relation back to $\Bbb{R}^{n+1}$, giving a nontrivial linear dependence among $w_1, \ldots, w_{n+1}$, which is a contradiction. Thus, no such standard basis vector $e_i$ exists, so every $e_i$, and hence every vector, can be expressed as a linear combination of $v_1, \ldots, v_{n+1}$.
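The conclusion can be sanity-checked numerically; the sketch below uses `numpy.linalg.solve` (matrix machinery the proof deliberately avoids, so this is a check of the statement, not of the argument) with three concrete independent vectors in $\Bbb{R}^3$:

```python
# Sanity check of the theorem: an arbitrary u in R^3 is a linear
# combination of three linearly independent vectors v1, v2, v3.
import numpy as np

v1, v2, v3 = [1.0, 0.0, 1.0], [1.0, 1.0, 0.0], [0.0, 1.0, 1.0]
V = np.column_stack([v1, v2, v3])  # columns are the v_k; det(V) = 2 != 0

u = np.array([2.0, 3.0, 4.0])
beta = np.linalg.solve(V, u)  # coefficients beta_1, beta_2, beta_3

# Verify u = beta_1 v1 + beta_2 v2 + beta_3 v3.
assert np.allclose(beta[0] * np.array(v1)
                   + beta[1] * np.array(v2)
                   + beta[2] * np.array(v3), u)
```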