Prove that n-1 vectors cannot span an n-dimensional vector space

linear-algebra vector-spaces

I want to prove that $n-1$ vectors in an $n$-dimensional space (say $\mathbb{R}^n$) cannot span the $n$-dimensional space. How can I do this?

One way is to find a vector that cannot be obtained as a linear combination of the $n-1$ vectors. But how can I find such a vector? (I cannot use the concept of fundamental subspaces.)

Another way is to assume that $n-1$ vectors span $\mathbb{R}^n$ and arrive at a contradiction. What contradiction can I find?

I'm allowed to use only the definitions of vector space, subspace, linear independence, and span of a set of vectors. I'm not allowed to use the concepts of basis, fundamental subspaces, etc. Is it even possible to prove this?

Best Answer

Suppose, for contradiction, that the following two matrices have the same column span, where $\mathbf e_k$ denotes the $k$-th standard basis vector:
$\mathbf {B} :=\bigg[\begin{array}{c|c|c|c|c} \mathbf e_1 & \mathbf e_2 &\cdots & \mathbf e_{n-1} & \mathbf e_{n}\end{array}\bigg]$ and $\mathbf {B}' :=\bigg[\begin{array}{c|c|c|c} \mathbf v_1 & \mathbf v_2 &\cdots & \mathbf v_{n-1}\end{array}\bigg]$

Since the spans are the same, each column of $\mathbf B$ can be written as a linear combination of the columns of $\mathbf B'$. That is,

$\mathbf B = \mathbf B' A$

where $A$ is short ($n-1$ rows) and fat ($n$ columns). On one hand, $\mathbf B\mathbf y = \mathbf 0\implies \mathbf y = \mathbf 0$ by linear independence of the standard basis vectors. On the other hand, $A\mathbf y =\mathbf 0$ for some $\mathbf y \neq \mathbf 0$ (shown below), so $\mathbf B\mathbf y = \mathbf B' A\mathbf y = \mathbf B'\mathbf 0 = \mathbf 0$ while $\mathbf y \neq \mathbf 0$, which is a contradiction.
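To see the mechanics in the smallest case, here is a sketch with $n = 2$ and a single vector $\mathbf v_1$; the entries $a_1, a_2$ and the coefficients $c_1, c_2$ are introduced only for this illustration. If $\mathbf v_1 = (a_1, a_2)^T$ spanned $\mathbb{R}^2$, then

$\mathbf B = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}, \quad \mathbf B' = \begin{bmatrix} a_1 \\ a_2 \end{bmatrix}, \quad A = \begin{bmatrix} c_1 & c_2 \end{bmatrix}, \quad \mathbf B = \mathbf B' A.$

The $1\times 2$ matrix $A$ annihilates $\mathbf y = (c_2, -c_1)^T$, which is nonzero unless $c_1 = c_2 = 0$ (in which case any nonzero $\mathbf y$ works, since $A = 0$). Then $\mathbf B\mathbf y = \mathbf B' A\mathbf y = \mathbf 0$ forces $\mathbf y = \mathbf 0$, a contradiction.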

How do you know there is a nonzero $\mathbf y$ in the kernel of $A$? If the first $n-1$ columns of $A$ are linearly dependent, then the definition of linear dependence directly produces such a $\mathbf y$ (with last entry $0$). Otherwise the first $n-1$ columns are linearly independent; use row reduction or Cramer's rule on that $(n-1)\times(n-1)$ matrix to solve $A_{(n-1)\times(n-1)}\mathbf x = -\mathbf a_n$, where $\mathbf a_n$ is the last column of $A$, and set $y_i:= x_i$ for $1\leq i \leq n-1$ and $y_n:=1$.
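As a concrete sketch of this construction (the matrix below is invented for illustration, not taken from the problem), suppose $n = 3$ and the coefficient matrix happened to be

$A = \begin{bmatrix} 1 & 2 & 3 \\ 0 & 1 & 1 \end{bmatrix}.$

Its first two columns are linearly independent, so solve $\begin{bmatrix} 1 & 2 \\ 0 & 1 \end{bmatrix}\mathbf x = -\begin{bmatrix} 3 \\ 1 \end{bmatrix}$, giving $\mathbf x = (-1, -1)^T$, and set $\mathbf y = (-1, -1, 1)^T$. Indeed $A\mathbf y = \mathbf 0$ with $\mathbf y \neq \mathbf 0$, as required.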