Linear Algebra – Prove $n+1$ Vectors in $\mathbb{R}^n$ Cannot Be Linearly Independent

linear algebra

I was looking for a short, snazzy proof of the following statement:

$n+1$ vectors in $\mathbb{R}^n$ cannot be linearly independent.

A student of mine asked this earlier today, and I couldn't come up with a proof working solely from the definition of linear independence.

From a higher-level perspective, I explained it as follows: put the vectors as the columns of a matrix; the vectors are independent exactly when the only vector in the null space is the zero vector. But the matrix has one more column than rows, and since row rank equals column rank, the rank cannot be $n+1$. By the rank-nullity theorem, the null space therefore has dimension at least one, which means there is a linear combination of the vectors that equals zero even though not all the scalars are zero. The student hasn't fully learned the fundamental subspaces yet, so I am not sure he grasped what I was saying.
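The rank-nullity argument above can be checked numerically. Here is a minimal sketch (the specific vectors are a made-up example, not from the question): three vectors in $\mathbb{R}^2$ are stacked as columns of a $2\times 3$ matrix, and the SVD exhibits a nonzero vector of coefficients in the null space.

```python
import numpy as np

# Made-up example: 3 vectors in R^2, one more vector than the dimension.
# Stacked as columns of A, rank(A) <= 2 < 3, so by rank-nullity the
# null space of A has dimension >= 1.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0]])   # columns are v1, v2, v3

# Right singular vectors whose singular values are (numerically) zero
# form an orthonormal basis of ker(A).
_, s, vt = np.linalg.svd(A)
s_padded = np.concatenate([s, np.zeros(A.shape[1] - len(s))])
c = vt[s_padded < 1e-10][0]      # a unit vector with A @ c = 0

print(c)      # not all coefficients are zero...
print(A @ c)  # ...yet c[0]*v1 + c[1]*v2 + c[2]*v3 = 0
```

So `c` gives scalars, not all zero, whose combination of the three columns is the zero vector, which is exactly the failure of linear independence.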

Is there a cleaner proof?

EDIT: I am stunned how many beautiful answers I got with so much diversity.

Best Answer

Let $(e_1,\ldots,e_n)$ denote the standard basis of $\mathbb R^n$, and suppose that $(f_1,\ldots,f_{n+1})$ is a set of linearly independent vectors. We can write $$f_1=\sum_{k=1}^n a_k e_k$$ and since $f_1\ne 0$ there is some $a_k\ne 0$; WLOG suppose $a_1\neq0$, so $$e_1=\frac{1}{a_1}\left(f_1-\sum_{k=2}^n a_k e_k\right)$$ hence we see that $(f_1,e_2,\ldots,e_n)$ spans $\mathbb R^n$.

Now repeat the same exchange $n$ times in total (formally, by induction): we find that $(f_1,\ldots,f_{n})$ spans $\mathbb R^n$, so the vector $f_{n+1}$ is a linear combination of the other vectors $f_i$, which contradicts the assumed independence.

If the induction step was not obvious, consider the following:

Since we have established that $(f_1,e_2,\ldots,e_n)$ spans $\mathbb R^n$, write

$$f_2=b_1f_1+\sum_{k=2}^n b_k e_k$$

Since $f_2\neq0$, there is at least one $b_l\neq0$. Also note that $(b_2,b_3,\ldots,b_n)$ cannot all be zero, because otherwise $f_2=b_1f_1$, contradicting the assumed linear independence of $(f_1,\ldots,f_{n+1})$. So we can pick some $b_l\neq0$ with $l\geq 2$; WLOG suppose $b_3\neq0$. Now,

$$e_3=\frac{1}{b_3}\left(f_2-b_1f_1-b_2e_2-\sum_{k=4}^n b_k e_k\right)$$

From this we see that $(f_1,e_2,f_2,e_4,\ldots,e_n)$ spans $\mathbb R^n$, where we have replaced $e_3$ with $f_2$. The assumed linear independence of the $f_i$'s means that we can repeat this process to replace all the $e_i$'s.
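The endgame of the exchange argument can also be made concrete. A minimal sketch with made-up vectors in $\mathbb R^2$ (not from the answer above): once $(f_1,f_2)$ is known to span $\mathbb R^2$, solving a linear system exhibits the extra vector $f_3$ as a combination of the first two.

```python
import numpy as np

# Made-up example in R^2: after the exchanges, (f1, f2) spans R^2,
# so the matrix with f1, f2 as columns is invertible and the extra
# vector f3 must be a linear combination of f1 and f2.
f1, f2 = np.array([1.0, 1.0]), np.array([1.0, -1.0])
f3 = np.array([3.0, 5.0])

F = np.column_stack([f1, f2])   # invertible since (f1, f2) spans R^2
c = np.linalg.solve(F, f3)      # coefficients with c[0]*f1 + c[1]*f2 = f3

print(c)  # → [ 4. -1.]
assert np.allclose(c[0] * f1 + c[1] * f2, f3)
```

The nonzero solution `c` is exactly the dependence relation $f_3 = c_1 f_1 + c_2 f_2$ that contradicts the independence of $(f_1, f_2, f_3)$.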