[Math] Show that a vector can be expressed as a linear combination of the vectors that form a basis for its vector space in exactly one way

hamel-basis, linear-algebra, vector-spaces

Show that if $S = \{v_1, \dots, v_n\}$ is a basis for a vector space $V$, then each vector $v \in V$ can be expressed as $v = k_1v_1 + k_2v_2 + \cdots + k_nv_n$ (where $k_i \in \mathbb{R}$ for $i = 1, \dots, n$) in exactly one way.

Since the vectors in $S$ form a basis, $v_1, \dots, v_n$ are linearly independent and span the vector space $V$.

This means that every vector $v \in V$ can be written as a linear combination of $v_1, \dots, v_n$.

So $v = k_1v_1 + k_2v_2 + \cdots + k_nv_n$ for some $k_1, \dots, k_n \in \mathbb{R}$.
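As a concrete illustration (an assumed example in $\mathbb{R}^2$, purely for intuition): with the basis $S = \{(1,0), (1,1)\}$, the vector $(3,5)$ has exactly one such representation, found by solving a small linear system.

```latex
% Assumed example: basis S = {(1,0), (1,1)} of R^2, target v = (3,5).
% Solving for the coefficients shows the representation is unique:
\[
k_1 \begin{pmatrix} 1 \\ 0 \end{pmatrix}
  + k_2 \begin{pmatrix} 1 \\ 1 \end{pmatrix}
  = \begin{pmatrix} 3 \\ 5 \end{pmatrix}
\iff
\begin{cases} k_1 + k_2 = 3 \\ k_2 = 5 \end{cases}
\iff
(k_1, k_2) = (-2, 5).
\]
% So v = -2(1,0) + 5(1,1) is the only way to reach (3,5) from S.
```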

However, I do not know how to prove that this expression is unique. Geometrically, I understand that each $v_i$ points in a different direction, so there should be only one linear combination of the basis vectors that reaches any given vector.

I think it could be shown that if some vector can be reached by two different linear combinations, then $v_1, \dots, v_n$ are not linearly independent, but I am not sure how to do this.

Best Answer

You're on the right track. Here's how to start implementing your idea:

Suppose you have two ways to write $v$ as a linear combination of basis elements, say, as both
\begin{align*}
v &= k_1 v_1 + \cdots + k_n v_n \\
v &= \ell_1 v_1 + \cdots + \ell_n v_n .
\end{align*}
Then, subtracting the two equations gives
$$0 = (\ell_1 - k_1) v_1 + \cdots + (\ell_n - k_n) v_n .$$
Now, what does linear independence tell us about a linear combination of basis elements that is equal to zero?
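For completeness, a minimal sketch of how the argument finishes from here:

```latex
% Sketch: linear independence of v_1, ..., v_n says that the only
% linear combination of them equal to the zero vector is the trivial
% one, so every coefficient difference must vanish:
\begin{align*}
(\ell_1 - k_1) v_1 + \cdots + (\ell_n - k_n) v_n = 0
  &\implies \ell_i - k_i = 0 \quad \text{for } i = 1, \dots, n \\
  &\implies \ell_i = k_i \quad \text{for } i = 1, \dots, n .
\end{align*}
% The two representations therefore coincide, so the expression of v
% in terms of the basis is unique.
```

This is exactly the contrapositive of the idea in the question: two genuinely different representations of $v$ would produce a nontrivial linear combination equal to zero, contradicting linear independence.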