[Math] Direct Proof for Statement on Linear Independence and Unique Representations

alternative-proof, linear-algebra

The Statement

Show that if a set of vectors is linearly independent, then any vector in the span of that set has a unique representation as a linear combination of these vectors.

My Proof

I'm going for a proof by contrapositive.

Let $S=\{v_{1},v_{2},\dots,v_{n}\}$ be our set of vectors.

Let $x_{0}$ be a vector in the span of $S$. If it does not have a unique representation, then it can be written in at least two distinct ways:

$x_{0}= a_{1}v_{1}+a_{2}v_{2}+\dots+a_{n}v_{n}$

$x_{0}= b_{1}v_{1}+b_{2}v_{2}+\dots+b_{n}v_{n}$

$1)$ For at least one index $j$ with $1\leq j\leq n$, it is the case that $b_j \neq a_j$.

$0 = x_0-x_0=(a_1-b_1)v_1+(a_2-b_2)v_2+\cdots+(a_n-b_n)v_n$

By fact $1)$, at least one of these coefficients is nonzero, which means $S$ is linearly dependent.
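For instance (a small example of my own to illustrate the contrapositive): take $v_1=(1,0)$ and $v_2=(2,0)$ in $\mathbb{R}^2$. Then $x_0=(2,0)$ has the two distinct representations
$$x_0 = 2v_1 + 0v_2 = 0v_1 + 1v_2,$$
and subtracting them gives $2v_1 - v_2 = \vec 0$, a nontrivial dependence between $v_1$ and $v_2$.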

My Question

I'm pretty happy with my solution (which I think is correct). I was wondering if, for learning's sake, anyone could provide a direct proof (or a proof by another method, if a direct one is impossible)?

Best Answer

This proof works fine. Here's a "direct" one.

Suppose that $$ \alpha_1 v_1+\dotsb+\alpha_n v_n=\beta_1 v_1+\dotsb+\beta_n v_n. $$ Then $$ (\alpha_1-\beta_1)v_1+\dotsb+(\alpha_n-\beta_n) v_n=\vec 0. $$ But $\{v_1,\dotsc,v_n\}$ is linearly independent, so $\alpha_j-\beta_j=0$ for each $1\leq j\leq n$. Hence $\alpha_j=\beta_j$ for $1\leq j\leq n$, and the two representations coincide.
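The uniqueness argument can also be checked numerically. Here's a minimal sketch using NumPy (the matrix `V`, coefficient vector `a`, and names below are my own illustration, not part of either proof): with linearly independent columns, solving for the coefficients of a vector in their span recovers exactly one answer.

```python
import numpy as np

# Columns of V are linearly independent vectors v1, v2, v3 in R^3.
V = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])

a = np.array([2.0, -1.0, 3.0])  # one set of coefficients
x0 = V @ a                      # a vector in the span of the columns

# Solving V c = x0 finds coefficients representing x0. Because the
# columns are independent (V is invertible), the solution is unique,
# so it must coincide with a.
c = np.linalg.solve(V, x0)
print(np.allclose(c, a))  # True
```

This is only a sanity check for one example, of course; the algebraic proof above is what establishes uniqueness in general.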