[Math] Proving linear independence of a basis from coordinate vectors

linear-algebra, proof-verification, proof-writing, vector-spaces

Let $B$ be a basis for $\mathbb{R}^n$. Prove that the vectors $v_1, v_2, \dots, v_k $ form a linearly independent set if and only if the vectors $[v_1]_B, [v_2]_B, \dots, [v_k]_B$ form a linearly independent set.

My intuition is that if $v_1, \dots, v_k$ are linearly independent, then the only linear combination of the coordinate vectors relative to $B$ that produces the zero vector is the trivial one, so that no one of the coordinate vectors can be written as a combination of the others, which would give linear independence of $[v_1]_B, \dots, [v_k]_B$.

My issue is writing this out concretely rather than relying on intuition (if the intuition is even correct).

A secondary question concerns the assertion that if the coordinate vectors $[v_1]_B, \dots, [v_k]_B$ span $\mathbb{R}^n$, then the vectors $v_1, \dots, v_k$ span $\mathbb{R}^n$.

I am not sure how to go about proving this assertion.

Best Answer

Suppose the vectors $v_1,\dots,v_k$ are linearly dependent. Then there exist non-trivial (not all zero) coefficients $a_1,\dots,a_k$ so that $$ a_1v_1+\dots+a_kv_k=0. $$ Now express this in the basis $B$, using that the coordinate map $v \mapsto [v]_B$ is linear (it sends $0$ to $0$ and respects sums and scalar multiples): $$ 0 = [a_1v_1+\dots+a_kv_k]_B = a_1[v_1]_B+\dots+a_k[v_k]_B. $$ Therefore the vectors $[v_1]_B,\dots,[v_k]_B$ are linearly dependent.
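For a concrete illustration (a small example of my own, not from the original question): take $n=2$, the basis $B = \{(1,1),\,(1,-1)\}$, and the dependent vectors $v_1 = (2,0)$, $v_2 = (4,0)$, which satisfy $2v_1 - v_2 = 0$. Their coordinate vectors are $[v_1]_B = (1,1)$ and $[v_2]_B = (2,2)$, and the same coefficients witness the dependence of the coordinate vectors: $$ 2[v_1]_B - [v_2]_B = 2\,(1,1) - (2,2) = (0,0). $$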

If you assume that the vectors $[v_1]_B,\dots,[v_k]_B$ are linearly dependent, you can follow the same steps backwards to show that $v_1,\dots,v_k$ are linearly dependent.
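Spelled out (these are just the same steps read in reverse): if $a_1[v_1]_B+\dots+a_k[v_k]_B = 0$ with the $a_i$ not all zero, then by linearity of the coordinate map $$ [a_1v_1+\dots+a_kv_k]_B = a_1[v_1]_B+\dots+a_k[v_k]_B = 0 = [0]_B, $$ and since distinct vectors have distinct coordinate vectors, $a_1v_1+\dots+a_kv_k = 0$, so $v_1,\dots,v_k$ are linearly dependent.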

We have shown that one set of vectors is linearly dependent if and only if the other one is. Therefore one set of vectors is linearly independent if and only if the other one is.
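Regarding the secondary question about spanning, which the argument above does not address, here is only a sketch along the same lines: if $[v_1]_B,\dots,[v_k]_B$ span $\mathbb{R}^n$, take any $w \in \mathbb{R}^n$ and write $$ [w]_B = c_1[v_1]_B+\dots+c_k[v_k]_B = [c_1v_1+\dots+c_kv_k]_B. $$ Since distinct vectors have distinct coordinate vectors, $w = c_1v_1+\dots+c_kv_k$, so $v_1,\dots,v_k$ span $\mathbb{R}^n$. The converse direction is analogous.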
