Linear Algebra – Proof of Linearly Independent Columns in Invertible Matrix

linear-algebra, matrices, proof-verification

Explain why the columns of an $n \times n$ matrix $A$ are linearly
independent when $A$ is invertible.

The proof that I thought of was:

If $A$ is invertible, then $A \sim I$ ($A$ is row equivalent to the identity matrix). Therefore, $A$ has $n$ pivots, one in each column, which means that the columns of $A$ are linearly independent.

The proof that was provided was:

Suppose $A$ is invertible. Therefore the equation $Ax = 0$ has only one solution, namely, the zero solution. This means that the columns of $A$ are linearly independent.

I am not sure whether my proof is correct. If it is, is there a reason to prefer one proof over the other?


As seen in the Wikipedia article and in Linear Algebra and Its Applications, $\sim$ indicates row equivalence between matrices.

Best Answer

I would say that the textbook's proof is better because it proves exactly what needs to be proven without relying on facts about row operations along the way. To see why, it may help to write out all of the definitions at work here, together with the facts that each proof uses.


Definitions:

  • $A$ is invertible if there exists a matrix $A^{-1}$ such that $AA^{-1} = A^{-1}A = I$.
  • The vectors $v_1,\dots,v_n$ are linearly independent if the only solution to $x_1v_1 + \cdots + x_n v_n = 0$ (with $x_i \in \Bbb R$) is $x_1 = \cdots = x_n = 0$.

Textbook Proof:

Fact: With $v_1,\dots,v_n$ referring to the columns of $A$, the equation $x_1v_1 + \cdots + x_n v_n = 0$ can be rewritten as $Ax = 0$. (This is true by the definition of matrix multiplication.)
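For instance, with an arbitrary $2 \times 2$ matrix $A = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}$ (chosen purely for illustration), the rewriting looks like
$$
x_1 \begin{bmatrix} 1 \\ 3 \end{bmatrix} + x_2 \begin{bmatrix} 2 \\ 4 \end{bmatrix}
= \begin{bmatrix} x_1 + 2x_2 \\ 3x_1 + 4x_2 \end{bmatrix}
= \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix}
= Ax.
$$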

Now, suppose that $A$ is invertible. We want to show that the only solution to $Ax = 0$ is $x = 0$ (and by the above fact, we'll have proven the statement).

Multiplying both sides on the left by $A^{-1}$ gives us $$ Ax = 0 \implies A^{-1}Ax = A^{-1}0 \implies x = 0. $$ So we may indeed state that the only $x$ with $Ax = 0$ is the vector $x = 0$.
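If you want to sanity-check this computationally, here is a minimal sketch using SymPy; the matrix is just an arbitrary invertible example, not anything from the question:

```python
from sympy import Matrix

# An arbitrary invertible 2x2 matrix, chosen only for illustration.
A = Matrix([[1, 2], [3, 4]])

# A is invertible exactly when its determinant is nonzero.
assert A.det() != 0

# nullspace() returns a basis for the set of solutions to Ax = 0.
# For an invertible A the only solution is x = 0, so the basis is empty.
print(A.nullspace())  # expected: []
```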


Your Proof:

Fact: With $v_1,\dots,v_n$ referring to the columns of $A$, the equation $x_1v_1 + \cdots + x_n v_n = 0$ can be rewritten as $Ax = 0$. (This is true by the definition of matrix multiplication.)

Fact: If $A$ is invertible, then $A$ is row-equivalent to the identity matrix.

Fact: If $R$ is the row-reduced version of $A$, then $R$ and $A$ have the same nullspace. That is, $Rx = 0$ and $Ax = 0$ have the same solutions.

From the above facts, we conclude: if $A$ is invertible, then $A$ is row-equivalent to $I$. Since row reduction preserves the nullspace, $Ax = 0$ and $Ix = 0$ have exactly the same solutions. The only solution to $Ix = 0$ is $x = 0$ (the columns of $I$ are linearly independent), so the only solution to $Ax = 0$ is $x = 0$, and therefore the columns of $A$ are linearly independent.
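Your route can be checked computationally in the same spirit; here is a sketch (again with an arbitrary invertible matrix, assuming SymPy) confirming that the reduced row echelon form is the identity, so there is a pivot in every column:

```python
from sympy import Matrix, eye

# Same illustrative invertible matrix as above.
A = Matrix([[1, 2], [3, 4]])

# rref() returns the reduced row echelon form together with
# the tuple of pivot column indices.
R, pivot_cols = A.rref()

# For an invertible matrix, R is the identity and every column is a pivot column.
assert R == eye(2)
assert pivot_cols == (0, 1)
```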