Explain why the columns of an $n \times n$ matrix $A$ are linearly
independent when $A$ is invertible.
The proof that I thought of was:
If $A$ is invertible, then $A \sim I$ ($A$ is row equivalent to the identity matrix). Therefore, $A$ has $n$ pivots, one in each column, which means that the columns of $A$ are linearly independent.
The proof that was provided was:
Suppose $A$ is invertible. Then the equation $Ax = 0$ has only one solution, namely the zero solution $x = 0$. This means that the columns of $A$ are linearly independent.
I am not sure whether or not my proof is correct. If it is, would there be a reason to prefer one proof over the other?
As seen in the Wikipedia article and in Linear Algebra and Its Applications, $\sim$ indicates row equivalence between matrices.
Best Answer
I would say that the textbook's proof is better because it proves what needs to be proven without using facts about row operations along the way. To see that this is the case, it may help to write out all of the definitions at work here, and all the facts that get used along the way.
Definitions:

- $A$ is invertible if there is a matrix $A^{-1}$ with $A^{-1}A = AA^{-1} = I$.
- The columns $a_1, \dots, a_n$ of $A$ are linearly independent if the only scalars $x_1, \dots, x_n$ with $x_1 a_1 + \dots + x_n a_n = 0$ are $x_1 = \dots = x_n = 0$.

Fact: since $Ax$ is exactly the linear combination $x_1 a_1 + \dots + x_n a_n$ of the columns of $A$, the columns of $A$ are linearly independent if and only if the only solution to $Ax = 0$ is $x = 0$.
Textbook Proof:
Now, suppose that $A$ is invertible. We want to show that the only solution to $Ax = 0$ is $x = 0$ (and by the above fact, we'll have proven the statement).
Multiplying both sides by $A^{-1}$ gives us $$ Ax = 0 \implies A^{-1}Ax = A^{-1}0 \implies x = 0 $$ So, we may indeed state that the only $x$ with $Ax = 0$ is the vector $x = 0$.
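As a concrete sanity check (not part of the proof itself), the step above can be illustrated numerically. The sketch below uses a $2 \times 2$ example; the helper names `mat_vec` and `inverse_2x2` are my own, not from the post.

```python
# Sanity check: for an invertible A, applying A^{-1} to Ax recovers x,
# so Ax = 0 can only happen when x = 0.

def mat_vec(M, v):
    """Multiply a 2x2 matrix M by a length-2 vector v."""
    return [M[0][0]*v[0] + M[0][1]*v[1],
            M[1][0]*v[0] + M[1][1]*v[1]]

def inverse_2x2(M):
    """Inverse of a 2x2 matrix via the adjugate formula; requires det != 0."""
    det = M[0][0]*M[1][1] - M[0][1]*M[1][0]
    assert det != 0, "matrix is not invertible"
    return [[ M[1][1]/det, -M[0][1]/det],
            [-M[1][0]/det,  M[0][0]/det]]

A = [[2.0, 1.0],
     [1.0, 1.0]]          # det = 1, so A is invertible
A_inv = inverse_2x2(A)

x = [3.0, -5.0]           # an arbitrary vector
recovered = mat_vec(A_inv, mat_vec(A, x))
print(recovered)          # [3.0, -5.0]: A^{-1}(Ax) = x, so Ax = 0 forces x = 0
```
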
Your Proof:
Facts: $A$ is invertible if and only if $A$ is row equivalent to $I$, and row operations do not change the solution set of $Ax = 0$.

From these facts, we conclude that if $A$ is invertible, then $A$ is row equivalent to $I$, so $Ax = 0$ and $Ix = 0$ have exactly the same solutions. Since $Ix = 0$ forces $x = 0$, the columns of $I$ are linearly independent, and therefore the columns of $A$ are linearly independent as well. The catch is that each of these facts about row operations must itself be proven, which is why the textbook's proof is shorter when written out in full.
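The row-reduction route can also be checked numerically. The sketch below (the helper `rref_pivots` is my own, not from the post) reduces a matrix to reduced row echelon form and counts pivots; an $n \times n$ matrix with $n$ pivots is row equivalent to $I_n$.

```python
# Sketch of the row-reduction argument: reduce A to RREF and count pivot
# columns. n pivots in an n x n matrix means A is row equivalent to I_n.

def rref_pivots(M):
    """Row-reduce a copy of M and return the number of pivot columns."""
    M = [row[:] for row in M]            # work on a copy
    rows, cols = len(M), len(M[0])
    pivots = 0
    for c in range(cols):
        # find a row at or below the current pivot row with a nonzero entry
        pr = next((r for r in range(pivots, rows) if abs(M[r][c]) > 1e-12), None)
        if pr is None:
            continue                     # no pivot in this column
        M[pivots], M[pr] = M[pr], M[pivots]
        p = M[pivots][c]
        M[pivots] = [v / p for v in M[pivots]]   # scale pivot row to 1
        for r in range(rows):
            if r != pivots:              # clear the rest of the column
                f = M[r][c]
                M[r] = [a - f * b for a, b in zip(M[r], M[pivots])]
        pivots += 1
    return pivots

A = [[2.0, 1.0],
     [1.0, 1.0]]
print(rref_pivots(A))   # 2 pivots: A is row equivalent to I_2
```

A singular matrix such as `[[1.0, 2.0], [2.0, 4.0]]` yields only one pivot, so it is not row equivalent to $I_2$ and its columns are dependent.
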