[Math] Invertible Matrix Theorem Proof

Tags: linear-algebra, matrices, inverse, proof-writing

I'm trying to prove the Invertible Matrix Theorem. The theorem that was given is as follows:

For any $n \times n$ matrix $A$, the following are equivalent:
1) $A$ is invertible
2) The RREF of $A$ is $I$
3) rank $A = n$
4) The system of equations $A\vec{x} = \vec{b}$ is consistent with a unique solution for all $\vec{b}\in\mathbb{R}^n$
5) The nullspace of $A$ is $\left \{\vec{0}\right \}$
6) The columns of $A$ form a basis for $\mathbb{R}^n$
7) The rows of $A$ form a basis for $\mathbb{R}^n$
8) $A^T$ is invertible

I'm fine with everything up until I have to prove that 6 implies 7. My prof said that we should be able to prove all (or most?) of these through theorems and/or definitions. I've looked around for a proof that 6 implies 7, and a lot of them rely on the fact that column rank equals row rank, which I don't think has been covered in my class yet. Any ideas on how to prove this?

Best Answer

Assume toward a contradiction that the rows of $A$ are linearly dependent; without loss of generality, say the last row is a linear combination of the other rows. Now truncate each of the $n$ columns to its first $n-1$ entries. These are $n$ vectors in $\mathbb{R}^{n-1}$, so they must be linearly dependent: some nontrivial combination of them is $\vec{0}$. By the assumption on the last row, the same combination also vanishes in the last entry, so the full columns satisfy that same dependence relation. This contradicts the linear independence of the columns given by 6. Hence the $n$ rows are linearly independent, and $n$ linearly independent vectors in $\mathbb{R}^n$ form a basis, which is 7.
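This is not a substitute for the proof, but you can sanity-check the claim on concrete matrices: whenever the columns are independent, the rows turn out independent as well, and when the rows are dependent, so are the columns. Here is a small sketch in plain Python using exact rational arithmetic (the helper names `rank` and `transpose`, and the example matrices, are my own, not from the answer):

```python
from fractions import Fraction

def rank(rows):
    """Number of pivots after Gauss-Jordan elimination over exact rationals."""
    m = [[Fraction(x) for x in row] for row in rows]
    pivots = 0
    for col in range(len(m[0]) if m else 0):
        # Find a row at or below the current pivot row with a nonzero entry.
        pivot = next((r for r in range(pivots, len(m)) if m[r][col] != 0), None)
        if pivot is None:
            continue
        m[pivots], m[pivot] = m[pivot], m[pivots]
        p = m[pivots][col]
        m[pivots] = [x / p for x in m[pivots]]          # scale pivot row to 1
        for r in range(len(m)):
            if r != pivots and m[r][col] != 0:          # clear the column
                f = m[r][col]
                m[r] = [a - f * b for a, b in zip(m[r], m[pivots])]
        pivots += 1
    return pivots

def transpose(rows):
    return [list(col) for col in zip(*rows)]

A = [[2, 1, 0], [1, 3, 1], [0, 1, 4]]   # det = 18, so columns are independent
print(rank(A), rank(transpose(A)))       # → 3 3  (rows independent too)

B = [[1, 2, 3], [2, 4, 6], [0, 1, 1]]   # row 2 = 2 * row 1, rows dependent
print(rank(B), rank(transpose(B)))       # → 2 2  (columns dependent too)
```

Computing the rank of both $A$ and $A^T$ is exactly the "column rank equals row rank" fact the other proofs cite; the contradiction argument above avoids needing it.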
