[Math] Prove that a linear transformation is invertible if and only if its associated matrix is invertible.

linear algebra

Let $V$ be a finite-dimensional vector space, $\beta$ an ordered basis of $V$, $T$ a linear operator on $V$, and $A$ the associated matrix of $T$ with respect to $\beta$. I have to prove that $T$ is invertible if and only if $A$ is invertible.

I was thinking that I only need to consider $T^{-1}$ and its associated matrix $A^{-1}$, but I don't know if it's really that easy, since this problem is supposed to be somewhat difficult. Maybe there's something I'm not considering that makes it harder than it looks, but I don't see it. Do you?

Thanks in advance.

Best Answer

The problem with considering $A^{-1}$ is that, at that point, you haven't yet established that $A$ is invertible. You have assumed that $T$ is invertible, so you can take the associated matrix $B$ of $T^{-1}$, but you still have to prove that $AB = BA = I$.

Now the actual proof depends a bit on what you already know. It would be convenient if you had already established that the associated matrix of a composition of two linear maps is the product of their associated matrices. Then it follows from $T \circ T^{-1} = T^{-1} \circ T = I$ that $AB = BA = I$. (For the other direction, use the fact that every matrix is the associated matrix of some linear operator: take the operator $U$ whose matrix is $A^{-1}$ and argue similarly.)
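The argument above can be written out as a short derivation. Writing $[S]_\beta$ for the matrix of an operator $S$ with respect to $\beta$, and assuming the multiplicativity fact $[S \circ U]_\beta = [S]_\beta\,[U]_\beta$ mentioned above:

```latex
% Forward direction: T invertible => A invertible.
% Set B = [T^{-1}]_\beta. Then
AB = [T]_\beta \, [T^{-1}]_\beta = [T \circ T^{-1}]_\beta = [I_V]_\beta = I,
\qquad
BA = [T^{-1}]_\beta \, [T]_\beta = [T^{-1} \circ T]_\beta = [I_V]_\beta = I.

% Converse: A invertible => T invertible.
% Let U be the (unique) linear operator on V with [U]_\beta = A^{-1};
% such a U exists because S \mapsto [S]_\beta is a bijection between
% operators on V and n x n matrices (n = \dim V). Then
[T \circ U]_\beta = A A^{-1} = I = [I_V]_\beta
\quad\Longrightarrow\quad T \circ U = I_V,
% and likewise [U \circ T]_\beta = A^{-1} A = I gives U \circ T = I_V,
% so U = T^{-1}.
```

The key point in the converse is injectivity of the map $S \mapsto [S]_\beta$: equality of matrices forces equality of the underlying operators.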