[Math] Linear map invertible if and only if associated matrix invertible

linear-algebra, linear-transformations, matrices

Theorem: Let $V$ and $W$ be finite-dimensional vector spaces with ordered bases $\beta$ and $\gamma$ respectively, and let $T: V \rightarrow W$ be linear. Then $T$ is invertible if and only if the associated matrix $[T]_{\beta}^{\gamma}$ is invertible. Moreover, $[T^{-1}]_{\gamma}^{\beta} = ([T]_{\beta}^{\gamma})^{-1}$.
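For instance (a small example to make the statement concrete), take $V = W = \mathbb{R}^2$ with $\beta = \gamma$ the standard basis and $T(x,y) = (x+y,\, y)$. Then $$ [T]_\beta^\gamma = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}, \qquad T^{-1}(x,y) = (x-y,\, y), \qquad [T^{-1}]_\gamma^\beta = \begin{pmatrix} 1 & -1 \\ 0 & 1 \end{pmatrix}, $$ and indeed $[T]_\beta^\gamma \, [T^{-1}]_\gamma^\beta = I_2$.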

Proof: Let $T$ be invertible. This means that $\dim(V) = \dim(W)$. Let $n = \dim(V)$, so that $[T]_{\beta}^{\gamma}$ is an $n \times n$ matrix.
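One way to justify the dimension claim is the rank–nullity theorem: an invertible $T$ is injective and surjective, so $$ \dim(V) = \operatorname{rank}(T) + \dim(\ker T) = \dim(W) + 0 = \dim(W). $$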

I don't really know how to continue; any help would be appreciated.

Edit: I'd start off like this: $[T^{-1}]_{\gamma}^{\beta} \cdot [T]_{\beta}^{\gamma} = [T^{-1} \circ T]_{\beta} = [Id_V]_{\beta}$. Then I don't know how to proceed, and I think this is wrong because I assumed that the inverse is $[T^{-1}]_{\gamma}^{\beta}$, which is what needs to be proven.

Best Answer

Assume first that $T$ is invertible. As you already wrote, we have $$ [Id_V]_\beta = [T^{-1}\circ T]_\beta = [T^{-1}]_\gamma^\beta[T]_\beta^\gamma. $$ By definition $[Id_V]_\beta=I_n$ with $n=\dim V$, and both matrices in the product are square of size $n$. Hence the equation shows that $[T^{-1}]_\gamma^\beta$ and $[T]_\beta^\gamma$ are invertible and inverse to each other; in particular $([T]_\beta^\gamma)^{-1} = [T^{-1}]_\gamma^\beta$.
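One can also avoid appealing to the fact that a one-sided inverse of a square matrix is automatically two-sided by computing the product in the other order, using the same composition rule: $$ [T]_\beta^\gamma\,[T^{-1}]_\gamma^\beta = [T\circ T^{-1}]_\gamma = [Id_W]_\gamma = I_n. $$ Together with the identity above, this gives $([T]_\beta^\gamma)^{-1} = [T^{-1}]_\gamma^\beta$ directly, which is the "moreover" part of the theorem.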

To prove the reverse direction, assume that $[T]_\beta^\gamma$ is invertible. In particular it is a square $n\times n$ matrix, so $\dim V=\dim W=n$, and there is a matrix $B$ such that $$ I_n = B \cdot [T]_\beta^\gamma = [T]_\beta^\gamma \cdot B. $$ Since the mapping $S\mapsto [S]_\gamma^\beta$ from linear maps $W\to V$ to $n\times n$ matrices is an isomorphism, there is a linear map $S$ such that $[S]_\gamma^\beta=B$. The equation above then implies $S\circ T=Id_V$ and $T\circ S = Id_W$, so $T$ is invertible with $T^{-1}=S$.
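To spell out the last implication, the composition rule and the fact that a linear map is determined by its matrix (with respect to fixed bases) give $$ [S\circ T]_\beta = [S]_\gamma^\beta\,[T]_\beta^\gamma = B\,[T]_\beta^\gamma = I_n = [Id_V]_\beta, \qquad [T\circ S]_\gamma = [T]_\beta^\gamma\,[S]_\gamma^\beta = [T]_\beta^\gamma\,B = I_n = [Id_W]_\gamma, $$ hence $S\circ T = Id_V$ and $T\circ S = Id_W$.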