The ring of linear maps is isomorphic to the ring of matrices (finite-dimensional case)

linear algebra

I want to prove the following.

Let $V$ be an $n$-dimensional vector space over a field $\mathbb{K}$.

Denote $L(V) =$ the ring of linear maps from $V$ to $V$, and

$M_n \left( \mathbb{K}\right) =$ the ring of $n \times n$ matrices over $\mathbb{K}$.

Prove that $L(V) \cong M_n \left( \mathbb{K} \right)$.

I proceeded as follows.

Let $\left \{e_1, e_2, \dots , e_n \right \}$ be a basis of $V$.

Assume $f \in L (V)$; then

$f(e_1) = \beta _1$

$f( e_2) = \beta_2$

$\vdots$

$f(e_n) = \beta _n.$

$\forall x \in V$, writing $x= a_1 e_1 + \dots + a_n e_n$, we have

$f(x) = a_1 \beta_1 + \dots + a_n \beta_n$.

Suppose that

$\beta_1 = b_1^1 e_1 + b_1^2 e_2 + \dots + b_1^n e_n$

$\beta_2 = b_2^1 e_1 + b_2^2 e_2 + \dots + b_2^n e_n$

$\vdots$

$\beta _n = b_n^1 e_1 + b_n ^2 e_2 + \dots + b_n ^n e_n$.

Then

$f(x) = a_1 \left( b_1^1 e_1 + b_1 ^2 e_2 + \dots + b_1^n e_n \right) + \dots + a_n \left( b_n^1 e_1 + b_n^2 e_2 + \dots + b_n^n e_n \right)$.

I stopped here since I have no idea how to construct the corresponding matrix.

Could you please give me a hint?

Thank you in advance.

Best Answer

Let $V$ be a finite-dimensional vector space with basis $\beta=\left\{v_1, \dots, v_n\right\}$.

We construct a map $\phi:M_n(\mathbb{K})\rightarrow L(V)$ by mapping a matrix $A$ to the linear transformation $L_A:V\rightarrow V$ defined as follows: write $v=\sum_{i=1}^n\lambda_iv_i$ and let $X=\begin{pmatrix} \lambda_1\\ \vdots \\ \lambda_n \end{pmatrix}$ be its coordinate vector; then $L_A(v)$ is the vector of $V$ whose coordinate vector with respect to $\beta$ is $AX$.
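For concreteness, here is what $\phi$ does in a small example with $n=2$: if $A=\begin{pmatrix} a_{11} & a_{12}\\ a_{21} & a_{22} \end{pmatrix}$ and $v=\lambda_1v_1+\lambda_2v_2$, then $AX=\begin{pmatrix} a_{11}\lambda_1+a_{12}\lambda_2\\ a_{21}\lambda_1+a_{22}\lambda_2 \end{pmatrix}$, so $$L_A(v)=(a_{11}\lambda_1+a_{12}\lambda_2)\,v_1+(a_{21}\lambda_1+a_{22}\lambda_2)\,v_2.$$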

Conversely, we construct a map $\psi:L(V)\rightarrow M_n(\mathbb{K})$ as follows: let $f\in L(V)$; then for each $i$ there exist $\mu_{j,i}\in \mathbb{K}$ such that $$f(v_i)=\sum_{j=1}^n\mu_{j,i}v_j.$$ In this way we obtain a matrix $f_{\beta}^{\beta}=(\mu_{j,i})_{1\leq i,j\leq n}$, whose $(j,i)$ entry is $\mu_{j,i}$; in other words, the $i$-th column of $f_{\beta}^{\beta}$ is the coordinate vector of $f(v_i)$. The map $\psi$ is then given by $\psi(f)=f^{\beta}_{\beta}$.
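For example, with $n=2$: if $f(v_1)=v_1+2v_2$ and $f(v_2)=3v_1$, then the columns of $f^{\beta}_{\beta}$ are the coordinate vectors of $f(v_1)$ and $f(v_2)$, so $$f^{\beta}_{\beta}=\begin{pmatrix} 1 & 3\\ 2 & 0 \end{pmatrix}.$$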

It remains to show that these two maps are mutually inverse ring homomorphisms.

We first show that $(\psi\circ\phi)(A)=A$. Notice that the coordinate vector of $L_A(v_i)$ is $Ae_i$. Here $e_i=\begin{pmatrix} 0\\ \vdots\\ 1\\ \vdots\\ 0 \end{pmatrix}$ is the column vector with zeroes everywhere except at the $i$-th position, where there is a $1$. Now $Ae_i=\begin{pmatrix} a_{1i}\\ a_{2i}\\ \vdots\\ a_{ni} \end{pmatrix}$ is simply the $i$-th column of $A$. It follows that $\mu_{j,i}=a_{j,i}$ for all $i,j$, hence $(L_A)^{\beta}_{\beta}=(\mu_{j,i})=(a_{j,i})=A$. This shows that $\psi\circ \phi=Id_{M_n(\mathbb{K})}$.
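As a quick sanity check with $n=2$: if $A=\begin{pmatrix} 1 & 2\\ 3 & 4 \end{pmatrix}$ (over $\mathbb{Q}$, say), then $L_A(v_1)=v_1+3v_2$ and $L_A(v_2)=2v_1+4v_2$, and writing these coefficients back as columns gives $\psi(L_A)=\begin{pmatrix} 1 & 2\\ 3 & 4 \end{pmatrix}=A$, as claimed.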

Can you show $\phi\circ \psi=Id_{L(V)}$ as well?
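Since the claim is about rings, one should also check that $\psi$ respects the ring operations. Additivity is immediate from the definition; here is a sketch of why composition corresponds to matrix multiplication. If $f(v_k)=\sum_{j=1}^n\mu_{j,k}v_j$ and $g(v_i)=\sum_{k=1}^n\nu_{k,i}v_k$, then $$(f\circ g)(v_i)=f\Big(\sum_{k=1}^n\nu_{k,i}v_k\Big)=\sum_{k=1}^n\nu_{k,i}\,f(v_k)=\sum_{j=1}^n\Big(\sum_{k=1}^n\mu_{j,k}\nu_{k,i}\Big)v_j,$$ so the $(j,i)$ entry of $\psi(f\circ g)$ is $\sum_k\mu_{j,k}\nu_{k,i}$, which is exactly the $(j,i)$ entry of the product $\psi(f)\,\psi(g)$.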

Once you have figured this out, you should do the more general case as well: let $V$ and $W$ be finite-dimensional vector spaces of dimension $n$ and $m$ respectively. Then $L(V,W)\cong M_{m\times n}(\mathbb{K})$ (as vector spaces). A very important remark is that the isomorphism depends on a choice of bases in both $V$ and $W$. Hence, after choosing bases $\alpha$ and $\beta$ of $V$ and $W$ respectively, any linear map $f:V\rightarrow W$ corresponds uniquely to an $m\times n$ matrix with entries in $\mathbb{K}$.
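For instance (a standard example): take $V$ to be the space of polynomials of degree at most $2$ and $W$ the space of polynomials of degree at most $1$ over $\mathbb{K}=\mathbb{R}$, with bases $\alpha=\{1,x,x^2\}$ and $\beta=\{1,x\}$, and let $f$ be differentiation. Then $f(1)=0$, $f(x)=1$ and $f(x^2)=2x$, so $f$ corresponds to the $2\times 3$ matrix $$\begin{pmatrix} 0 & 1 & 0\\ 0 & 0 & 2 \end{pmatrix},$$ whose $i$-th column holds the $\beta$-coordinates of the image of the $i$-th basis vector of $\alpha$.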

Understanding the above correspondence is the magical key to understanding all of basic finite-dimensional linear algebra. All further topics such as eigenvectors, eigenbases, diagonalization and so on become easy once you truly understand this correspondence.
