Prove each linear transformation can be written as a matrix

linear-algebra, linear-transformations, matrices, vector-spaces

I'd like to show that any linear transformation between two finite-dimensional vector spaces can be represented by a matrix. I've seen a proof for linear transformations to and from $\mathbb{R}^n$, but I want to generalize it to any finite-dimensional vector space.

I'd also like to show that the composition of two linear transformations corresponds to the product of their matrices. I believe it should look like this:
$$
[S\circ T]^B_D=[S]^C_D\cdot [T]^B_C
$$

where $T: U \to V$, $S: V \to W$, and $B, C, D$ are bases of $U, V, W$ respectively.
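As a quick numerical sanity check of the composition identity (a sketch of my own, not part of the proof): once coordinates have been fixed, $[T]^B_C$ and $[S]^C_D$ are just matrices, and the claim is that applying them in sequence agrees with their product. The dimensions below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Coordinate matrices of T : U -> V and S : V -> W, with
# dim U = 3, dim V = 4, dim W = 2 (dimensions chosen arbitrarily).
T = rng.standard_normal((4, 3))   # plays the role of [T]^B_C
S = rng.standard_normal((2, 4))   # plays the role of [S]^C_D

u = rng.standard_normal(3)        # coordinate vector of some u in U

# Applying T, then S, must agree with the single matrix S @ T,
# i.e. [S o T]^B_D = [S]^C_D [T]^B_C acting on coordinates.
assert np.allclose(S @ (T @ u), (S @ T) @ u)
print((S @ T).shape)  # (2, 3): maps dim-3 coordinates to dim-2 coordinates
```

Note the shape bookkeeping: $[S\circ T]^B_D$ is $2\times 3$, taking $B$-coordinates directly to $D$-coordinates.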

Best Answer

Let $T : V \to W$ be a linear transformation between finite-dimensional vector spaces, with $\dim V = n$ and $\dim W = m$. We will need a basis for each of these spaces; let them be $B_n = \{e_{1n}, e_{2n}, \dots, e_{nn}\}$ for $V$ and $B_m = \{e_{1m}, e_{2m}, \dots, e_{mm}\}$ for $W$ respectively.

Now, any vector $v$ in the domain can be expressed as follows:

$$v = \sum_{i=1}^n a_i e_{in}$$

$$\implies T(v) = \sum_{i=1}^n a_i T(e_{in})$$

To complete the matrix representation, we need to express each $T(e_{in})$ in the basis $B_m$ of the codomain.

Hence, let $T(e_{in}) = \sum_{k=1}^mb_{ik}e_{km}$

Therefore

$$\implies T(v) = \sum_{i=1}^na_i\sum_{k=1}^mb_{ik}e_{km}$$

Now consider the matrix representation of $T$: we express $v$ by its coordinate column vector in $\mathbb{R}^{n \times 1}$,

$$v = \begin{bmatrix}a_1 \\ a_2 \\ \vdots \\ a_n\end{bmatrix}$$

Hence, $T(v)$ can be thought of as a sum of $n$ vectors in $\mathbb{R}^{m \times 1}$ — the coordinate columns of the $T(e_{in})$ in the basis $B_m$ — weighted by the scalars $a_i$. Swapping the order of summation gives $T(v) = \sum_{k=1}^m \left(\sum_{i=1}^n b_{ik} a_i\right) e_{km}$, so the $k$-th coordinate of $T(v)$ is exactly the $k$-th entry of a matrix-vector product. Therefore, we pre-multiply the coordinate column of $v$ by the matrix whose $i$-th column lists the coordinates $b_{ik}$ of $T(e_{in})$ in the basis $B_m$:

$$[T] = \begin{bmatrix} b_{11} & b_{21} & \cdots & b_{n1} \\ b_{12} & b_{22} & \cdots & b_{n2} \\ \vdots & \vdots & \ddots & \vdots \\ b_{1m} & b_{2m} & \cdots & b_{nm} \end{bmatrix}$$
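To see the construction work on a genuinely abstract vector space, here is a small sketch with a concrete map of my own choosing (not from the answer above): the derivative operator $D$ on polynomials of degree at most $2$, with basis $B = \{1, x, x^2\}$ for both domain and codomain. Column $i$ of $[D]$ is the coordinate vector of $D(e_i)$, exactly as the proof prescribes.

```python
import numpy as np

# A polynomial a0 + a1*x + a2*x^2 has coordinate vector (a0, a1, a2)
# in the basis {1, x, x^2}.

def D(coords):
    """Differentiate, in coordinates: a0 + a1 x + a2 x^2 -> a1 + 2 a2 x."""
    a0, a1, a2 = coords
    return np.array([a1, 2 * a2, 0])

# Column i of [D] is the coordinate vector of D(e_i), as in the proof.
basis = np.eye(3)
M = np.column_stack([D(e) for e in basis])
print(M)
# [[0. 1. 0.]
#  [0. 0. 2.]
#  [0. 0. 0.]]

# The matrix now reproduces D on any polynomial, e.g. p(x) = 5 + 3x + 2x^2:
p = np.array([5, 3, 2])
assert np.array_equal(M @ p, D(p))   # p'(x) = 3 + 4x, coordinates (3, 4, 0)
```

Once $[D]$ is built, differentiation of any such polynomial reduces to one matrix-vector product on coordinates, which is precisely the content of the theorem.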
