[Math] Let $V$ and $W$ be finite-dimensional vector spaces having ordered bases, and let $T:V\rightarrow W$ be linear. Then for each

linear-algebra, linear-transformations

There's a theorem proved in my book that I keep having to refer to, but I don't understand what it's saying at all. I tried talking to my professor and got some tutoring, but both were a waste of time. Can anyone shed some light on it?

Theorem 2.14 (Linear Algebra, 4th Ed., Friedberg)
> Let $V$ and $W$ be finite-dimensional vector spaces having ordered bases $\beta$
> and $\gamma$, respectively, and let $T:V\rightarrow W$ be linear. Then
> for each $u \in V$ we have $$[T(u)]_\gamma = [T]_\beta^\gamma[u]_\beta$$

> Proof. Fix $u \in V$, and define the linear transformations $f: F\rightarrow V$ by $f(a)=au$ and $g:F\rightarrow W$ by $g(a)=aT(u)$
> for all $a \in F$. Let $\alpha = \{1\}$ be the standard ordered basis
> for $F$. Notice that $g = Tf$. Identifying column vectors as matrices
> and using Theorem 2.11, we obtain $$[T(u)]_\gamma = [g(1)]_\gamma =
> [g]_\alpha^\gamma=[Tf]_\alpha^\gamma =
> [T]_\beta^\gamma[f]_\alpha^\beta = [T]_\beta^\gamma
> [f(1)]_\beta=[T]_\beta^\gamma[u]_\beta$$

What is that even saying? I thought I was getting the idea of the little "ordered bases" notations after $[T]$, but apparently not. I understand that the notation $[T]_\beta^\gamma$ means the matrix of the transformation from the basis $\beta$ to the basis $\gamma$… that's about all, though, I guess.

Best Answer

The theorem says that you can use $[T]_\beta^\gamma$ in precisely the way we should expect. If $[u]_\beta$ is the coordinate vector of $u$ with respect to $\beta$, then the coordinate vector of the output, that is, $[T(u)]_\gamma$, is precisely what you get from the matrix product $[T]_\beta^\gamma [u]_\beta$.

In other words, the matrix $[T]_\beta^\gamma$, when left-multiplied against a coordinate vector, "computes" the output of the transformation $T$ in coordinates. This is precisely why we care about "the matrix of a linear transformation" in the first place.
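To see the identity numerically, here is a small sketch in Python with NumPy. The transformation $T$, the bases $\beta$ and $\gamma$, and the vector $u$ below are all made-up choices for illustration, not from the book: we build $[T]_\beta^\gamma$ column by column (the $j$-th column is $[T(\beta_j)]_\gamma$) and check that $[T(u)]_\gamma = [T]_\beta^\gamma [u]_\beta$.

```python
import numpy as np

# Hypothetical example: T(x, y) = (x + 2y, 3x), a linear map R^2 -> R^2.
def T(v):
    x, y = v
    return np.array([x + 2 * y, 3 * x])

# Arbitrarily chosen ordered bases for the domain and codomain.
beta = [np.array([1.0, 1.0]), np.array([1.0, -1.0])]   # basis of domain
gamma = [np.array([2.0, 0.0]), np.array([1.0, 1.0])]   # basis of codomain

B = np.column_stack(beta)    # columns are the beta vectors
C = np.column_stack(gamma)   # columns are the gamma vectors

def coords(v, M):
    # [v]_basis: solve M c = v for the coordinate vector c.
    return np.linalg.solve(M, v)

# [T]_beta^gamma: the j-th column is [T(beta_j)]_gamma.
T_mat = np.column_stack([coords(T(b), C) for b in beta])

u = np.array([4.0, -2.0])
lhs = coords(T(u), C)         # [T(u)]_gamma
rhs = T_mat @ coords(u, B)    # [T]_beta^gamma [u]_beta
print(np.allclose(lhs, rhs))  # the two coordinate vectors agree
```

The point of the construction is visible in the code: the matrix is assembled entirely from what $T$ does to the domain basis, yet multiplying it against $[u]_\beta$ reproduces $[T(u)]_\gamma$ for every $u$.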
