Example of linear transformations forming a vector space

functional-analysis, linear-algebra, linear-transformations

I am currently studying a book on Functional Analysis by George Bachman and Lawrence Narici. In one of the first chapters, they describe how a (linear) mapping $A$ can map from a vector space $X$ to a different vector space $Y$:

$$A:X\rightarrow Y$$

which has the (linear) properties that for any scalars $\alpha$ and $\beta$ and any vectors $x$ and $y$, we have:

$$A(\alpha x + \beta y) = \alpha A(x)+\beta A(y)$$

Now the critical part: they state that "the collection of all linear transformations mapping $X$ into $Y$ can be viewed as a vector space by defining addition of the linear transformations $A$ and $B$ to be that transformation which takes $x$ into $Ax+Bx$; symbolically, we have:
$$(A+B)x=Ax+Bx$$
and for scalar multiplication:"
$$(\alpha A)x=\alpha A x$$

where they omit the parentheses around $x$ for brevity. I see how these operations preserve linearity, but how can the collection of these transformations be interpreted as a vector space? As far as I have gathered, a vector space should have basis vectors. What would these basis vectors be in the case of $A$?

I have an unfinished example below, which you could use to explain your answer. Otherwise feel free to use your own examples, if this is easier.

Example: $x \in X = \mathbb{R}^2$ and $y \in Y = \mathbb{R}^3$. In this case $A$ in $y=Ax$ would be a $3 \times 2$ matrix. What would be the basis vectors of the vector space interpretation of $A$?

Best Answer

Regarding your example: Any $3 \times 2$ matrix can be written as a linear combination of $\{E_{ij}\}$, where $E_{ij}$ is the matrix that is $0$ everywhere except for a $1$ in position $(i,j)$.
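To make this concrete (the particular matrix below is just an illustrative choice):

$$\begin{pmatrix} 1 & 4 \\ 2 & 5 \\ 3 & 6 \end{pmatrix} = 1\,E_{11} + 4\,E_{12} + 2\,E_{21} + 5\,E_{22} + 3\,E_{31} + 6\,E_{32}, \qquad \text{e.g. } E_{21} = \begin{pmatrix} 0 & 0 \\ 1 & 0 \\ 0 & 0 \end{pmatrix}.$$

The six matrices $E_{11}, E_{12}, E_{21}, E_{22}, E_{31}, E_{32}$ are linearly independent and span all $3 \times 2$ matrices, so they form a basis, and the space of linear transformations from $\mathbb{R}^2$ to $\mathbb{R}^3$ has dimension $3 \cdot 2 = 6$.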

A better way of looking at vector spaces is to concentrate on their algebraic properties: the essential property is linearity, the fact that elements of the vector space can be multiplied by scalars and added together to form new elements of the vector space. A function between vector spaces is linear if it preserves this essential property: the image of a linear combination is the corresponding linear combination of the images.
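For instance, using only the definition $(A+B)x = Ax + Bx$ from your question, one can check that the sum of two linear transformations is again linear:

$$(A+B)(\alpha x + \beta y) = A(\alpha x + \beta y) + B(\alpha x + \beta y) = \alpha Ax + \beta Ay + \alpha Bx + \beta By = \alpha (A+B)x + \beta (A+B)y.$$

The remaining vector space axioms are verified pointwise in the same way, with the zero transformation $x \mapsto 0$ playing the role of the zero vector.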

Every vector space has a basis (in the infinite-dimensional case this requires the axiom of choice), and in the finite-dimensional case this is exactly why linear transformations can be expressed in terms of matrix multiplication.
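A sketch of why, in the finite-dimensional case (the bases $\{e_1,\dots,e_n\}$ of $X$ and $\{f_1,\dots,f_m\}$ of $Y$ are just notation chosen here for illustration): a linear transformation $A$ is completely determined by the scalars $a_{ij}$ in

$$A e_j = \sum_{i=1}^{m} a_{ij} f_i, \qquad j = 1, \dots, n,$$

and these scalars are exactly the entries of the $m \times n$ matrix representing $A$. In your example $m = 3$ and $n = 2$, and $E_{ij}$ corresponds to the transformation that sends $e_j$ to $f_i$ and the other basis vector of $X$ to $0$.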