Is there a theoretical reason for matrix-vector multiplication?

linear-algebra, matrices, vectors

They say matrix multiplication is defined so that it corresponds to composition of linear transformations. What I would like to know is how this relates to matrix-vector multiplication $Ax$, i.e., how a matrix acts on a vector. For example, we can view $Ax$ as a linear combination of the columns of $A$, and from this we get useful notions like the column space and row space.
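The column-combination view mentioned above can be checked numerically. Below is a minimal sketch in plain Python (the function names are illustrative, not from any library): the usual row-by-row product $Ax$ agrees with the sum $x_1(\text{col}_1) + \cdots + x_n(\text{col}_n)$.

```python
def matvec(A, x):
    """Usual matrix-vector product, computed row by row."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def column_combination(A, x):
    """Same product, computed as a linear combination of A's columns."""
    m = len(A)
    result = [0] * m
    for j, xj in enumerate(x):        # for each column j, add x_j * (column j)
        for i in range(m):
            result[i] += xj * A[i][j]
    return result

A = [[1, 2],
     [3, 4],
     [5, 6]]        # a 3 x 2 matrix: columns (1,3,5) and (2,4,6)
x = [10, -1]

print(matvec(A, x))              # [8, 26, 44]
print(column_combination(A, x))  # [8, 26, 44] -- same vector
```

Here $10\,(1,3,5) + (-1)\,(2,4,6) = (8,26,44)$, matching the row-wise computation.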

My question is: Is the way a matrix acts on a vector, which is defined by matrix multiplication, related in any way to the composition of linear transformations, which seems to be the original reason for our definition? Are all the notions and properties related to matrix-vector multiplication just a “definitional byproduct”, or is there some deeper/theoretical reason?

P.S.: I understand that composition of maps versus matrix-vector multiplication are two different things. A composition of maps results in another map, whereas matrix-vector multiplication is a linear map acting on a vector, resulting in another vector. But I’m wondering if they are related in some way. The particular analogy that got me thinking was how $\mathbb{R}$ can be thought of as both a vector space and a field.

Best Answer

If $v=(x_1,\ldots,x_n)\in\Bbb R^n$ is a vector, then the map$$\begin{array}{rccc}L_v\colon&\Bbb R&\longrightarrow&\Bbb R^n\\&t&\mapsto&(tx_1,\ldots,tx_n)\end{array}$$is linear, and every linear map from $\Bbb R$ into $\Bbb R^n$ can be obtained by this process. The matrix of $L_v$ with respect to the standard bases is$$\begin{bmatrix}x_1\\x_2\\\vdots\\x_n\end{bmatrix}.\tag1$$If $A$ is the matrix of a linear map from $\Bbb R^n$ into $\Bbb R^m$, then composing that map with $L_v$ gives a linear map from $\Bbb R$ into $\Bbb R^m$, which corresponds to the vector $A.v$. But computing $A.v$ is precisely computing the product of $A$ with the matrix $(1)$. So, yes, $A.v$ can be seen as a composition of linear maps.
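The identification above can be verified concretely. The sketch below (plain Python, illustrative names) computes $A.v$ in two ways: as the usual matrix-vector product, and as the matrix-matrix product of $A$ with the $n\times 1$ matrix $(1)$ of $L_v$, i.e., the matrix of the composition with $L_v$.

```python
def matmul(A, B):
    """Multiply an m x n matrix by an n x p matrix (lists of rows)."""
    m, n, p = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(p)]
            for i in range(m)]

def matvec(A, v):
    """Usual matrix-vector product: A acting on the vector v."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

A = [[1, 2, 3],
     [4, 5, 6]]          # matrix of a linear map R^3 -> R^2
v = [1, 0, -1]           # a vector in R^3

col_v = [[x] for x in v]       # the n x 1 matrix (1), i.e., the matrix of L_v
composed = matmul(A, col_v)    # matrix of the composition with L_v: R -> R^2

print(matvec(A, v))                   # [-2, -2]
print([row[0] for row in composed])   # [-2, -2] -- the same vector
```

The single column of the composed matrix is exactly $A.v$, mirroring the argument that matrix-vector multiplication is a special case of matrix multiplication, hence of composition.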