[Math] Significance of Matrix-Vector multiplication

linear-algebra, matrices

Can someone give me an example illustrating the physical significance of matrix-vector multiplication?

  1. Does multiplying a vector by a matrix transform it in some way?
  2. Do left & right multiplication signify two different things?
  3. Is a matrix a scalar thing? (EDIT)

Thank you.

Best Answer

The physical significance depends on the matrix. The primary point is that multiplication by a matrix (in the usual sense, matrix on the left) represents the action of a linear transformation. We'll work with one basis throughout, which we'll use to represent both our matrices and our vectors. Just note that because of linearity, if we have a vector $x = c_1e_1 + \ldots + c_ne_n$ and a linear transformation $L$, then $$ \begin{eqnarray*} L(x) &=& L(c_1e_1 + \ldots + c_ne_n) \\ &=& L(c_1e_1) + \ldots + L(c_ne_n) \\ &=& c_1L(e_1) + \ldots + c_nL(e_n). \end{eqnarray*} $$

This means that any linear transformation is uniquely determined by its effect on a basis. So to define one, we only need to specify its effect on a basis. This is the matrix

$$ \left(L(e_1) \; \ldots \; L(e_n)\right) = \left( \begin{array}{ccc} a_{11} & \ldots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{n1} & \ldots & a_{nn} \end{array} \right) $$

where $a_{ij}$ is the $i$'th component of $L(e_j)$.
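
For instance, if $L$ is the counterclockwise rotation of the plane by $90^\circ$, then $L(e_1) = e_2$ and $L(e_2) = -e_1$, so its matrix is

$$ \left( \begin{array}{cc} 0 & -1 \\ 1 & 0 \end{array} \right), $$

and, say, $L(3e_1 - e_2) = 3L(e_1) - L(e_2) = 3e_2 + e_1$.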

Let's call this matrix $M_L$. We want to define multiplication of $M_L$ and some vector $x$ so that $M_L \cdot x = L(x)$, and there's only one way to do this: because the $j$'th column of $M_L$ is just $L(e_j)$, and in light of our decomposition of the action of $L$ in terms of the $L(e_j)$, we can see that

$$ M_L \cdot x = \left( \begin{array}{ccc} a_{11} & \ldots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{n1} & \ldots & a_{nn} \end{array} \right) \cdot \left( \begin{array}{c} c_1 \\ \vdots \\ c_n \end{array} \right) $$

must equal

$$ c_1\left( \begin{array}{c} a_{11} \\ \vdots \\ a_{n1} \end{array} \right) + \ldots + c_n\left( \begin{array}{c} a_{1n} \\ \vdots \\ a_{nn} \end{array} \right) = \left( \begin{array}{c} c_1a_{11} + \ldots + c_na_{1n} \\ \vdots \\ c_1a_{n1} + \ldots + c_na_{nn} \end{array} \right) $$

which is the standard definition for a vector left-multiplied by a matrix.
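
If it helps to see this concretely, here's a small NumPy sketch (the matrix and vector are just made-up illustrative values) checking that the matrix-vector product agrees with the column combination above:

```python
import numpy as np

# An arbitrary 3x3 matrix and coefficient vector (illustrative values only).
M = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0],
              [4.0, 0.0, 1.0]])
x = np.array([2.0, -1.0, 0.5])

# Standard matrix-vector product M . x
Mx = M @ x

# The same vector built as c_1 * (column 1) + ... + c_n * (column n)
column_combination = sum(c * M[:, j] for j, c in enumerate(x))

print(Mx)                                   # [0.  0.5  8.5] (up to formatting)
print(np.allclose(Mx, column_combination))  # True
```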

EDIT: In response to the question "Is a matrix a scalar thing?": Kind of, but no.

If you consider the most basic linear equation in one variable, $y = mx$, where everything in sight is a scalar, then a matrix generalizes the role played by $m$ to higher dimensions and a vector generalizes the role played by $y$ and $x$ to higher dimensions. But matrices don't commute multiplicatively. So that's one big thing that's different. But they're strikingly similar in a lot of ways. We can define the function of matrices $f(A) = A^2$ and we can differentiate it with respect to $A$. When we do this in one variable with the map $f(x) = x^2$, we get the linear map $f_x'(h) = 2xh$ but when we do it with matrices, we get the linear map $f_A'(H) = AH + HA$. If matrices commuted, then that would just be $2AH$!
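
Here's a quick numerical illustration of that last claim (again just an illustrative NumPy sketch, not part of the argument above): the finite difference $\bigl((A+tH)^2 - A^2\bigr)/t$ for small $t$ matches $AH + HA$, not $2AH$:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))  # a random "point" at which to differentiate
H = rng.standard_normal((3, 3))  # a random direction
t = 1e-6

# Finite-difference approximation of the directional derivative of f(A) = A^2
finite_diff = ((A + t * H) @ (A + t * H) - A @ A) / t

print(np.allclose(finite_diff, A @ H + H @ A, atol=1e-4))  # True
print(np.allclose(finite_diff, 2 * A @ H, atol=1e-4))      # False: AH != HA in general
```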

EDIT2:

My "add comment" button isn't working for some reason. The $e_j$'s are a basis, $e_1, \ldots, e_n$. I think the best thing to do would be to wait for your teacher to get around to it. I sometimes forget that people introduce matrices before vector spaces and linear transformations. It will all make much more sense then. The main point of a basis though is that it's a set of vectors so that every vector in the given space can be written as a unique linear combination of them.
