[Math] Matrix multiplication of columns times rows instead of rows times columns

linear-algebra, matrices

In ordinary matrix multiplication $AB$, where we multiply each column $b_{i}$ of $B$ by $A$, each resulting column of $AB$ can be viewed as a linear combination of the columns of $A$.

If, however, we decide to multiply each column of $A$ by the corresponding row of $B$, we get an entire matrix for each column-row multiply. My question is: does each matrix resulting from such an outer product have any known meaning, aside from being a summand of the final $AB$?

Edit: Say we have $AB$

$$ \left( \begin{array}{ccc}
1 & 2 & 3 \\
4 & 5 & 6 \\
7 & 8 & 9 \end{array} \right) \left( \begin{array}{ccc}
10 & 11 & 12 \\
13 & 14 & 15 \\
15 & 16 & 17 \end{array} \right) $$

Normally we would multiply each column of $B$ by $A$ and get a linear combination of the columns of $A$, e.g.
$$10\left( \begin{array}{c}
1 \\ 4\\ 7 \end{array} \right)+ 13\left( \begin{array}{c}
2 \\ 5\\ 8 \end{array} \right)+ 15\left( \begin{array}{c}
3 \\ 6\\ 9 \end{array} \right)$$ which is one column of $AB$.

If however we multiply each column of $A$ by each row of $B$, e.g.
$$\left( \begin{array}{c}
1 \\ 4\\ 7 \end{array} \right)\left( \begin{array}{ccc}
10 & 11 & 12 \end{array} \right)$$ we get a matrix. The three matrices $a_{i}b_{i}^{T}$ summed together give us $AB$. I was wondering whether each individual matrix that sums to $AB$ has any sort of special meaning. This second way of performing the multiplication also seems to be called the column-row expansion (http://www.math.nyu.edu/~neylon/linalgfall04/project1/dj/crexpansion.htm). I read about it in, I believe, Section 2.4 of Strang's Introduction to Linear Algebra, where he mentions that not everybody is aware that matrix multiplication can be performed in this way.
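The column-row expansion described above is easy to check numerically. A minimal NumPy sketch, using the matrices from the example:

```python
import numpy as np

# The matrices A and B from the question.
A = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])
B = np.array([[10, 11, 12],
              [13, 14, 15],
              [15, 16, 17]])

# Column-row expansion: AB equals the sum of the outer products
# a_i b_i^T, where a_i is the i-th column of A and b_i^T the i-th row of B.
outer_products = [np.outer(A[:, i], B[i, :]) for i in range(3)]

assert np.array_equal(sum(outer_products), A @ B)
```

Each `outer_products[i]` is one of the rank-1 summands the question asks about.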

Best Answer

Before talking about multiplication of two matrices, let's look at another way to interpret a matrix $A$. Say we have the matrix $A$ below: $$ \begin{bmatrix} 1 & 2 & 3 \\ 1 & 1 & 2 \\ 1 & 2 & 3 \\ \end{bmatrix} $$ We can easily see that the column $\begin{bmatrix} 3 \\ 2 \\ 3 \\\end{bmatrix}$ is a linear combination of the first two columns: $$ 1\begin{bmatrix} 1 \\ 1 \\ 1\\\end{bmatrix} + 1\begin{bmatrix} 2 \\ 1 \\ 2\\\end{bmatrix} = \begin{bmatrix} 3 \\ 2 \\ 3 \\\end{bmatrix} $$ So you can say $\begin{bmatrix} 1 \\ 1 \\ 1 \\\end{bmatrix}$ and $\begin{bmatrix} 2 \\ 1 \\ 2 \\\end{bmatrix}$ form a basis for the column space of $A$.

Set aside for the moment why you would want to decompose matrix $A$ like this in the first place, $$ \begin{bmatrix} 1 & 2 & 3 \\ 1 & 1 & 2 \\ 1 & 2 & 3 \\ \end{bmatrix} = \begin{bmatrix} 1 & 0 & 1 \\ 1 & 0 & 1 \\ 1 & 0 & 1 \\ \end{bmatrix} + \begin{bmatrix} 0 & 2 & 2 \\ 0 & 1 & 1 \\ 0 & 2 & 2 \\ \end{bmatrix} $$ but you can, and in the end it turns out to be a reasonable thing to do.

If you view this equation column-wise, each column $j$ of $A$ is the sum of the corresponding columns $j$ of the matrices on the RHS.

What's special about the matrices on the RHS is that each is a rank-1 matrix whose column space is the line spanned by one of the basis vectors of the column space of $A$. E.g., the column space of $ \begin{bmatrix} 1 & 0 & 1 \\ 1 & 0 & 1 \\ 1 & 0 & 1 \\ \end{bmatrix} $ is spanned by $\begin{bmatrix} 1 \\ 1 \\ 1 \\\end{bmatrix}$ alone. This is why people say rank-1 matrices are the building blocks of all matrices.

If you now revisit the idea of viewing $A$ column by column, this decomposition actually emphasizes the concept of a linear combination of basis vectors.

If this makes sense, you can factor the RHS further: $$ \begin{bmatrix} 1 & 2 & 3 \\ 1 & 1 & 2 \\ 1 & 2 & 3 \\ \end{bmatrix} = \begin{bmatrix} 1 \\ 1 \\ 1 \\\end{bmatrix} \begin{bmatrix} 1 & 0 & 1 \\\end{bmatrix} + \begin{bmatrix} 2 \\ 1 \\ 2 \\\end{bmatrix} \begin{bmatrix} 0 & 1 & 1 \\\end{bmatrix} $$ Each term on the RHS says: take this basis vector and stretch it out into a full $3 \times 3$ matrix (which is nonetheless still rank 1).

And we can massage it a little, namely putting the RHS into matrix form, to get $$ \begin{bmatrix} 1 & 2 & 3 \\ 1 & 1 & 2 \\ 1 & 2 & 3 \\ \end{bmatrix} = \begin{bmatrix} 1 & 2 \\ 1 & 1 \\ 1 & 2 \\ \end{bmatrix} \begin{bmatrix} 1 & 0 & 1 \\ 0 & 1 & 1 \\ \end{bmatrix} $$
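The whole chain of reasoning above can be checked in a few lines of NumPy. This is just a sketch of the answer's example, with `C` holding the basis columns and `R` the combination coefficients:

```python
import numpy as np

# The matrix A from the answer.
A = np.array([[1, 2, 3],
              [1, 1, 2],
              [1, 2, 3]])

C = np.array([[1, 2],
              [1, 1],
              [1, 2]])          # basis columns of the column space of A
R = np.array([[1, 0, 1],
              [0, 1, 1]])       # coefficients expressing each column of A

# The rank-1 building blocks from the decomposition.
pieces = [np.outer(C[:, i], R[i, :]) for i in range(2)]

assert all(np.linalg.matrix_rank(p) == 1 for p in pieces)
assert np.array_equal(sum(pieces), A)   # sum of rank-1 pieces rebuilds A
assert np.array_equal(C @ R, A)         # the same fact written as CR = A
```

The last two assertions are exactly the two readings of the same equation: a sum of outer products versus a single matrix product.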

Now forget matrix $A$, and imagine that all you have are the two matrices on the RHS. If you read this text backward (logically, I mean), I hope matrix multiplication in this fashion makes sense to you now. Or, if you prefer, you can start with the two matrices in the question.