Linear Algebra – Outer Product Using Addition

Tags: linear-algebra, matrices, tensor-products

I am performing the equivalent of an outer product, except I am using addition instead of multiplication. Let’s look at an example:

$$\begin{bmatrix} 1\\2\\3 \end{bmatrix} + \begin{bmatrix} 4&2&0 \end{bmatrix} = \begin{bmatrix} 5&3&1\\ 6&4&2\\ 7&5&3 \end{bmatrix}$$

What is the appropriate way to perform this operation? What is the mathematical notation for this operation?

I read “Why is matrix addition defined as element-by-element addition?” but am still struggling to understand why addition (element-wise) and multiplication (outer product) are treated differently, particularly when your data is in this format (i.e., a column vector and a row vector).

Kronecker sums seem similar to what I am trying to do, but from what I have gathered it is only intended to be used on matrices. Is that true?

Any help would be most appreciated!

Best Answer

What is the appropriate way to perform this operation? What is the mathematical notation for this operation?

I don’t know if it has a name, but you may define your own “outer sum for vectors”: if $x \in \mathcal M_{m\times 1} (\Bbb R)$ is a column vector and $y \in \mathcal M_{1\times n} (\Bbb R)$ is a row vector, then $x \boxplus y \in \mathcal M_{m\times n} (\Bbb R)$ is the matrix with entries ${(x \boxplus y)}_{i,j} := x_i+y_j$. (I chose a plus sign inside a square box for the operator so as to visually represent what happens to the operands.)
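As a quick sketch (the variable names are mine), this entrywise definition is exactly what NumPy broadcasting computes once the column and row shapes are made explicit:

```python
import numpy as np

x = np.array([1, 2, 3])      # column vector, m = 3
y = np.array([4, 2, 0])      # row vector, n = 3

# (x ⊞ y)[i, j] = x[i] + y[j]: reshape x into an (m, 1) column and
# let broadcasting pair every x[i] with every y[j].
outer_sum = x[:, None] + y[None, :]
print(outer_sum)
# [[5 3 1]
#  [6 4 2]
#  [7 5 3]]
```

This reproduces the matrix from the question's example.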

Kronecker sums seem similar to what I am trying to do, but from what I have gathered it is only intended to be used on matrices. Is that true?

Yes, sadly. And it involves identity matrices. Otherwise you have to come up with another name and another symbol for this Kronecker-esque sum of yours. But since you are already familiar with that operation, you may already have noted that if you swap the identity matrices in the Kronecker sum for all-ones (column) vectors $u$ of the proper lengths, you get a matrix form of your “outer sum”: $$x \boxplus y = x\otimes u^\mathrm T_n +u_m\otimes y.$$
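As a sanity check (a NumPy sketch; $u_m$ and $u_n$ are the all-ones vectors from the formula above), the Kronecker-product form reproduces the entrywise definition:

```python
import numpy as np

x = np.array([1, 2, 3])                 # column vector, m = 3
y = np.array([4, 2, 0])                 # row vector, n = 3
u_m = np.ones((3, 1))                   # all-ones column of length m
u_n = np.ones((3, 1))                   # all-ones column of length n

# x ⊗ u_nᵀ + u_m ⊗ y, with x shaped as a column and y as a row
kron_form = np.kron(x[:, None], u_n.T) + np.kron(u_m, y[None, :])
print(kron_form)
# [[5. 3. 1.]
#  [6. 4. 2.]
#  [7. 5. 3.]]
```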

Please note that since most people treat vectors as columns, they may expect the $y$ after the second $\otimes$ to be transposed: $$x \boxplus y = x\otimes u^\mathrm T_n +u_m\otimes y^\mathrm T.$$

Addendum: I just remembered that since these are column-$\otimes$-row products (i.e., column-by-row Kronecker products), they simplify to ordinary outer products: $$x \boxplus y = xu^\mathrm T_n +u_my^\mathrm T.$$ This latter expression is cleaner than the former, in the sense that it avoids Kronecker products for those unfamiliar with them.
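This outer-product form is the easiest one to transcribe directly (again a NumPy sketch with my own variable names):

```python
import numpy as np

x = np.array([1, 2, 3])
y = np.array([4, 2, 0])
u_m = np.ones(3)                        # all-ones vector of length m
u_n = np.ones(3)                        # all-ones vector of length n

# x u_nᵀ + u_m yᵀ: the sum of two rank-one outer products
outer_form = np.outer(x, u_n) + np.outer(u_m, y)
print(outer_form)
# [[5. 3. 1.]
#  [6. 4. 2.]
#  [7. 5. 3.]]
```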

One last equivalent form is $$x \boxplus y = \operatorname{diag}(x) \, u_mu^\mathrm T_n +u_mu^\mathrm T_n\operatorname{diag}(y),$$ but it’s not actually simpler to compute.
