[Math] Sum of entries of a matrix

linear-algebra, matrices, noncommutative-algebra

For a matrix $A \in \mathbb{R}^{n \times n}$, it is clear that the sum of all the entries of $A$ can be expressed as
$$\vec{1}^{T} A \vec{1} = \sum \limits_{i,j} A_{i,j}$$

Now suppose $A,B \in \mathbb{R}^{n \times n}$ are symmetric matrices. Since $(AB)^T = B^T A^T = BA$ and the sum of entries is invariant under transposition, the sum of the entries of $AB$ equals that of $BA$, even though the two products are in general distinct as matrices.
So
$$\vec{1}^{T} (AB+BA) \vec{1}=2\vec{1}^{T} AB \vec{1}$$
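As a quick numerical sanity check, here is a short sketch (using NumPy with random, purely illustrative matrices) verifying that $\vec{1}^{T} AB \vec{1} = \vec{1}^{T} BA \vec{1}$ for symmetric $A,B$, even though $AB \neq BA$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Symmetrize random matrices: M + M^T is symmetric for any square M.
A = rng.standard_normal((n, n)); A = A + A.T
B = rng.standard_normal((n, n)); B = B + B.T

ones = np.ones(n)
s_AB = ones @ A @ B @ ones   # sum of all entries of AB
s_BA = ones @ B @ A @ ones   # sum of all entries of BA

print(np.isclose(s_AB, s_BA))      # True
print(np.allclose(A @ B, B @ A))   # False: the products themselves differ
```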

Do we have any analogous expression for higher degrees? That is, suppose we form the sum over all distinct arrangements of a product of $n$ copies of $A$ and $m$ copies of $B$, and let $Symm(A^nB^m)$ denote this sum. For example, when $n=3$ and $m=2$, the expression has $\binom{5}{2}=10$ terms, as follows:
$$Symm(A^3B^2)=A^3B^2 + A^2BAB + A^2B^2A+ABA^2B+ABABA+AB^2A^2+BA^3B+BA^2BA +BABA^2+B^2A^3$$

Can we say anything useful about
$$\vec{1}^{T} Symm(A^nB^m) \vec{1}$$
in terms of $A,B$?
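For small exponents the quantity can at least be computed by brute force. A minimal sketch (the helper names `symm` and `entry_sum` are mine, not standard) that enumerates the $\binom{n+m}{m}$ distinct words and sums the corresponding matrix products:

```python
import numpy as np
from itertools import permutations
from functools import reduce

def symm(A, B, n, m):
    """Sum of all distinct arrangements of n copies of A and m copies of B."""
    words = set(permutations('A' * n + 'B' * m))  # set() removes duplicate words
    mats = {'A': A, 'B': B}
    return sum(reduce(np.matmul, (mats[c] for c in w)) for w in words)

def entry_sum(M):
    """1^T M 1, i.e. the sum of all entries of M."""
    return M.sum()

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3)); A = A + A.T
B = rng.standard_normal((3, 3)); B = B + B.T

S = symm(A, B, 3, 2)
print(len(set(permutations('AAABB'))))  # 10 distinct words, matching C(5,2)
print(entry_sum(S))                     # the quantity 1^T Symm(A^3 B^2) 1
```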


This came up while working on a larger problem, so I've skipped the context for now. I apologize if the question is a bit vague and open-ended; I'll update it promptly based on any feedback. Thanks.

Best Answer

Let $X$ be the square matrix all of whose entries are $1$. (Is there canonical notation for this?) This is a symmetric matrix. $\DeclareMathOperator{\tr}{tr}$

The sum of elements of $A$ is $s(A)=\tr(AX)$. For symmetric $A$ and $B$ the sum of elements in the product is $$ s(AB)=\tr(ABX)=\tr((ABX)^T)=\tr(X^TB^TA^T)=\tr(XBA)=\tr(BAX)=s(BA). $$ This you already knew, but I just wanted to give a new point of view to this fact. Similarly, it's easy to check that $s(A^T)=s(A)$ for any matrix $A$.
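These identities are easy to check numerically; a small sketch (NumPy, with an arbitrary non-symmetric test matrix) assuming the definitions above:

```python
import numpy as np

n = 3
X = np.ones((n, n))                # all-ones matrix; note X = 1 1^T and X^T = X
A = np.arange(9.0).reshape(n, n)   # an arbitrary (non-symmetric) test matrix

# s(A) = tr(AX): (AX)_{ii} = sum_j A_{ij}, so the trace is the total entry sum.
print(np.isclose(np.trace(A @ X), A.sum()))    # True
# s(A^T) = s(A): transposing permutes the entries without changing their sum.
print(np.isclose(np.trace(A.T @ X), A.sum()))  # True
```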

Given the trace's limited commutativity (it is invariant only under cyclic permutations), I don't think there will be a nice identity that lets you treat arbitrary permutations uniformly; matrices don't commute with $X$ in general. It's hard to prove the non-existence of useful things to say, but perhaps the trace formulation helps clarify your thoughts.

For instance, in your example you can take $A=\begin{pmatrix}0&1\\0&0\end{pmatrix}$ and $B=X$. Then every term containing the factor $A^2$ vanishes (since $A^2=0$), and $Symm(A^3B^2)=ABABA=A$. This $A$ is not symmetric; the point is only to emphasize that the order of the matrices can have big effects (also on the trace: typically $\tr(ABC)\neq\tr(BAC)$), and this is true with or without symmetry. In general, the order of the matrices will have a significant effect on the sum of the entries of the product, but it's hard to say more than that. If you have a more specific question, I can try to think of a specific answer.
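The counterexample above is also easy to verify directly (in the $2\times 2$ case, with $B=X$; the final matrix $C$ is my own arbitrary choice to illustrate the trace remark):

```python
import numpy as np

A = np.array([[0., 1.],
              [0., 0.]])       # nilpotent: A @ A is the zero matrix
B = np.ones((2, 2))            # B = X, the all-ones matrix

print(np.allclose(A @ A, 0))               # True: terms containing A^2 vanish
print(np.allclose(A @ B @ A @ B @ A, A))   # True: ABABA = A, the sole survivor

# The trace is invariant only under *cyclic* permutations of a product:
C = np.array([[1., 2.],
              [3., 4.]])
print(np.trace(A @ B @ C), np.trace(B @ A @ C))   # 4.0 vs 7.0: not equal
```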
