[Math] Necessary condition for matrix multiplication commutative (and if for permutation matrix)

matrices

This is a very old and popular question; the following link points to a similar question:

When is matrix multiplication commutative?

And there is a very famous theorem:

If $A,B$ are simultaneously diagonalizable, then $AB=BA$.

However, the converse is not true. For example,

$A = \begin{bmatrix}1 & 1 \\0 & 1\end{bmatrix}$ and $B = \begin{bmatrix}1 & 0 \\0 & 1\end{bmatrix}$

commute, but $A$ is not diagonalizable, so $A$ and $B$ cannot be simultaneously diagonalized.
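This failure of the converse can be checked numerically. Below is a minimal sketch (the matrices are an assumed illustration, not from the post): a Jordan block commutes with the identity, yet it is not diagonalizable, so the pair cannot be simultaneously diagonalized.

```python
import numpy as np

# Assumed example: J is a 2x2 Jordan block, so it is NOT diagonalizable,
# yet it commutes with the identity I.  This shows AB = BA alone cannot
# force simultaneous diagonalizability.
J = np.array([[1.0, 1.0],
              [0.0, 1.0]])
I = np.eye(2)

# Commutation holds trivially:
assert np.allclose(J @ I, I @ J)

# J has eigenvalue 1 with algebraic multiplicity 2, but the eigenspace
# null(J - I) has dimension 2 - rank(J - I) = 1 < 2, so J is not
# diagonalizable.
rank_eigenspace = np.linalg.matrix_rank(J - np.eye(2))
print(rank_eigenspace)  # 1
```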

My question is:

Is there any necessary but not sufficient condition to have $AB=BA$?

i.e., if $AB=BA$, then this condition must hold but not vice versa.

Another question is:

If I restrict attention to permutation matrices, which arise in the study of commutative groups, can I say anything about a necessary condition for $AB=BA$?

Best Answer

  1. Let $A,B\in M_n(K)$ where $K$ is algebraically closed. Then $AB=BA\implies$ $A,B$ are simultaneously triangularizable $\implies$ $A,B$ have the Motzkin–Taussky property $L$ (that is, there are orderings of $\operatorname{spectrum}(A)=(\lambda_i)$ and $\operatorname{spectrum}(B)=(\mu_i)$ such that, for every $t\in K$, $\operatorname{spectrum}(A+tB)=(\lambda_i+t\mu_i)$). The converses are false.
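Property $L$ is easy to test on samples of $t$. A minimal sketch (the two matrices are an assumed illustration): an upper-triangular pair is already simultaneously triangularized, so its diagonal entries give the required pairing of the spectra, and one can verify $\operatorname{spectrum}(A+tB)=(\lambda_i+t\mu_i)$ for sampled $t$. This pair does not even commute, which also illustrates that the converse implications fail.

```python
import numpy as np

# Assumed example: two upper-triangular matrices.  They are
# simultaneously triangularizable by construction, so they have
# property L with the pairing given by the diagonal entries.
A = np.array([[1.0, 2.0],
              [0.0, 3.0]])
B = np.array([[4.0, 5.0],
              [0.0, 6.0]])

# Note: A and B do NOT commute, yet property L still holds,
# illustrating that property L does not imply AB = BA.
assert not np.allclose(A @ B, B @ A)

for t in (0.0, 1.0, -2.5, 7.0):
    got = np.sort(np.linalg.eigvals(A + t * B))
    expected = np.sort(np.diag(A) + t * np.diag(B))  # lambda_i + t*mu_i
    assert np.allclose(got, expected)
print("property L verified for sampled t")
```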

EDIT. 2. For two permutations $\sigma,\tau$, the result is well known once we write $\sigma=c_1\circ\cdots \circ c_k$ as a product of disjoint cycles. Then $\sigma\circ\tau=\tau\circ\sigma$ iff there is a permutation $c_{i_1},\cdots,c_{i_k}$ of the $(c_j)$ such that, for every $j$, the image under $\tau$ of a support of $c_j$ is a support of $c_{i_j}$.

Example. $\sigma=(1234)(5678),\tau=\begin{pmatrix}1&2&3&4&5&6&7&8\\6&7&8&5&3&4&1&2\end{pmatrix}$. By definition, a support of $(1234)$ is one of the $4$ sequences $[1,2,3,4],[2,3,4,1],[3,4,1,2],[4,1,2,3]$.
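The example can be verified directly: $\sigma$ and $\tau$ commute, and $\tau$ sends the support $[1,2,3,4]$ of the first cycle onto a support of the second. A short sketch (permutations encoded as plain dicts; `compose` is a helper introduced here):

```python
# sigma = (1 2 3 4)(5 6 7 8) and tau from the example, as mappings x -> image.
sigma = {1: 2, 2: 3, 3: 4, 4: 1, 5: 6, 6: 7, 7: 8, 8: 5}
tau   = {1: 6, 2: 7, 3: 8, 4: 5, 5: 3, 6: 4, 7: 1, 8: 2}

def compose(f, g):
    """f after g: x -> f(g(x))."""
    return {x: f[g[x]] for x in g}

# The two permutations commute:
assert compose(sigma, tau) == compose(tau, sigma)

# tau maps the support [1,2,3,4] of (1 2 3 4) to [6,7,8,5],
# one of the four supports of (5 6 7 8).
image = [tau[x] for x in [1, 2, 3, 4]]
supports_of_5678 = [[5, 6, 7, 8], [6, 7, 8, 5], [7, 8, 5, 6], [8, 5, 6, 7]]
assert image in supports_of_5678
print(image)  # [6, 7, 8, 5]
```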

  3. Motzkin and Taussky proved this pretty converse: let $A,B\in M_n(\mathbb{C})$ be diagonalizable and such that, for every $t\in \mathbb{C}$, $A+tB$ is diagonalizable; then $AB=BA$.
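The contrapositive can be illustrated numerically. A sketch with an assumed example pair: $A$ (the swap matrix) and $B=\operatorname{diag}(1,-1)$ are each diagonalizable but do not commute, so by the Motzkin–Taussky converse some $A+tB$ must fail to be diagonalizable; indeed $t=i$ works, since $A+iB$ is nonzero and nilpotent.

```python
import numpy as np

# Assumed example: A and B are each diagonalizable but do not commute.
A = np.array([[0, 1], [1, 0]], dtype=complex)   # swap matrix, eigenvalues +-1
B = np.array([[1, 0], [0, -1]], dtype=complex)  # diagonal
assert not np.allclose(A @ B, B @ A)            # AB != BA

# By Motzkin-Taussky, A + tB cannot be diagonalizable for ALL t in C.
# At t = i, M = [[i, 1], [1, -i]] squares to zero: M is a nonzero
# nilpotent matrix, so its only eigenvalue is 0 yet M != 0, hence
# M is not diagonalizable.
M = A + 1j * B
assert np.allclose(M @ M, np.zeros((2, 2)))
assert not np.allclose(M, np.zeros((2, 2)))
```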