Matrices – Intuition of Cyclic Trace Property and Relation to Determinant

Tags: determinant, matrices, trace

Let $A_1,\ldots,A_n$ be square matrices (not necessarily symmetric). The trace of a matrix product is invariant under cyclic permutations of the factors:
\begin{align*}
\text{trace}(A_nA_{n-1}\cdots A_1) = \text{trace}(A_1A_{n}\cdots A_{2})
\end{align*}
The above holds for any cyclic permutation (informally, taking an element off the end and appending it onto the other end), but not, in general, for arbitrary permutations of the matrix product.
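As a quick numerical sanity check (a sketch using NumPy; the matrix size and random seed are arbitrary choices), one cyclic shift preserves the trace while a non-cyclic reordering generally does not:

```python
import numpy as np

rng = np.random.default_rng(0)
A, B, C = rng.standard_normal((3, 3, 3))  # three random 3x3 matrices

# A cyclic shift (take A off the end, append it at the front) preserves the trace.
t_orig = np.trace(C @ B @ A)
t_cycle = np.trace(A @ C @ B)
assert np.isclose(t_orig, t_cycle)

# A non-cyclic reordering (swap B and C) generally changes the trace;
# for random matrices the two traces differ almost surely.
t_swap = np.trace(B @ C @ A)
assert not np.isclose(t_orig, t_swap)
```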

On the other hand, the determinant of a matrix product is multiplicative: $\text{det}(A_nA_{n-1}\cdots A_1) = \text{det}(A_n)\text{det}(A_{n-1})\cdots \text{det}(A_1)$. Thus,
\begin{align*}
\text{det}(A_nA_{n-1}\cdots A_1) = \text{det}(A_1A_{2}\cdots A_n) = \text{det}(\text{"any permutation of the product of $A$ matrices"})
\end{align*}
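This permutation invariance is easy to confirm numerically (again a sketch with NumPy and arbitrary random matrices): every ordering of the product has the same determinant, namely the product of the individual determinants.

```python
import numpy as np
from itertools import permutations

rng = np.random.default_rng(1)
mats = [rng.standard_normal((3, 3)) for _ in range(3)]

# det is multiplicative, so every ordering of the product has the same determinant.
ref = np.prod([np.linalg.det(M) for M in mats])
for perm in permutations(mats):
    assert np.isclose(np.linalg.det(np.linalg.multi_dot(perm)), ref)
```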

The determinant can be thought of as the (signed) volume scaling factor of its matrix argument, whereas the trace can be thought of as the derivative of the determinant at the identity matrix, $\frac{d}{dt}\text{det}(I + tX)\big|_{t=0} = \text{trace}(X)$, that is, the infinitesimal change in volume.
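This derivative relationship (a special case of Jacobi's formula) can be checked by a finite difference; the following is a sketch with an arbitrary random matrix and step size:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((4, 4))

# Jacobi's formula at the identity: d/dt det(I + tX) at t = 0 equals trace(X).
# det(I) = 1, so a one-sided finite difference approximates the derivative.
eps = 1e-6
numeric = (np.linalg.det(np.eye(4) + eps * X) - 1.0) / eps
assert np.isclose(numeric, np.trace(X), atol=1e-4)
```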

What is the intuition behind the trace property only holding for cyclic permutations where the determinant is the same for any permutation of the matrix product? Can this be explained in a way that is related to the trace being the determinant's derivative?

Best Answer

The cyclicity property is exactly the same as the statement that $f(AB)=f(BA)$, where the function $f$ is trace or determinant or some other invariant. In practice all cyclic $f$ are functions of the coefficients of the characteristic polynomial, a polynomial that is the same for $AB$ and $BA$.

From multilinear algebra we know that the coefficients of the characteristic polynomial of an operator $X$ acting on a vector space $V$ are (up to a $\pm$ sign) the traces of $X$ acting on the exterior powers $\Lambda^i(V)$. The top exterior power, with $i = \dim V$, gives the determinant, and is a one-dimensional vector space. Operators on $V$ induce operators on this one-dimensional space, i.e. multiplications by scalars, and scalars necessarily commute with each other. This is why for $f = \det$, $f(A_1 \cdots A_n)$ is invariant under arbitrary permutations: the induced actions of the $A_i$ on $\Lambda^{\dim V}(V)$ commute. For the other coefficients, such as the trace (the case $i = 1$), $\Lambda^i(V)$ has dimension greater than one, and linear operators on it do not always commute.
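The underlying fact that $AB$ and $BA$ share a characteristic polynomial, and hence all of the invariants above at once, can also be verified numerically (a sketch; `np.poly` of a square matrix returns its characteristic polynomial coefficients):

```python
import numpy as np

rng = np.random.default_rng(3)
A, B = rng.standard_normal((2, 3, 3))  # two random 3x3 matrices

# AB and BA have the same characteristic polynomial, hence the same trace,
# determinant, and every intermediate coefficient (trace on each exterior power).
char_AB = np.poly(A @ B)
char_BA = np.poly(B @ A)
assert np.allclose(char_AB, char_BA)
```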

It would be nice to know a more concrete combinatorial description of this commutativity, one that does not use exterior powers. This is probably the same as the problem of seeing that the sign of a permutation is multiplicative without using determinants of permutation matrices.