A functional equation of a matrix

determinant, functional-equations, linear algebra, matrices, polynomials

How would one prove the following theorem?

Suppose $p(A)$ is a nonzero polynomial in the entries of $A$ satisfying $p(AB)=p(A)p(B)$ for all $n\times n$ complex matrices $A$ and $B$. Prove that $p(A)=(\det A)^k$ for some nonnegative integer $k$.
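As a quick numerical sanity check (not a proof, numpy assumed), $p(A)=\det(A)^k$ really does satisfy the functional equation; here with $k=2$:

```python
# Sanity check: p(A) = det(A)^k is multiplicative, as the theorem predicts.
import numpy as np

rng = np.random.default_rng(0)
n, k = 3, 2
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

p = lambda M: np.linalg.det(M) ** k
assert np.isclose(p(A @ B), p(A) * p(B))  # p(AB) = p(A) p(B)
```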


My attempt:

A matrix $A$ can be Schur-decomposed as $A=QTQ^\dagger$, where $T$ is upper triangular and $Q$ is unitary. First, $p(I)=1$: indeed $p(A)=p(AI)=p(A)p(I)$ for all $A$, and $p$ is not identically zero. Hence $p(Q)p(Q^\dagger)=p(QQ^\dagger)=p(I)=1$, and so $p(A)=p(Q)p(T)p(Q^\dagger)=p(T)$, since these scalar values commute. If we can show $p(T)=\big(\prod_i T_{i,i}\big)^k$ for some nonnegative integer $k$, then we are done.
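The Schur step can be illustrated numerically (scipy assumed; here $p=\det$ plays the role of a multiplicative $p$):

```python
# Schur decomposition: A = Q T Q^† with T upper triangular and Q unitary,
# so a multiplicative p satisfies p(A) = p(T); for p = det this says
# det(A) equals the product of the diagonal of T.
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
T, Q = schur(A, output='complex')          # A = Q T Q^†

assert np.allclose(A, Q @ T @ Q.conj().T)  # decomposition holds
assert np.allclose(np.triu(T), T)          # T is upper triangular
assert np.isclose(np.linalg.det(A), np.prod(np.diag(T)))
```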

Now suppose $T$ is diagonal with all diagonal entries equal to $1$ except one, which is a complex variable $x$; we show $p(T)=x^k$ for some nonnegative integer $k$. Write $p(T)=\sum_{i=0}^k a_ix^i$ with $a_k\ne0$. Then $\sum_{i=0}^k a_ix^{2i}=p(T^2)=p(T)^2=\big(\sum_{i=0}^k a_ix^i\big)^2$. Expanding the right-hand side and collecting the coefficients of each power of $x$, the linear independence of $\{x^i\}$ forces the coefficients of like powers on the two sides to match. Comparing the terms of order at least $k$: the $x^{2k}$ term gives $a_k=a_k^2$, so $a_k=1$; the odd-order coefficients in that range must vanish; and proceeding recursively down the orders we conclude $a_i=0$ for all $i<k$.
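The coefficient-matching step above can be checked symbolically for a small example (sympy assumed; the condition $f(x^2)=f(x)^2$ holds for the monomial $x^3$ but fails for a non-monomial):

```python
# Symbolic check of f(x^2) = f(x)^2 for a monomial vs. a non-monomial.
import sympy as sp

x = sp.symbols('x')

f = x**3                                         # monomial: passes
assert sp.expand(f.subs(x, x**2) - f**2) == 0

g = x**3 + x                                     # non-monomial: fails
assert sp.expand(g.subs(x, x**2) - g**2) != 0
```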

Since $p(T_1T_2)=p(T_1)p(T_2)$, this gives $p(T)=\prod_i T_{i,i}^{k_i}$ for any diagonal matrix $T$ (a priori each diagonal position may carry its own exponent $k_i$). But I am unable to proceed further to triangular matrices.

Best Answer

Let $D_x=\operatorname{diag}(x,1,\ldots,1)$ and define $f(x)=p(D_x)$. Then $f$ is a polynomial in $x$ and $f(xy)=f(x)f(y)$, so $f(x)=x^k$ for some nonnegative integer $k$. Similarly, let $E_x=\operatorname{diag}(1,x,1,\ldots,1)$, and let $E$ be the elementary matrix obtained from the identity by interchanging rows 1 and 2. Since $E^2=I$, one has $p(E)^2=p(I)=1$. Since $ED_xE=E_x$, we get $p(E_x)=p(E)^2\,p(D_x)=p(D_x)=x^k$. Writing an arbitrary diagonal matrix as a product of such single-variable diagonal matrices, it follows that $p(D)=(\det D)^k$ for every diagonal matrix $D$.
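The conjugation identity $ED_xE=E_x$ is easy to verify numerically (numpy assumed):

```python
# Conjugating diag(x, 1, ..., 1) by the row-swap matrix E moves x to the
# second diagonal slot, giving diag(1, x, 1, ..., 1).
import numpy as np

n, xval = 4, 5.0
E = np.eye(n)
E[[0, 1]] = E[[1, 0]]                 # swap rows 1 and 2 of the identity

D_x = np.diag([xval] + [1.0] * (n - 1))
E_x = np.diag([1.0, xval] + [1.0] * (n - 2))

assert np.allclose(E @ E, np.eye(n))  # E^2 = I
assert np.allclose(E @ D_x @ E, E_x)  # E D_x E = E_x
```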

Let $G$ be an elementary matrix obtained by interchanging rows $i$ and $j$. Then $G^2=I$, and so $p(G)=\pm 1$.

Let $F$ be the elementary matrix obtained by adding $r$ times row $i$ to row $j \neq i$. We show that $p(F)=1$. Let $D(x)$ be the diagonal matrix with entry $x$ in row $i$ and $1$ in the other diagonal entries. Then $p(D(x)FD(1/x))=p(D(x))\,p(F)\,p(D(1/x))=x^k\,p(F)\,x^{-k}=p(F)$ for every $x\neq0$, while $D(x)FD(1/x) \rightarrow I$ as $x\to\infty$. By the continuity of $p$ (it is a polynomial), we must have $p(F)=p(I)=1=\det(F)^k$.
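The limit $D(x)FD(1/x)\to I$ comes from the off-diagonal entry shrinking like $r/x$; a quick numerical illustration (numpy assumed):

```python
# With F = I + r e_j e_i^T (add r times row i to row j), conjugation gives
# D(x) F D(1/x) = I + (r/x) e_j e_i^T, which tends to I as x -> infinity.
import numpy as np

n, i, j, r = 4, 0, 2, 7.0              # 0-based indices, i != j
F = np.eye(n)
F[j, i] = r                            # F adds r * (row i) to row j

def conj(xval):
    D = np.eye(n); D[i, i] = xval          # D(x): x in slot i
    Dinv = np.eye(n); Dinv[i, i] = 1.0 / xval
    return D @ F @ Dinv

assert np.isclose(conj(10.0)[j, i], r / 10.0)        # entry is r/x
assert np.linalg.norm(conj(1e6) - np.eye(n)) < 1e-5  # close to I
```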

Every invertible matrix is a product of elementary matrices, so $p(A)=\pm (\det A)^k$ for all invertible $A$, and by continuity the identity extends to all matrices. Since $p(A)$ is a polynomial in the entries of $A$, a single choice of sign must hold throughout, and $p(A)=-(\det A)^k$ does not satisfy the functional equation (it would give $p(I)=-1\neq1$). It follows that $p(A)=(\det A)^k$.
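The sign argument can also be seen numerically (numpy assumed): $q(A)=-\det(A)^k$ has $q(A)q(B)=+\det(AB)^k\neq q(AB)$ whenever $\det(AB)\neq0$:

```python
# The negative-sign candidate q(A) = -det(A)^k fails the functional equation.
import numpy as np

rng = np.random.default_rng(2)
k = 2
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

q = lambda M: -np.linalg.det(M) ** k
assert not np.isclose(q(A @ B), q(A) * q(B))  # q(AB) != q(A) q(B)
```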