Linear Algebra – Why C Commuting with Matrices A and B Implies C is a Scalar Multiple of the Identity

linear-algebra, matrices

I'm self-studying Steven Roman's Advanced Linear Algebra, and this is problem 10 of Chapter 8.

Let $A,B\in M_2(\mathbb{C})$, $A^2=B^3=I$, $ABA=B^{-1}$, but $A\neq I$ and $B\neq I$. If $C\in M_2(\mathbb{C})$ commutes with $A$ and $B$, then $C=rI$ for some $r\in\mathbb{C}$.

Is there a way to solve this without writing out arbitrary matrices and attempting to solve a huge system of equations? The only thing I observe is that $A=A^{-1}$, so $ABA^{-1}=B^{-1}$, i.e. $B\sim B^{-1}$, so $B$ and $B^{-1}$ have the same characteristic polynomial. I'm stymied trying to show $C$ is diagonal, let alone a multiple of $I$. Thanks for any ideas.

I should add that I know that the center $Z(M_n(\mathbb{C}))$ consists of scalar multiples of $I$, but I don't see any reason to assume or prove $C$ commutes with everything.

Best Answer

Note that $B$ must have both primitive cube roots of unity as eigenvalues: $B$ has order $3$, and it is conjugate to its inverse, since $A = A^{-1}$ gives $ABA^{-1} = B^{-1}$. Then note that $A$ and $B$ can have no common eigenvector: if $Av = \lambda v$ and $Bv = \omega v$, then $\lambda^{2} = 1$, so $ABAv = \lambda^{2}\omega v = \omega v$, while $ABAv = B^{-1}v = \omega^{-1}v$; since $\omega$ is a primitive cube root of unity, $\omega \neq \omega^{-1}$, a contradiction.
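For completeness, here is the eigenvalue claim spelled out (a sketch, with $\omega = e^{2\pi i/3}$ and $m_B$ the minimal polynomial of $B$):

```latex
\[
B^3 = I \;\Longrightarrow\; m_B(x) \mid x^3 - 1 = (x-1)(x-\omega)(x-\omega^2),
\]
so $m_B$ is squarefree and $B$ is diagonalizable with eigenvalues among
$1, \omega, \omega^2$. If both eigenvalues were $1$, diagonalizability would
force $B = I$, which is excluded. Since $ABA^{-1} = B^{-1}$, the eigenvalue
multiset of $B$ is closed under $\mu \mapsto \mu^{-1}$; of the remaining
possibilities only $\{\omega, \omega^2\}$ is closed, so these are the
eigenvalues of $B$.
```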

This means that there is no $1$-dimensional subspace invariant under both $A$ and $B$. Now let $\mu$ be an eigenvalue of $C$ (one exists, since we are over $\mathbb{C}$). Because $A$ and $B$ commute with $C$, the $\mu$-eigenspace of $C$ is invariant under both $A$ and $B$; it is nonzero, and by the above it cannot be $1$-dimensional, so it is all of $\mathbb{C}^{2}$, and $C = \mu I.$

This is really an instance of Schur's Lemma from representation theory.
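The argument can be checked numerically. Below is a minimal sketch using a concrete choice of $A$ and $B$ satisfying the hypotheses (the matrices and the basis-by-basis setup are my own, not from the problem): "C commutes with both A and B" is a linear system in the four entries of $C$, and its solution space turns out to be one-dimensional, spanned by $I$.

```python
import numpy as np

w = np.exp(2j * np.pi / 3)                      # primitive cube root of unity
A = np.array([[0, 1], [1, 0]], dtype=complex)   # A^2 = I, A != I
B = np.diag([w, w**2])                          # B^3 = I, eigenvalues w, w^2

I2 = np.eye(2)
assert np.allclose(A @ A, I2)
assert np.allclose(B @ B @ B, I2)
assert np.allclose(A @ B @ A, np.linalg.inv(B))  # ABA = B^{-1}

# Express "C commutes with A and with B" as a linear system K x = 0,
# where x holds the entries of C in the basis E11, E12, E21, E22.
basis = [np.zeros((2, 2), dtype=complex) for _ in range(4)]
for k in range(4):
    basis[k][divmod(k, 2)] = 1

K = np.column_stack(
    [np.concatenate([(E @ A - A @ E).ravel(), (E @ B - B @ E).ravel()])
     for E in basis]
)
null_dim = int(np.sum(np.linalg.svd(K, compute_uv=False) < 1e-10))
print(null_dim)  # 1 -> the commutant of {A, B} is just the scalars C = r*I
```

The identity matrix is always a solution, so a null space of dimension $1$ says it is (up to scalars) the only one, matching the Schur's Lemma conclusion.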