[Math] Block diagonal matrix diagonalizable

Tags: block matrices, diagonalization, linear algebra, matrices

I am trying to prove that:


The matrix $C = \left(\begin{smallmatrix}A& 0\\0 & B\end{smallmatrix}\right)$ is diagonalizable if and only if $A$ and $B$ are diagonalizable.


If $A\in GL(\mathbb{C}^n)$ and $B\in GL(\mathbb{C}^m)$ are diagonalizable, then it is easy to check that $C\in GL(\mathbb{C}^{n+m})$ is diagonalizable. But if I suppose that $C$ is diagonalizable, then there exists $S = [S_1, S_2, \ldots, S_{n+m}]$, with columns $S_i\in\mathbb{C}^{m+n}$, such that $S^{-1}CS = \mbox{diag}(\lambda_i)$. Now $CS_i = \lambda_iS_i$, and if $S_i = \left(\begin{smallmatrix}x_i\\y_i\end{smallmatrix}\right)$ with $x_i\in\mathbb{C}^n$ and $y_i\in\mathbb{C}^m$, then
$$Ax_i = \lambda_ix_i\quad\mbox{ and }\quad By_i = \lambda_iy_i.$$
So, if I can justify that $\{x_1,\ldots,x_{n+m}\}$ contains exactly $n$ linearly independent vectors and $\{y_1,\ldots,y_{n+m}\}$ contains $m$ linearly independent vectors, I will have proved that $A$ and $B$ are diagonalizable, but I don't know how to prove that. Does anyone have an idea? Thanks in advance.
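(Not part of the original question: here is a minimal sympy sketch of the component equations above, with small example matrices of my own choosing, checking that each eigenvector of $C$ splits into an $x_i$ part and a $y_i$ part satisfying the two displayed equations.)

```python
# Check that each eigenvector S_i = (x_i, y_i) of C = diag(A, B) satisfies
# A x_i = lambda_i x_i and B y_i = lambda_i y_i (illustration only).
from sympy import Matrix, diag

A = Matrix([[2, 1], [0, 3]])   # example diagonalizable A (n = 2)
B = Matrix([[5]])              # example diagonalizable B (m = 1)
C = diag(A, B)                 # block diagonal C of size n + m

for lam, mult, vecs in C.eigenvects():
    for S_i in vecs:
        x_i, y_i = S_i[:2, :], S_i[2:, :]   # split into the C^n and C^m parts
        assert A * x_i == lam * x_i         # A x_i = lambda_i x_i
        assert B * y_i == lam * y_i         # B y_i = lambda_i y_i
```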

Best Answer

Short answer: the minimal polynomial of $C$ is the monic lcm of the minimal polynomials of $A$ and $B$. And a square matrix is diagonalizable if and only if its minimal polynomial splits (which is automatic over $\mathbb{C}$, of course) with only simple roots. In other words, as pointed out by DonAntonio: if and only if its minimal polynomial is a product of pairwise distinct monic linear factors, over the field under consideration, of course.
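(A hedged illustration of this criterion, with matrices I picked myself: $A=\left(\begin{smallmatrix}1&1\\0&1\end{smallmatrix}\right)$ has minimal polynomial $(X-1)^2$, $B=(1)$ has $X-1$, so the lcm is $(X-1)^2$, which has a repeated root, and indeed $C$ is not diagonalizable.)

```python
# Minimal polynomial of C = diag(A, B) is the lcm of those of A and B:
# here it is (X-1)^2, with a repeated root, so C is not diagonalizable.
from sympy import Matrix, diag, eye, zeros

A = Matrix([[1, 1], [0, 1]])   # Jordan block: not diagonalizable
B = Matrix([[1]])              # trivially diagonalizable
C = diag(A, B)

assert (C - eye(3)) != zeros(3)       # (X - 1) does not annihilate C ...
assert (C - eye(3))**2 == zeros(3)    # ... but (X - 1)^2 does
assert not C.is_diagonalizable()      # repeated root in the minimal polynomial
assert not A.is_diagonalizable() and B.is_diagonalizable()
```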

Now I'll give a detailed argument without explicit use of minimal polynomials.

Fact: a square matrix $M$ with coefficients in a field $K$ is diagonalizable if and only if there exists a nonzero polynomial $p(X)\in K[X]$ which splits over $K$ with simple roots and such that $p(M)=0$.

Proof: if $M$ is diagonalizable and $\{\lambda_1,\ldots,\lambda_k\}$ is the set of its distinct eigenvalues, then $p(X)=(X-\lambda_1)\cdots(X-\lambda_k)$ annihilates $M$. Conversely, if such a polynomial $p(X)$ with the $\lambda_j$ pairwise distinct annihilates $M$, we have (by Bézout, essentially): $K^n=\mbox{Ker } p(M)=\bigoplus_{j=1}^k\mbox{Ker } (M-\lambda_j I_n)$. Concatenating bases of these eigenspaces gives a basis of eigenvectors of $M$, so diagonalizability follows. QED.
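(A small sketch of the Fact with a concrete matrix of my own, not from the answer: $M=\left(\begin{smallmatrix}0&1\\1&0\end{smallmatrix}\right)$ is annihilated by $p(X)=(X-1)(X+1)$, which splits with simple roots, and the two kernels indeed fill up the whole space.)

```python
# p(M) = 0 for p(X) = (X-1)(X+1), a split polynomial with simple roots,
# and K^2 = Ker(M - I) (+) Ker(M + I), so M is diagonalizable.
from sympy import Matrix, eye, zeros

M = Matrix([[0, 1], [1, 0]])
assert (M - eye(2)) * (M + eye(2)) == zeros(2)   # p(M) = 0

dims = len((M - eye(2)).nullspace()) + len((M + eye(2)).nullspace())
assert dims == 2                                 # kernels span the whole space
assert M.is_diagonalizable()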

Now for every polynomial $p(X)$, you have $$ p(C)=\left(\matrix{p(A)&0\\0&p(B)}\right). $$ This gives you the annoying direction: if $C$ is diagonalizable, take $p(X)$ split with simple roots such that $p(C)=0$; then $p(A)=0$ and $p(B)=0$, so the Fact shows that $A$ and $B$ are diagonalizable.
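(A quick check that polynomials act blockwise, with a sample polynomial $p(X)=X^2-3X+2$ and matrices of my own choosing; just an illustration of the identity $p(C)=\mathrm{diag}(p(A),p(B))$.)

```python
# Verify p(C) = diag(p(A), p(B)) for one sample polynomial p.
from sympy import Matrix, diag, eye

def p(M):
    n = M.shape[0]
    return M**2 - 3*M + 2*eye(n)   # p(X) = X^2 - 3X + 2, evaluated at a matrix

A = Matrix([[2, 1], [0, 3]])
B = Matrix([[5, 0], [1, 4]])
C = diag(A, B)

assert p(C) == diag(p(A), p(B))
```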

The converse is easier. Take $P$ and $Q$ invertible such that $PAP^{-1}$ and $QBQ^{-1}$ are diagonal. Then $$ R:=\left(\matrix{P&0\\0&Q}\right) $$ is invertible with $$ R^{-1}=\left(\matrix{P^{-1}&0\\0&Q^{-1}}\right)\qquad \mbox{and}\qquad RCR^{-1}=\left(\matrix{PAP^{-1}&0\\0&QBQ^{-1}}\right), $$ which is diagonal.
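(The same construction, spelled out as a hedged sympy sketch with my own example matrices. Note that sympy's `diagonalize()` returns $(P, D)$ with $A = PDP^{-1}$, i.e. its $P$ plays the role of $P^{-1}$ in the notation above; the block matrix $R$ still conjugates $C$ to a diagonal matrix.)

```python
# Build R = diag(P, Q) from diagonalizations of A and B and check that it
# conjugates C = diag(A, B) to a diagonal matrix.
from sympy import Matrix, diag

A = Matrix([[2, 1], [0, 3]])
B = Matrix([[5, 0], [1, 4]])
C = diag(A, B)

P, DA = A.diagonalize()   # A = P * DA * P^{-1}
Q, DB = B.diagonalize()   # B = Q * DB * Q^{-1}

R = diag(P, Q)            # block diagonal change of basis
assert R.inv() == diag(P.inv(), Q.inv())
assert R.inv() * C * R == diag(DA, DB)   # diagonal, as claimed
```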

Note: you can also do the converse with the fact above. Just take the lcm of the minimal polynomials of $A$ and $B$.