Proof about block matrices

block matrices, diagonalization, eigenvalues-eigenvectors, linear algebra, matrices

During a test today I had this question:

Given $$ M = \begin{pmatrix} A & C \\ 0 & B \end{pmatrix}$$ where $A$ and $B$ are $n \times n$ diagonalizable matrices with no eigenvalues in common, prove that $M$ is diagonalizable.


No information about $C$ was given. First, I tried

$$\det(M - \lambda I) = \det(A - \lambda I)\cdot \det(B - \lambda I)$$

So

$$p_M(\lambda)=p_A(\lambda)\cdot p_B(\lambda)$$

So the set of eigenvalues of $M$ is the union of the eigenvalues of $A$ and of $B$ (a disjoint union, since they have none in common). My next step was to compute
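The factorization of the characteristic polynomial can be checked numerically on a small example. The blocks below are hypothetical choices made up for illustration ($A$ and $B$ are upper triangular with visibly disjoint spectra, $C$ is arbitrary):

```python
import numpy as np

# Hypothetical example blocks: A and B are diagonalizable with disjoint spectra.
A = np.array([[1.0, 2.0],
              [0.0, 3.0]])   # eigenvalues 1, 3
B = np.array([[5.0, 1.0],
              [0.0, 7.0]])   # eigenvalues 5, 7
C = np.array([[1.0, -2.0],
              [4.0,  0.5]])  # arbitrary coupling block

# Assemble the block upper-triangular matrix M = [[A, C], [0, B]].
M = np.block([[A, C], [np.zeros((2, 2)), B]])

# Since p_M = p_A * p_B, the spectrum of M is the union of the spectra of A and B.
eig_M = np.sort(np.linalg.eigvals(M).real)
eig_AB = np.sort(np.concatenate([np.linalg.eigvals(A),
                                 np.linalg.eigvals(B)]).real)
print(np.allclose(eig_M, eig_AB))        # True
print(np.allclose(eig_M, [1, 3, 5, 7]))  # True
```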

$$ M^k = \begin{pmatrix}
A^k & C'\\
0 & B^k\\ \end{pmatrix}$$

(for some block $C'$ depending on $k$), so if $m_M(x)$ is the minimal polynomial of $M$, we have that $m_A(x)\mid m_M(x)$ and $m_B(x)\mid m_M(x)$. But how can I conclude from this that the minimal polynomial of $M$ has only linear factors? Since $m_A$ and $m_B$ are coprime (no common eigenvalues), I know that $m_M(x) = m_A(x)\cdot m_B(x)\cdot Q(x)$, but how can I show that $Q(x) = 1$? If $m_M(x) = m_A(x)\cdot m_B(x)$, it's clear that $M$ is diagonalizable, but I can't see how to prove this.
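The block structure of $M^k$ (and hence of $q(M)$ for any polynomial $q$) behind the divisibility claim can also be verified numerically. The blocks here are the same hypothetical example as above:

```python
import numpy as np
from numpy.linalg import matrix_power

# Same hypothetical example blocks as before.
A = np.array([[1.0, 2.0], [0.0, 3.0]])
B = np.array([[5.0, 1.0], [0.0, 7.0]])
C = np.array([[1.0, -2.0], [4.0, 0.5]])
M = np.block([[A, C], [np.zeros((2, 2)), B]])

# Powers of M stay block upper triangular with A^k and B^k on the diagonal,
# so q(M) has q(A) and q(B) as its diagonal blocks for any polynomial q.
# In particular m_M(M) = 0 forces m_A(A)-type blocks to vanish, giving
# m_A | m_M and m_B | m_M.
k = 5
Mk = matrix_power(M, k)
print(np.allclose(Mk[:2, :2], matrix_power(A, k)))  # True
print(np.allclose(Mk[2:, 2:], matrix_power(B, k)))  # True
print(np.allclose(Mk[2:, :2], 0))                   # True: lower-left block stays zero
```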

In the end, my approach turned out to be the same as the one in the first answer, but I am still looking for a solution using the minimal polynomial.

Best Answer

The set of eigenvalues of $M$ is the union of the eigenvalues of $A$ and $B$, and the matrix is diagonalizable. This can be seen as follows:

Let $v$ be an eigenvector of $A$ with eigenvalue $\lambda_A$; then $(v; 0)$ is an eigenvector of $M$ with the same eigenvalue, since $M(v; 0) = (Av; 0) = \lambda_A(v; 0)$. Now let $w$ be an eigenvector of $B$ with eigenvalue $\lambda_B$, and look for an eigenvector of $M$ of the form $(w'; w)$. We have $M(w'; w) = (Aw' + Cw;\, Bw) = (Aw' + Cw;\, \lambda_B w)$, so we need $(Aw' + Cw;\, \lambda_B w) = (\lambda_B w';\, \lambda_B w)$, i.e. $(A - \lambda_B I)w' = -Cw$. Since $\lambda_B$ is not an eigenvalue of $A$, the matrix $A - \lambda_B I$ is non-singular, so we can solve for $w'$. Since $A$ and $B$ are each diagonalizable, this construction produces $2n$ linearly independent eigenvectors of $M$, which proves diagonalizability.
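The eigenvector construction above can be checked directly. Using the same hypothetical blocks as in the question, the sketch below lifts an eigenvector of $A$ as $(v; 0)$ and builds $(w'; w)$ from an eigenvector $w$ of $B$ by solving $(A - \lambda_B I)w' = -Cw$:

```python
import numpy as np

# Hypothetical example blocks; A and B share no eigenvalues.
A = np.array([[1.0, 2.0], [0.0, 3.0]])
B = np.array([[5.0, 1.0], [0.0, 7.0]])
C = np.array([[1.0, -2.0], [4.0, 0.5]])
M = np.block([[A, C], [np.zeros((2, 2)), B]])
n = 2

# Eigenvectors of A lift directly: (v, 0) is an eigenvector of M.
lam_A, V_A = np.linalg.eig(A)
v = np.concatenate([V_A[:, 0], np.zeros(n)])
print(np.allclose(M @ v, lam_A[0] * v))  # True

# For an eigenvector w of B with eigenvalue lam, solve (A - lam*I) w' = -C w;
# this is possible because lam is not an eigenvalue of A, so A - lam*I is invertible.
lam_B, W_B = np.linalg.eig(B)
lam, w = lam_B[0], W_B[:, 0]
w_prime = np.linalg.solve(A - lam * np.eye(n), -C @ w)
u = np.concatenate([w_prime, w])
print(np.allclose(M @ u, lam * u))       # True
```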
