Transforming a matrix into block diagonal structure with particular eigenvalue order

linear algebra, matrices, matrix decomposition

Problem: Suppose $A \in \mathbb{R}^{n \times n}$ has eigenvalues $\{\lambda_1,\dots,\lambda_n\} \subset \mathbb{C}$ (not necessarily distinct) where $\operatorname{Re}\{\lambda_i\} \le \operatorname{Re}\{\lambda_{i+1}\}$ for $i \in \{1,\dots,n-1\}$. Show that for any $k \in \{1,\dots,n-1\}$ there exists an invertible matrix $T \in \mathbb{R}^{n \times n}$ such that:

$TAT^{-1}=\begin{bmatrix} A_1 & 0 \\ 0 & A_2 \end{bmatrix}$, where the matrix $A_1 \in \mathbb{R}^{k \times k}$ has eigenvalues $\{\lambda_1,\dots,\lambda_k\} \subset \mathbb{C}$ and $A_2 \in \mathbb{R}^{(n-k) \times (n-k)}$ has eigenvalues $\{\lambda_{k+1},\dots,\lambda_n\} \subset \mathbb{C}$.

Attempted solution: We know that $A$ is similar to a real Jordan canonical form, which has this block diagonal structure. Moreover, we can identify the eigenvalues associated with each block from the characteristic polynomial, which is easy to compute from the Jordan form because the determinant of a block triangular matrix is the product of the determinants of its diagonal blocks.

However, the exact sizes of the blocks depend on the algebraic and geometric multiplicities of the eigenvalues, so it seems difficult to show that the diagonal blocks can be made to satisfy $A_1 \in \mathbb{R}^{k \times k}$ and $A_2 \in \mathbb{R}^{(n-k) \times (n-k)}$. Furthermore, most textbooks only show that $A$ is similar to some Jordan form, ignoring the order of the eigenvalues. Of course, once we have transformed $A$ to a Jordan form we could permute the basis vectors to recover the order in which the eigenvalues should appear, but then we would lose the block structure.

According to the following question, the order in which the Jordan blocks appear does not matter, since all such arrangements are similar to one another:

Does the Jordan form of the matrix depend on where the Jordan blocks are placed?

Question: Is there an alternative way to prove this (possibly without using Jordan canonical forms)? This would let us avoid reproducing the argument used to establish the existence of Jordan canonical forms, worrying about the order in which the eigenvalues of the various blocks appear, and tracking how the algebraic and geometric multiplicities of each eigenvalue affect the block sizes.

Best Answer

Let $S_1 = \{\lambda_1,\dots,\lambda_k\}$ and $S_2 = \{\lambda_{k+1},\dots,\lambda_m\}$ be disjoint sets that together comprise the distinct eigenvalues of $A$ (so the indices here run over the $m$ distinct eigenvalues), chosen such that $\lambda \in S_1 \implies \bar \lambda \in S_1$; since the full spectrum of the real matrix $A$ is closed under conjugation, $S_2$ is then closed under conjugation as well. Let $p(x) = (x-\lambda_1)\cdots(x-\lambda_k)$ and $q(x) = (x-\lambda_{k+1})\cdots(x-\lambda_m)$. Because of the way that $S_1,S_2$ were defined, both $p,q$ are polynomials with real coefficients.

Since the characteristic polynomial of $A$ divides $[p(x)q(x)]^n$, the Cayley-Hamilton theorem gives $[p(A)]^n[q(A)]^n = 0$. Note that $U_1 = \ker([p(A)]^n)$ and $U_2 = \ker([q(A)]^n)$ are invariant subspaces of $\Bbb R^n$, because $A$ commutes with every polynomial in $A$.
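
For concreteness, here is a minimal numerical sketch (not part of the original answer) that checks these two facts with SymPy. The sample matrix $A$, the matrices `J` and `M`, the helper `poly_of_matrix`, and the choice $S_1 = \{-2,\,-1\pm i\}$, $S_2 = \{3\}$ are all hypothetical choices made only for this illustration.

```python
import sympy as sp

x = sp.symbols('x')
n = 4

# Sample matrix: A = M J M^{-1}, where J is a real block form with eigenvalues
# -2, -1+i, -1-i, 3 and M is an arbitrary invertible matrix, so A has known
# eigenvalues but no visible block structure.
J = sp.Matrix([[-2,  0,  0, 0],
               [ 0, -1,  1, 0],
               [ 0, -1, -1, 0],
               [ 0,  0,  0, 3]])
M = sp.Matrix([[1, 2, 0, 1],
               [0, 1, 1, 0],
               [1, 0, 1, 1],
               [2, 1, 0, 1]])
A = M * J * M.inv()

# S1 = {-2, -1+i, -1-i} (closed under conjugation), S2 = {3}:
p = sp.Poly((x + 2) * (x**2 + 2*x + 2), x)   # real coefficients, roots in S1
q = sp.Poly(x - 3, x)                        # real coefficients, roots in S2

def poly_of_matrix(poly, B):
    """Evaluate a scalar polynomial at a square matrix via Horner's rule."""
    result = sp.zeros(B.rows, B.cols)
    for c in poly.all_coeffs():              # highest-degree coefficient first
        result = result * B + c * sp.eye(B.rows)
    return result

pA = poly_of_matrix(p, A)
qA = poly_of_matrix(q, A)

# Cayley-Hamilton: the characteristic polynomial divides [p(x) q(x)]^n,
# so [p(A)]^n [q(A)]^n must vanish.
assert pA**n * qA**n == sp.zeros(n, n)

# U1 = ker([p(A)]^n) and U2 = ker([q(A)]^n) are A-invariant.
U1_basis = (pA**n).nullspace()
U2_basis = (qA**n).nullspace()
assert all(pA**n * (A * v) == sp.zeros(n, 1) for v in U1_basis)
print(len(U1_basis), len(U2_basis))          # 3 and 1: total multiplicities in S1 and S2
```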

Claim: $\Bbb R^n = U_1 \oplus U_2$.

Proof of claim: Note that because $p,q$ have no common roots, the polynomials $[p(x)]^n$ and $[q(x)]^n$ are relatively prime, so there exist real polynomials $f,g$ such that $f(x)[p(x)]^n + g(x)[q(x)]^n = 1$, from which it follows that $f(A)[p(A)]^n + g(A)[q(A)]^n = I$.

To see that $U_1,U_2$ intersect trivially (i.e. $U_1 \cap U_2 = \{0\}$), note that if $v \in U_1 \cap U_2$, it follows that $$ v = Iv = \bigl[f(A)[p(A)]^n + g(A)[q(A)]^n\bigr]v = f(A)\bigl[[p(A)]^n v\bigr] + g(A)\bigl[[q(A)]^n v\bigr] = 0. $$ To see that $U_1 + U_2 = \Bbb R^n$, note that any $v$ can be decomposed as $$ v = [q(A)]^n g(A)v + [p(A)]^n f(A)v. $$ Because polynomials in $A$ commute and $[p(A)]^n[q(A)]^n = 0$, applying $[p(A)]^n$ to the first summand and $[q(A)]^n$ to the second gives zero, so $[q(A)]^n g(A)v \in U_1$ and $[p(A)]^n f(A)v \in U_2$. $\square$
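
Continuing the illustrative sketch above (reusing `A`, `p`, `q`, `pA`, `qA`, `n`, and `poly_of_matrix`, and again not part of the answer itself), the Bézout identity and the resulting decomposition can be checked directly; SymPy's extended Euclidean algorithm `gcdex` supplies $f$ and $g$:

```python
# Extended Euclidean algorithm: f(x) * [p(x)]^n + g(x) * [q(x)]^n = 1,
# possible because p and q share no roots.
f_expr, g_expr, h = sp.gcdex((p.as_expr())**n, (q.as_expr())**n, x)
assert h == 1

fA = poly_of_matrix(sp.Poly(f_expr, x), A)
gA = poly_of_matrix(sp.Poly(g_expr, x), A)
assert fA * pA**n + gA * qA**n == sp.eye(n)   # f(A)[p(A)]^n + g(A)[q(A)]^n = I

# Split an arbitrary vector: the first piece lies in U1, the second in U2.
v = sp.Matrix([1, 2, 3, 4])
v1 = qA**n * gA * v
v2 = pA**n * fA * v
assert v1 + v2 == v
assert pA**n * v1 == sp.zeros(n, 1)           # v1 is killed by [p(A)]^n, so v1 lies in U1
assert qA**n * v2 == sp.zeros(n, 1)           # v2 is killed by [q(A)]^n, so v2 lies in U2
```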

Now, if $v_1,\dots,v_d$ is a basis of $U_1$ and $v_{d+1},\dots,v_n$ is a basis of $U_2$, then, because each $U_i$ is $A$-invariant, the matrix of $A$ relative to the basis $\{v_1,\dots,v_n\}$ has the desired block-diagonal form; taking $T^{-1}$ to be the matrix whose columns are $v_1,\dots,v_n$ gives $TAT^{-1}$ as required. Moreover, $d = \dim U_1$ equals the sum of the algebraic multiplicities of the eigenvalues in $S_1$, so when the first $k$ eigenvalues of the original ordered list (counted with multiplicity) are exactly those with values in $S_1$, we get $d = k$, hence $A_1 \in \mathbb{R}^{k \times k}$ and $A_2 \in \mathbb{R}^{(n-k) \times (n-k)}$ with the stated spectra.
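
Again continuing the same illustrative sketch: stacking the two kernel bases as the columns of $T^{-1}$ exhibits the block-diagonal form for the sample matrix, with a $3 \times 3$ block carrying the eigenvalues $\{-2, -1 \pm i\}$ and a $1 \times 1$ block carrying $3$.

```python
# Change of basis: the columns of Tinv are a basis of U1 followed by a basis of U2,
# so B = T A T^{-1} is the matrix of A relative to that basis.
Tinv = sp.Matrix.hstack(*(U1_basis + U2_basis))
T = Tinv.inv()
B = T * A * Tinv

# Off-diagonal blocks vanish because U1 and U2 are A-invariant.
d = len(U1_basis)                             # here d = 3
assert B[:d, d:] == sp.zeros(d, n - d)
assert B[d:, :d] == sp.zeros(n - d, d)

print(B[:d, :d].eigenvals())                  # {-2: 1, -1 + I: 1, -1 - I: 1}
print(B[d:, d:])                              # Matrix([[3]])
```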
