Linear Algebra – Computing Dimension of Vector Space of Commuting Matrices

linear-algebra, vector-spaces

This is part $2$ of the question that I am working on.

For part $1$, I showed that the space of $5\times 5$ matrices which commute with a given matrix $B$, over the ground field $\mathbb R$, is a vector space.

But how can I compute its dimension?

Thanks,

Best Answer

Assume $B$ is in Jordan normal form with real eigenvalues (the complex case is left as an exercise). Assume the block $J_i = \lambda_i I + S$ is $n_i \times n_i$ for $1\le i \le p$, where $I$ is the identity and $S$ is the shift matrix of suitable size. Then the space of matrices which commute with $B$ has dimension $$ \sum_{\substack{i,j=1 \\ \lambda_i = \lambda_j}}^p \min(n_i, n_j). $$
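For example, if the $5\times 5$ matrix $B$ consists of two Jordan blocks of sizes $n_1 = 3$ and $n_2 = 2$ with the same eigenvalue, the dimension is $\min(3,3) + \min(3,2) + \min(2,3) + \min(2,2) = 3 + 2 + 2 + 2 = 9$; if the two eigenvalues are distinct, only the pairs $(i,j) = (1,1)$ and $(2,2)$ contribute, and the dimension is $3 + 2 = 5$.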

(I hope I found all cases.)

Notes and Proof (outline):

For $k\in\mathbb N$ let $I_k$ denote the $k\times k$ identity matrix and let $S_k$ denote the $k\times k$ shift matrix.
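Explicitly, with the convention that the ones sit on the first superdiagonal (matching the Jordan blocks $\lambda_i I + S$ above), for example $$ S_3 = \begin{bmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \end{bmatrix}, \qquad S_3^2 = \begin{bmatrix} 0 & 0 & 1 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix}, \qquad S_3^3 = 0. $$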

  1. Notice that the operator $C_B(A) = AB - BA$ can be written as a matrix-vector product using the Kronecker product and the vectorization operator $\operatorname{vec}$: $$ \operatorname{vec} C_B(A) = ((B^T \otimes I) - (I \otimes B)) \operatorname{vec}(A). $$ So, for an explicitly given $n\times n$ matrix $B$, you can compute the rank of this $n^2\times n^2$ matrix directly; the dimension you want is $n^2$ minus that rank (a numerical sketch is given after these notes).

  2. Also notice that for $B = QJQ^{-1}$, where $Q$ is non-singular, we have $$ \ker C_B = Q\,(\ker C_J)\,Q^{-1}. $$ Since conjugation by $Q$ is a linear isomorphism, the two kernels have the same dimension, so we may assume w.l.o.g. that $B$ is in Jordan normal form.

  3. Let $A\in\mathbb R^{m\times n}$. In case $m \le n$, by induction, we obtain $$ S_m A = A S_n $$ if and only if $$ A = \begin{bmatrix} 0 & \bar A \end{bmatrix} $$ for some $$ \bar A \in \operatorname{span}\{ I, S_m, S_m^2, \dotsc, S_m^{m-1} \}. $$ Similarly, in case $m \ge n$, by induction, we obtain $$ S_m A = A S_n $$ if and only if $$ A = \begin{bmatrix} \bar A \\ 0 \end{bmatrix} $$ for some $$ \bar A \in \operatorname{span}\{ I, S_n, S_n^2, \dotsc, S_n^{n-1} \}. $$ In either case the solution space has dimension $\min(m,n)$ (a small worked example is given after these notes).

  4. For $B = \lambda I_n + S_n\in\mathbb R^{n\times n}$, where $\lambda\in\mathbb R$, we have $$ \ker C_B = \ker C_{S_n} = \operatorname{span}\{ I_n, S_n, S_n^2, \dotsc, S_n^{n-1} \}, $$ which has dimension $n$.

  5. Let $A = (A_{i,j})_{1\le i,j \le p}$ be a block matrix and $B = \operatorname{diag}(B_1,\dotsc, B_p)$ a block diagonal matrix w.r.t. the same partition, with $B_i = \lambda_i I_{n_i} + S_{n_i}$. Then we have $C_B(A) = 0$ if and only if $$ 0 = A_{i,j} B_j - B_i A_{i,j} = (\lambda_j - \lambda_i) A_{i,j} + (A_{i,j} S_{n_j} - S_{n_i} A_{i,j}) $$ for all $1 \le i, j \le p$. That is, if $A_{i,j}\ne 0$, then $-(\lambda_j - \lambda_i)$ is an eigenvalue of the operator $X\mapsto C_{S_{n_j}, S_{n_i}}(X) := XS_{n_j} - S_{n_i} X$. However, that operator has only the eigenvalue $0$: it is the difference of the two commuting nilpotent maps $X \mapsto X S_{n_j}$ and $X \mapsto S_{n_i} X$, hence nilpotent. Consequently $A_{i,j} = 0$ whenever $\lambda_i \ne \lambda_j$, while for $\lambda_i = \lambda_j$ note 3 shows that $A_{i,j}$ ranges over a space of dimension $\min(n_i, n_j)$. Summing over all pairs with $\lambda_i = \lambda_j$ gives the dimension formula.
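To make note 3 concrete, here is the smallest rectangular case (a worked example, using the superdiagonal convention for $S_k$ shown above): for $m = 3$, $n = 2$ the solutions of $S_3 A = A S_2$ are exactly $$ A = \begin{bmatrix} a & b \\ 0 & a \\ 0 & 0 \end{bmatrix} = \begin{bmatrix} \bar A \\ 0 \end{bmatrix}, \qquad \bar A = a I_2 + b S_2, \quad a, b \in \mathbb R, $$ so the solution space has dimension $2 = \min(3,2)$. This is exactly the summand $\min(n_i, n_j)$ contributed by a pair of blocks with the same eigenvalue.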
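Following note 1, here is a minimal numerical sketch (not part of the original answer; it assumes NumPy and estimates the rank with `np.linalg.matrix_rank`) that computes the dimension directly from the Kronecker-product representation and checks it against the formula above:

```python
# Minimal sketch of note 1: dim {A : AB = BA} = n^2 - rank(B^T ⊗ I - I ⊗ B).
import numpy as np


def commutant_dimension(B: np.ndarray, tol: float = 1e-9) -> int:
    """Dimension of the space of matrices A with AB = BA."""
    n = B.shape[0]
    I = np.eye(n)
    # vec(AB - BA) = (B^T ⊗ I - I ⊗ B) vec(A), so the commutant is the kernel of K.
    K = np.kron(B.T, I) - np.kron(I, B)
    return n * n - np.linalg.matrix_rank(K, tol=tol)


# Example: 5x5 matrix with Jordan blocks of sizes 3 and 2 for the same eigenvalue 2.
# The formula above predicts min(3,3) + min(3,2) + min(2,3) + min(2,2) = 9.
J3 = 2.0 * np.eye(3) + np.diag([1.0, 1.0], k=1)
J2 = 2.0 * np.eye(2) + np.diag([1.0], k=1)
B = np.block([[J3, np.zeros((3, 2))], [np.zeros((2, 3)), J2]])
print(commutant_dimension(B))  # prints 9
```

By note 2, conjugating $B$ by any non-singular $Q$ before calling the function should not change the result, which is an easy sanity check on the numerics.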