Can we reduce finding matrix roots to finding roots of Jordan blocks?

jordan-normal-form, linear-algebra, matrices, polynomials

I just came across an interesting question about matrix square roots, and it got me thinking of one way to find them, or at least to reduce the problem to a set of simpler ones.

Assume we have a matrix $\bf A$ that can be put into Jordan form:

$${\bf A} = {\bf SJS}^{-1}$$

where $\bf J$ is block-diagonal, consisting of the familiar Jordan blocks of the form:

$${\bf J_{k}} = \begin{bmatrix} \lambda_k&1&0&\cdots&0\\0&\ddots&\ddots& & 0\\\vdots&\ddots&\ddots&\ddots&\vdots\\0&\cdots&0&\lambda_k&1\\0&\cdots&\cdots&0&\lambda_k\end{bmatrix}$$

In other words, the main diagonal is filled with the eigenvalue $\lambda_k$ and the first superdiagonal is filled with ones.

The problem of finding an $n$th root of $\bf A$ can now be written as $${\bf SJ}^{1/n}{\bf S}^{-1},$$ since in the product $({\bf SJ}^{1/n}{\bf S}^{-1})^n = {\bf SJ}^{1/n}({\bf S}^{-1}{\bf S}){\bf J}^{1/n}\cdots{\bf J}^{1/n}{\bf S}^{-1}$ the inner factors ${\bf S}^{-1}{\bf S}$ cancel, leaving ${\bf SJS}^{-1} = {\bf A}$.

So, if I am correct so far, we have reduced the problem to finding roots of the individual Jordan blocks $\bf J_k$; in the simplest case, blocks of dimension 1, this is just finding a root of the eigenvalue itself over our scalar field.

  1. Firstly, is this reasoning correct so far?

  2. Secondly, how can we approach finding a square root of a matrix of the form $\bf J_k$? Is there some simplification or shortcut that can be used?
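As a quick numerical sanity check of the reduction above, here is a sketch for the diagonalizable special case (all Jordan blocks of size 1), where ${\bf J}^{1/2}$ just takes square roots of the eigenvalues; the matrix `A` below is an assumed example with positive real eigenvalues:

```python
import numpy as np

# Diagonalizable example: eigenvalues are 6 and 1, so every Jordan
# block is 1x1 and J^{1/2} is just the diagonal of sqrt(eigenvalues).
A = np.array([[5.0, 4.0],
              [1.0, 2.0]])

# Eigendecomposition A = S J S^{-1} (here J is diagonal).
eigvals, S = np.linalg.eig(A)
J_sqrt = np.diag(np.sqrt(eigvals))          # root of each 1x1 block
B = S @ J_sqrt @ np.linalg.inv(S)

# (S J^{1/2} S^{-1})^2 = S J^{1/2} (S^{-1} S) J^{1/2} S^{-1} = S J S^{-1} = A
assert np.allclose(B @ B, A)
```

The same recipe works for general $\bf J$ once we know how to take roots of the larger Jordan blocks, which is exactly question 2.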

Best Answer

Assume that the underlying field is algebraically closed and that we know the Jordan decomposition of $A$; thus we may assume that $A=diag(\lambda_1 I_{i_1}+J_1,\cdots,\lambda_k I_{i_k}+J_k)$ where $J_r$ is the nilpotent Jordan block of dimension $i_r$.

$\bullet$ The simple case is when $A$ is invertible and cyclic (that is, the eigenvalues $(\lambda_r)$ are distinct and non-zero); then $A$ admits exactly $2^k$ square roots: $diag(\pm L_1,\cdots,\pm L_k)$ where

$L_r=\sqrt{\lambda_r}\,(I_{i_r}+(1/\lambda_r) J_r)^{1/2}$, where the second factor is given by the Taylor expansion, with respect to $x$, of $(1+(1/\lambda_r) x)^{1/2}$; since $J_r$ is nilpotent, the series terminates.

$\bullet$ Otherwise there are supplementary solutions, or possibly no solutions at all when $A$ is singular.

When $A$ is not invertible, cf. my post in

sufficient and necessary conditions for matrix to have pth roots

When $A$ is not cyclic, consider the case $A=diag(I_2+J,I_2+J)$ (as a first step, find $C(A)$, the commutant of $A$).

EDIT. More precisely (for the above example), $C(A)$ is the vector space of dimension $8$ consisting of the matrices of the form $\begin{pmatrix}U_1&U_2\\U_3&U_4\end{pmatrix}$ where each $U_j$ is of the form $a_j I_2+b_j J$. A particular square root of $A$ is

$\begin{pmatrix}1/2&1&1&2/\sqrt{3}\\0&1/2&0&1\\3/4&-\sqrt{3}/2&-1/2&-1\\0&3/4&0&-1/2\end{pmatrix}$.
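One can check this claimed root directly; a quick numerical verification with NumPy:

```python
import numpy as np

s3 = np.sqrt(3.0)

# A = diag(I_2 + J, I_2 + J): two identical 2x2 Jordan blocks, eigenvalue 1.
A = np.array([[1.0, 1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 1.0],
              [0.0, 0.0, 0.0, 1.0]])

# The particular square root given above.
R = np.array([[0.5,      1.0,  1.0,  2.0 / s3],
              [0.0,      0.5,  0.0,  1.0     ],
              [0.75, -s3 / 2, -0.5, -1.0     ],
              [0.0,      0.75, 0.0, -0.5     ]])

assert np.allclose(R @ R, A)
```

Note that this root is not block-diagonal, illustrating how a non-cyclic $A$ admits square roots beyond the $diag(\pm L_1,\cdots,\pm L_k)$ family.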
