[Math] Existence of square root of a matrix

linear-algebra, matrices

While testing a method based on the Cayley–Hamilton theorem for finding square roots of real $ 2 \times 2 $ matrices, I noticed that some matrices apparently have no square root at all, even if complex entries are allowed.

An example is the matrix $A= \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix}$.
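This observation can be checked symbolically. The following sympy sketch (my own check, not part of the original method) writes out the four entry-wise polynomial equations of $B^2 = A$ and asks for solutions over the complex numbers:

```python
import sympy as sp

a, b, c, d = sp.symbols('a b c d')
B = sp.Matrix([[a, b], [c, d]])
A = sp.Matrix([[0, 1], [0, 0]])

# The four scalar equations a^2+bc = 0, b(a+d) = 1, c(a+d) = 0, bc+d^2 = 0.
equations = list(B * B - A)

# Solve over the complex numbers; an empty list means the system is inconsistent.
solutions = sp.solve(equations, [a, b, c, d], dict=True)
print(solutions)
```

An empty solution set confirms that no $2\times 2$ matrix with complex entries squares to $A$.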

However, is this really a proof that it is impossible? Could the field of entries be extended somehow so that the equation $B^2=A$ becomes solvable, much as, many years ago, $a^2=-1$ seemed impossible to solve over the real numbers until the imaginary unit $i$ was introduced?

Is it possible to devise such numbers (quaternions? octonions? something else?) that $B^2=A$ would be satisfied after all?

Additionally, in the general case of $n \times n$ matrices, when can we be sure that a square root exists if we are free to extend the field of entries vastly?

Best Answer

Two partial answers to your question make a full answer!

Let $A=\begin{bmatrix}0&1\\0&0\end{bmatrix}$.

  • Answer 1: The set of matrices forms a ring (a ring is a set equipped with the algebraic operations of addition and multiplication). In abstract algebra, one learns about ring extensions; in other words, you can construct a bigger ring which contains the ring of matrices and in which your given matrix $A$ has a square root. This is easier to do for a commutative ring, but matrix multiplication is not commutative.

    In this case, let $M_{2,\mathbb{R}}$ be the ring of $2\times 2$ matrices with coefficients in $\mathbb{R}$, adjoin a new element $B$, and consider sums of alternating products of matrices and $B$. In other words, you have sums whose terms look like $$ BM_1BM_2BM_3B\cdots M_kB $$ with or without the leading and trailing $B$'s. The one extra relation imposed is $B^2=A$.

    This gets complicated, but the resulting ring does contain a square root of $A$, namely $B$. The catch is that $B$ is not a matrix; it is just an extra element of the ring that behaves like a square root of $A$.
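A very concrete variation on the same idea (this is my own illustration, not the formal construction above) is to enlarge the ring by embedding the $2\times 2$ matrices into the $3\times 3$ matrices, padding with zeros; note this embedding does not preserve the identity matrix. The padded copy of $A$ does acquire a square root in the bigger ring, which numpy can verify:

```python
import numpy as np

# A = [[0,1],[0,0]] padded into the top-left corner of a 3x3 zero matrix.
A3 = np.zeros((3, 3), dtype=int)
A3[0, 1] = 1

# Candidate square root: the linear map e1 -> 0, e2 -> e3, e3 -> e1.
C = np.array([[0, 0, 1],
              [0, 0, 0],
              [0, 1, 0]])

# C applied twice sends e2 -> e3 -> e1 and kills e1 and e3, i.e. C @ C = A3.
print(np.array_equal(C @ C, A3))
```

So the obstruction really is specific to the $2\times 2$ ring itself, not to the matrix $A$ as an abstract operator.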

  • Answer 2: Suppose that $B$ must be a matrix, with entries in some field. Then we have the situation $$ \begin{bmatrix}0&1\\0&0\end{bmatrix}=\begin{bmatrix}a&b\\c&d\end{bmatrix}\begin{bmatrix}a&b\\c&d\end{bmatrix}=\begin{bmatrix}a^2+bc&ab+bd\\ac+cd&bc+d^2\end{bmatrix}. $$ Start with the lower left corner: we have $c(a+d)=0$, so either $c=0$ or $a=-d$.

    • First consider the case $c=0$. The right-hand side then simplifies to $$ \begin{bmatrix}a^2&ab+bd\\0&d^2\end{bmatrix}. $$ Since the upper left and lower right entries of $A$ are also $0$, we get $a^2=0$ and $d^2=0$, hence $a=0$ and $d=0$ (a field has no zero divisors). But then the upper right entry is $0$ as well, not $1$ — a contradiction.

    • Suppose now that $a=-d$. Then the upper right entry is $b(a+d)=0$, which cannot equal $1$ either.

    Therefore, whenever the entries come from a ring in which $xy=0$ implies $x=0$ or $y=0$ (and in which $0\not=1$), there is no way to write $A$ as the square of a matrix — in particular, over any field whatsoever.
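Since the argument is field-independent, it can be sanity-checked by brute force over a small finite field — $\mathbb{F}_5$ is my arbitrary choice here:

```python
# Brute-force search over F_5 for a 2x2 matrix B with B^2 = [[0,1],[0,0]].
p = 5
found = [
    (a, b, c, d)
    for a in range(p) for b in range(p)
    for c in range(p) for d in range(p)
    # Entries of B @ B reduced mod p, compared entry-wise with A.
    if ((a*a + b*c) % p, (a*b + b*d) % p,
        (c*a + d*c) % p, (c*b + d*d) % p) == (0, 1, 0, 0)
]
print(found)
```

The search comes up empty, as the case analysis above predicts.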

Concluding remark: If you work with matrices over a commutative ring that has zero divisors (so not an integral domain), then it may be possible to find a matrix which is a square root of $A$.