[Math] Non-integral powers of a matrix

complex-numbers, eigenvalues-eigenvectors, exponentiation, jordan-normal-form, linear-algebra

Question

Given a square complex matrix $A$, what ways are there to define and compute $A^p$ for non-integral scalar exponents $p\in\mathbb R$, and for what matrices do they work?

My thoughts

Integral exponents

Defining $A^k$ for $k\in\mathbb N$ is easy in terms of repeated multiplication, and works for every matrix. This includes $A^0=I$. Writing $A^{-1}$ for the inverse, $A^{-k}=\left(A^{-1}\right)^k$ is also easy to define, but requires the matrix to be invertible. So much for integral exponents.
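As a quick sanity check (my own illustration, not part of the original question), NumPy's `matrix_power` covers exactly these two cases: repeated multiplication for $k\ge0$, and powers of the inverse for $k<0$:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

A3  = np.linalg.matrix_power(A, 3)    # repeated multiplication, works for any square A
A0  = np.linalg.matrix_power(A, 0)    # A^0 = I
Am2 = np.linalg.matrix_power(A, -2)   # needs A to be invertible

# A^{-k} = (A^{-1})^k
print(np.allclose(Am2, np.linalg.matrix_power(np.linalg.inv(A), 2)))  # True
```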

Rational definition

I guess for a rational exponent, one could define

$$A^{\frac pq}=B\quad:\Leftrightarrow\quad A^p=B^q$$

This will allow for more than one solution, and I'm not sure whether the computations I'll describe below find all solutions of the above equation, so I'm not sure whether that's a reasonable definition. For non-rational exponents, a limit along a convergent sequence of rational exponents might work.
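A tiny example of the non-uniqueness (my own illustration): already for $A=I$ and $\frac pq=\frac12$ there is more than one matrix $B$ with $B^2=I$:

```python
import numpy as np

I = np.eye(2)
B1 = np.eye(2)                 # the obvious square root of I
B2 = np.array([[0.0, 1.0],
               [1.0, 0.0]])    # a reflection; it also squares to the identity
print(np.allclose(B1 @ B1, I), np.allclose(B2 @ B2, I))  # True True
```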

Diagonalizable computation

If $A$ is diagonalizable, then one has $A=W\,D\,W^{-1}$ for some diagonal matrix $D$. One can simply raise all the diagonal elements to the $p$-th power and set $A^p=W\,D^p\,W^{-1}$, which will satisfy the above equation. For each diagonal element, I'd define $\lambda^p=e^{p\ln\lambda}$, and since $\ln\lambda$ is only defined up to multiples of $2\pi i$, this allows for multiple possible solutions. If one requires $-\pi<\operatorname{Im}(\ln\lambda)\le\pi$, then the solution should be well defined, and I guess this choice even has a name, although I don't know it.
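A minimal sketch of this computation in NumPy (my own illustration; `np.log` on complex input already uses the branch with $-\pi<\operatorname{Im}(\ln\lambda)\le\pi$):

```python
import numpy as np

def fractional_power_diag(A, p):
    """A^p for diagonalizable A via A = W D W^{-1} and D^p on the diagonal."""
    eigvals, W = np.linalg.eig(A)
    D_p = np.diag(np.exp(p * np.log(eigvals.astype(complex))))  # lambda^p = e^{p ln lambda}
    return W @ D_p @ np.linalg.inv(W)

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
B = fractional_power_diag(A, 0.5)   # one particular square root of A
print(np.allclose(B @ B, A))        # True
```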

Non-diagonalizable computation

If $A$ is not diagonalizable, then there is still a Jordan normal form, so instead of raising diagonal elements to a fractional power, one could attempt to do the same with Jordan blocks. Unless I made a mistake, this appears to be possible. At least for my example of a $3\times3$ Jordan block, I was able to obtain a $k$-th root.

$$
\begin{pmatrix}
\lambda^{\frac1k} & \tfrac1k\lambda^{\frac1k-1} & \tfrac{1-k}{2k^2}\lambda^{\frac1k-2} \\
0 & \lambda^{\frac1k} & \tfrac1k\lambda^{\frac1k-1} \\
0 & 0 & \lambda^{\frac1k}
\end{pmatrix}^k
=
\begin{pmatrix}
\lambda & 1 & 0 \\
0 & \lambda & 1 \\
0 & 0 & \lambda
\end{pmatrix}
$$

If the eigenvalue $\lambda$ of this block is zero, the formula above breaks down, since it involves negative powers of $\lambda$, so this approach does not produce a $k$-th root in that case (indeed, a nilpotent Jordan block of size greater than one has no $k$-th root at all). But otherwise it should work.
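A numerical check of the $3\times3$ formula above (my own sketch, with $\lambda=2$ and $k=3$; as noted, it fails for $\lambda=0$ because of the negative powers of $\lambda$):

```python
import numpy as np

lam, k = 2.0, 3

# Candidate k-th root of the 3x3 Jordan block, entries as in the formula above
R = np.array([
    [lam**(1/k), lam**(1/k - 1)/k, (1 - k)/(2*k**2) * lam**(1/k - 2)],
    [0.0,        lam**(1/k),       lam**(1/k - 1)/k],
    [0.0,        0.0,              lam**(1/k)],
])

J = np.array([[lam, 1.0, 0.0],
              [0.0, lam, 1.0],
              [0.0, 0.0, lam]])

print(np.allclose(np.linalg.matrix_power(R, k), J))  # True
```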

Conclusion

Edited since this question was first asked.

So it seems that every invertible matrix can be raised to every rational power, as long as one does not insist on uniqueness. A non-invertible matrix apparently can be raised to non-negative powers as long as all Jordan blocks for the eigenvalue zero have size one.

Is this true? If not, where is my mistake? If it is, is there a good reference for this?

Best Answer

As @tom pointed out in a comment, the power of a matrix can be defined in terms of the matrix logarithm and the matrix exponential, using

$$A^p:=\exp\left(p\ln A\right)$$

Using the principal logarithm (this is the name for the branch choice described, but not named, in the question), the above even yields a unique result.

The matrix exponential is defined for every matrix, but the matrix logarithm only for invertible matrices. The case of singular matrices mentioned in the question is therefore not covered by this definition.
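For completeness, SciPy implements both ingredients, and also a direct fractional power; for a well-behaved example like this one the two routes agree (my own sketch, not part of the original answer):

```python
import numpy as np
from scipy.linalg import expm, logm, fractional_matrix_power

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
p = 0.5

Ap = expm(p * logm(A))                     # A^p := exp(p ln A), principal logarithm
Ap_direct = fractional_matrix_power(A, p)  # SciPy's built-in fractional power

print(np.allclose(Ap, Ap_direct))  # True
print(np.allclose(Ap @ Ap, A))     # True, since p = 1/2
```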
