‘Higher order’ complex numbers

Tags: complex-numbers, hypercomplex-numbers, soft-question

I recently learned that complex numbers can also be represented as matrices of the form: $$\begin{pmatrix} x & -y \\ y & x\end{pmatrix}$$ where complex multiplication corresponds to matrix multiplication and conjugation corresponds to transposition. I thought it was interesting how this representation corresponds to the $2 \times 2$ unit rotation matrix:
$$\begin{pmatrix} \cos \theta & -\sin\theta \\ \sin \theta & \cos \theta \end{pmatrix}$$
I was wondering if, then, there is a higher order version of the complex numbers? e.g. numbers that correspond to the $3 \times 3$ unit rotation matrix:
$$\begin{pmatrix} \sin \theta \cos \phi & \sin \theta \sin \phi & \cos \theta \\ \cos \theta \cos \phi & \cos \theta \sin \phi & -\sin \theta \\ -\sin \phi & \cos \phi & 0\end{pmatrix}$$ etc…
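The claimed correspondence in the $2 \times 2$ case is easy to check numerically; here is a quick sketch (the helper name `as_matrix` is mine, and the test values are arbitrary):

```python
import numpy as np

def as_matrix(z):
    """Represent the complex number z = x + iy as the real matrix [[x, -y], [y, x]]."""
    return np.array([[z.real, -z.imag], [z.imag, z.real]])

z, w = 2 + 3j, -1 + 4j  # arbitrary complex numbers

# Complex multiplication corresponds to matrix multiplication...
assert np.allclose(as_matrix(z * w), as_matrix(z) @ as_matrix(w))

# ...and conjugation corresponds to transposition.
assert np.allclose(as_matrix(z.conjugate()), as_matrix(z).T)
```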

Best Answer

It's a good question, albeit one that has long since been pondered.

In short: there isn't, at least not one that retains many of the properties we would want from such a system.


To clarify, let's first note that the next step up is the quaternions, which are four-dimensional rather than three- or two-dimensional. They have the natural representation as numbers of the form

$$a + b i + cj + dk$$

where

$$i^2 = j^2 = k^2 = ijk = -1$$

We can represent these as matrices, either in $M_{2\times 2}(\mathbb{C})$ or in $M_{4 \times 4}(\mathbb{R})$ (see the Wikipedia article on quaternions):

$$\begin{bmatrix}a+bi & c+di \\ -c + d i & a - b i \end{bmatrix}$$

for the complex case, and one possible representation over $\mathbb{R}$ (non-unique; there are $48$ in total) by

$$\begin{align*} &\begin{bmatrix} a & -b & -c & -d \\ b & a & -d & c \\ c & d & a & -b \\ d & -c & b & a \end{bmatrix}\\ &= a \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix} + b \begin{bmatrix} 0 & -1 & 0 & 0 \\ 1 & 0 & 0 & 0 \\ 0 & 0 & 0 & -1 \\ 0 & 0 & 1 & 0 \end{bmatrix} + c \begin{bmatrix} 0 & 0 & -1 & 0 \\ 0 & 0 & 0 & 1 \\ 1 & 0 & 0 & 0 \\ 0 & -1 & 0 & 0 \end{bmatrix} + d \begin{bmatrix} 0 & 0 & 0 & -1 \\ 0 & 0 & -1 & 0 \\ 0 & 1 & 0 & 0 \\ 1 & 0 & 0 & 0 \end{bmatrix} \end{align*}$$
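One can verify directly that the four basis matrices of this real representation satisfy Hamilton's defining relations. A quick numeric sketch (matrix names are mine):

```python
import numpy as np

# The four basis matrices of the real representation above,
# standing for 1, i, j, k respectively.
ONE = np.eye(4)
I = np.array([[0, -1, 0, 0], [1, 0, 0, 0], [0, 0, 0, -1], [0, 0, 1, 0]])
J = np.array([[0, 0, -1, 0], [0, 0, 0, 1], [1, 0, 0, 0], [0, -1, 0, 0]])
K = np.array([[0, 0, 0, -1], [0, 0, -1, 0], [0, 1, 0, 0], [1, 0, 0, 0]])

# Hamilton's relations: i^2 = j^2 = k^2 = ijk = -1.
for M in (I @ I, J @ J, K @ K, I @ J @ K):
    assert np.array_equal(M, -ONE)

# The cyclic products also come out as expected, e.g. ij = k.
assert np.array_equal(I @ J, K)
```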


Loosely speaking, Hamilton (the discoverer of the quaternions) sought a way to multiply and divide three-dimensional numbers, in the same sense that we can do this for two-dimensional numbers.

More specifically: in $\mathbb{C}$, we can naturally add and multiply numbers in the usual way, and even subtract and divide them. The complex numbers form what we call a "field" owing to these properties; another, more relevant, term is "associative division algebra".
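The "division" part can be seen concretely in the matrix picture: the matrix of a nonzero complex number has determinant $x^2 + y^2 = |z|^2 > 0$, so it is always invertible, and its inverse is the matrix of $1/z$. A small sketch (helper name `as_matrix` and the test value are mine):

```python
import numpy as np

def as_matrix(z):
    """The real matrix [[x, -y], [y, x]] representing z = x + iy."""
    return np.array([[z.real, -z.imag], [z.imag, z.real]])

z = 3 - 2j  # any nonzero complex number

# det = x^2 + y^2 = |z|^2 > 0, so the matrix is invertible...
assert np.isclose(np.linalg.det(as_matrix(z)), abs(z) ** 2)

# ...and its inverse is the matrix of 1/z, so division always works.
assert np.allclose(np.linalg.inv(as_matrix(z)), as_matrix(1 / z))
```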

Hamilton wondered if we could extend these notions to a three-dimensional version.

In his attempts, however, issues arose in trying to make division (that is, multiplication by a multiplicative inverse) work for all nonzero elements of the set. As it turns out, this is provably impossible: there is no three-dimensional associative division algebra over $\mathbb{R}$ (Frobenius' theorem).


Granted, your question seems a little more grounded in the language of matrices. Let's notice something: if we let $x = \cos \theta$ and $y = \sin \theta$, then your two-dimensional matrix becomes exactly the rotation matrix. In other words, the $2 \times 2$ rotation matrix is the matrix form of the unit complex number $\cos \theta + i \sin \theta$.
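A quick numeric check of this identification, for an arbitrary angle:

```python
import numpy as np

theta = 0.7  # arbitrary angle

# Matrix form [[x, -y], [y, x]] of the unit complex number
# x + iy = e^{i theta} = cos(theta) + i sin(theta).
z = np.exp(1j * theta)
matrix_of_z = np.array([[z.real, -z.imag], [z.imag, z.real]])

# The 2x2 rotation matrix for the same angle.
rotation = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

assert np.allclose(matrix_of_z, rotation)
```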

Trying to do the same for the three-dimensional matrix, you should notice that we already need four variables, not three, in a similar sense.
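And indeed, four parameters is exactly what works: a unit quaternion $a + bi + cj + dk$ encodes a $3 \times 3$ rotation via the standard quaternion-to-rotation formula. A sketch of that correspondence (the function name is mine; this uses the usual Hamilton convention):

```python
import numpy as np

def quat_to_rotation(w, x, y, z):
    """3x3 rotation matrix of the unit quaternion w + xi + yj + zk
    (standard Hamilton convention)."""
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

# Example: rotation by angle t about the z-axis corresponds to the
# unit quaternion cos(t/2) + sin(t/2) k.
t = 0.9
R = quat_to_rotation(np.cos(t / 2), 0.0, 0.0, np.sin(t / 2))

expected = np.array([[np.cos(t), -np.sin(t), 0],
                     [np.sin(t),  np.cos(t), 0],
                     [0,          0,         1]])
assert np.allclose(R, expected)

# Rotation matrices are orthogonal with determinant 1.
assert np.allclose(R @ R.T, np.eye(3))
assert np.isclose(np.linalg.det(R), 1.0)
```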

And, of course, if we wish to keep this grounded in matrices rather than in the language of division algebras... well, I don't have a conclusive answer to offer you there.

What I can tell you is that the system of numbers to which such matrices would naturally correspond would be, in some way or another... well, janky. It would be missing properties we typically like to have.

Whether you like to have those properties is, of course, entirely up to you. There's nothing stopping such a thing from being useful or interesting after all, in the right context.
