[Math] Decompose rotation matrix to plane of rotation and angle

euclidean-geometry, geometry, linear-algebra, matrices, rotations

I would like to decompose an $n$-dimensional orthogonal rotation matrix (restricted to a simple rotation, i.e., one with a single plane of rotation) into the two basis vectors of its plane of rotation and an angle of rotation.

The common method is decomposing the rotation matrix into an axis and an angle, but this doesn't generalize to higher dimensions.

For example in $\mathbb{R}^3$ given the rotation matrix

$R_{xy}=\begin{bmatrix}\cos\theta & -\sin\theta & 0 \\ \sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{bmatrix}$

it's obvious that the plane of rotation is the $xy$-plane spanned by basis vectors $b_0 = (1,0,0)$ and $b_1=(0,1,0)$ and the angle of rotation is $\theta$.
However, decomposing a general rotation matrix mathematically is rather challenging.

What is the solution for a general (restricted to a single, but arbitrary plane) rotation matrix in $\mathbb{R}^n$?

Best Answer

Given an oriented 2D subspace $\mathsf{\Pi}$ of a real inner product space $V$ and any angle $\theta$, there exists a rotation $R(\mathsf{\Pi},\theta)$ which acts as a rotation by $\theta$ when restricted to $\mathsf{\Pi}$ and acts as the identity map when restricted to the orthogonal complement $\mathsf{\Pi}^\perp$. Since $V=\mathsf{\Pi}\oplus\mathsf{\Pi}^\perp$ is an (orthogonal) direct sum, every vector is (uniquely) expressible as a sum of a vector in $\mathsf{\Pi}$ and a vector in $\mathsf{\Pi}^\perp$, and by linearity this definition lets us apply $R(\mathsf{\Pi},\theta)$ to any vector. Picking any two orthogonal unit vectors within $\mathsf{\Pi}$ compatible with the orientation and combining them with any basis for $\mathsf{\Pi}^\perp$ yields a basis for $V$ with respect to which $R(\mathsf{\Pi},\theta)$ is block diagonal, with the usual $2\times 2$ rotation matrix as one block and the identity matrix of the appropriate dimension as the other block.
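As a concrete illustration, here is a minimal numpy sketch of this construction (the function name `plane_rotation` and its arguments are my own); it is just the block-diagonal description above written basis-free, as $R = I + (\cos\theta - 1)(b_0 b_0^T + b_1 b_1^T) + \sin\theta\,(b_1 b_0^T - b_0 b_1^T)$:

```python
import numpy as np

def plane_rotation(b0, b1, theta):
    """Rotation by theta in the oriented plane spanned by the orthonormal
    vectors b0, b1; acts as the identity on the orthogonal complement."""
    b0, b1 = np.asarray(b0, float), np.asarray(b1, float)
    P = np.outer(b0, b0) + np.outer(b1, b1)   # orthogonal projector onto the plane
    A = np.outer(b1, b0) - np.outer(b0, b1)   # quarter-turn within the plane
    return np.eye(b0.size) + (np.cos(theta) - 1.0) * P + np.sin(theta) * A

# Reproduces the R_xy matrix from the question:
R = plane_rotation([1, 0, 0], [0, 1, 0], 0.3)
```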

These are called plane rotations. In 3D we usually think of rotations as happening around a rotation axis; however, this kind of thinking doesn't generalize to higher dimensions, whereas the plane-of-rotation idea does. Indeed, any rotation $R\in\mathrm{SO}(V)$ is expressible as

$$R=\prod_i R(\mathsf{\Pi}_i,\theta_i) $$

for some oriented, two-dimensional, mutually orthogonal subspaces $\mathsf{\Pi}_1,\cdots,\mathsf{\Pi}_\ell$ and angles $\theta_1,\cdots,\theta_\ell$. (Obviously $\ell\le(\dim V)/2$.) As the $\mathsf{\Pi}$s are orthogonal to each other, the factors in the above product all commute, which is why no order needs to be specified in the product.
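Computationally, this decomposition can be read off from the real Schur form: for an orthogonal matrix the Schur factor is block diagonal, with one $2\times 2$ rotation block per plane. Here is a sketch using scipy (the function `plane_decomposition` and its return convention are my own choices, not a standard API):

```python
import numpy as np
from scipy.linalg import schur

def plane_decomposition(R, tol=1e-10):
    """Split a rotation matrix into (b0, b1, theta) triples via the real
    Schur form R = Z @ T @ Z.T.  For orthogonal R, T is block diagonal:
    one 2x2 rotation block per plane, plus 1x1 blocks of +1 on the fixed
    subspace.  (Eigenvalue -1 pairs, i.e. rotations by pi, show up as two
    1x1 blocks of -1 and would need to be paired up by hand.)"""
    T, Z = schur(R, output='real')
    planes, i, n = [], 0, R.shape[0]
    while i < n:
        if i + 1 < n and abs(T[i + 1, i]) > tol:   # a genuine 2x2 block
            theta = np.arctan2(T[i + 1, i], T[i, i])
            planes.append((Z[:, i], Z[:, i + 1], theta))
            i += 2
        else:                                      # 1x1 block: fixed direction
            i += 1
    return planes
```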

Is the set of planes $\{\mathsf{\Pi}_1,\cdots,\mathsf{\Pi}_\ell\}$ an invariant of $R$? Not necessarily. For instance, consider multiplication by $i$ on $\mathbb{C}^2$. Any complex one-dimensional subspace of $\mathbb{C}^2$ (there is a $\mathbb{CP}^1$'s worth of them) is a real two-dimensional stable subspace. However, it turns out that if the angles $\theta_1,\cdots,\theta_\ell$ are all distinct mod $2\pi$ up to sign, then $\{\mathsf{\Pi}_1,\cdots,\mathsf{\Pi}_\ell\}$ is an invariant.
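To see this non-uniqueness concretely, here is a small numerical check (my own illustration, assuming the identification $\mathbb{C}^2\cong\mathbb{R}^4$ via coordinates $(\operatorname{Re}z_1,\operatorname{Im}z_1,\operatorname{Re}z_2,\operatorname{Im}z_2)$):

```python
import numpy as np

# Multiplication by i on C^2, written on R^4: two 90-degree plane rotations.
J = np.array([[0., -1., 0.,  0.],
              [1.,  0., 0.,  0.],
              [0.,  0., 0., -1.],
              [0.,  0., 1.,  0.]])

# The "obvious" planes are span(e1, e2) and span(e3, e4), but the real plane
# coming from the complex line through (1, 1) is stable too:
u = np.array([1., 0., 1., 0.]) / np.sqrt(2)
v = np.array([0., 1., 0., 1.]) / np.sqrt(2)
print(np.allclose(J @ u, v), np.allclose(J @ v, -u))  # True True
```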

Indeed, notice that $R^{-1}$ acts the same way but with opposite angles. Restricted to $\mathsf{\Pi}_i$, $R$ is a rotation by $\theta_i$ and $R^{-1}$ a rotation by $-\theta_i$, and these two rotation matrices sum to $2\cos(\theta_i)$ times the identity; hence $R+R^{-1}$ acts as the scalar $2\cos(\theta_i)$ on $\mathsf{\Pi}_i$. Therefore, $\mathsf{\Pi}_i$ is precisely the $2\cos(\theta_i)$-eigenspace of $R+R^{-1}$. This may not be computationally practical; perhaps more useful for finding the $\theta$-associated stable subspace would be finding the span of the $e^{i\theta}$ and $e^{-i\theta}$ eigenspaces of the complexification $V\otimes_{\mathbb{R}}\mathbb{C}$ and intersecting with $V$. (I don't really think about linear algebra from the practical side though, so this may be unhelpful.)
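For the asker's simple-rotation case this eigenspace idea is quite workable numerically, since $R^{-1}=R^T$ for a rotation matrix and $R+R^T$ is symmetric. A minimal sketch (the function name and conventions are mine): the plane is the eigenspace of $R+R^T$ for the eigenvalue $2\cos\theta\ne 2$, and an orientation is fixed by requiring $R\,b_0 = \cos\theta\, b_0 + \sin\theta\, b_1$.

```python
import numpy as np

def simple_rotation_to_plane_angle(R, tol=1e-10):
    """Recover (b0, b1, theta) from a simple rotation R (a single plane
    of rotation) via the 2*cos(theta)-eigenspace of R + R^{-1} = R + R.T."""
    S = R + R.T                                  # symmetric, so eigh applies
    w, V = np.linalg.eigh(S)
    idx = np.where(np.abs(w - 2.0) > tol)[0]     # eigenvalue 2 <-> fixed subspace
    if idx.size == 0:
        raise ValueError("R is the identity: no plane of rotation")
    theta = np.arccos(np.clip(w[idx[0]] / 2.0, -1.0, 1.0))  # |theta| in (0, pi]
    b0 = V[:, idx[0]]                            # any unit vector in the plane
    if np.isclose(theta, np.pi):
        b1 = V[:, idx[1]]    # theta = pi: the orientation is genuinely ambiguous
    else:
        # Orient the plane so the angle comes out as +theta:
        b1 = (R @ b0 - np.cos(theta) * b0) / np.sin(theta)
    return b0, b1, theta
```

Applied to the $R_{xy}$ example from the question, this recovers the $xy$-plane and $\theta$ (up to the choice of $b_0$ within the plane and the corresponding orientation).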
