[Math] Show any orthogonal matrix is similar to an almost diagonal matrix, with either $\pm 1$ or a 2D rotation on the diagonal

linear-algebra, matrices, orthogonal-matrices, orthogonality

Let $A \in O(n).$ Show that $A$ is similar to a matrix which consists of $2 \times 2$ blocks down the diagonal of the form
$$ \begin{pmatrix} \cos{\theta} & \sin{\theta}\\-\sin{\theta} & \cos{\theta} \end{pmatrix},$$
along with some diagonal elements which are $+1$ and $-1.$

For example, maybe $A \in O(5)$ is similar to,
$$\begin{pmatrix}
1 & 0 & 0 & 0 & 0\\
0 & \cos{\theta} & \sin{\theta} & 0 & 0\\
0 & -\sin{\theta} & \cos{\theta} & 0 & 0\\
0 & 0 & 0 & 1 & 0\\
0 & 0 & 0 & 0 & -1
\end{pmatrix}.$$
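To make the claim concrete, here is a small numerical sketch (assuming NumPy; the angle, seed, and variable names are illustrative choices, not part of the problem): conjugating the $5 \times 5$ block matrix above by a random orthogonal change of basis produces an orthogonal matrix similar to it, with all eigenvalues on the unit circle.

```python
import numpy as np

theta = 0.7  # arbitrary angle, chosen for illustration

# The 5x5 block form from the example above.
B = np.eye(5)
B[1:3, 1:3] = [[np.cos(theta),  np.sin(theta)],
               [-np.sin(theta), np.cos(theta)]]
B[4, 4] = -1.0

# Conjugate by a random orthogonal Q (QR of a Gaussian matrix)
# to get an orthogonal matrix A similar to B.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))
A = Q @ B @ Q.T

print(np.allclose(A @ A.T, np.eye(5)))  # A is orthogonal
print(np.allclose(np.abs(np.linalg.eigvals(A)), np.ones(5)))  # |eigenvalues| = 1
```

Since $Q$ is orthogonal, $Q B Q^{T} = Q B Q^{-1}$, so this is an orthogonal similarity; the spectrum of $A$ is $\{1, 1, -1, e^{\pm i\theta}\}$, matching the block form.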

This makes intuitive sense to me, though I am not sure how I'd go about the proof. I've been trying an invariant-subspace approach: if $W$ is an $A$-invariant subspace of $\mathbb{R}^n$, then its orthogonal complement is $A$-invariant too. But that hasn't gotten me much closer. Any hints/solutions?

Best Answer

I think it is more comfortable to work with operators instead of matrices, since the induction requires restricting the associated map to an invariant subspace.

Let $(V,\left< \cdot, \cdot \right>)$ be a real finite-dimensional inner product space and let $T \colon V \rightarrow V$ be an orthogonal operator. As you suggested, show first that if $W \leq V$ is $T$-invariant, then so is $W^{\perp}$, and $T|_{W^{\perp}}$ (with the induced inner product) is also orthogonal.

Since the characteristic polynomial of $T$ splits over $\mathbb{R}$ into linear and irreducible quadratic factors, $T$ has either a one-dimensional or a two-dimensional invariant subspace. By induction, $V$ has an orthogonal direct sum decomposition $V = V_1 \oplus \dots \oplus V_k \oplus W_1 \oplus \dots \oplus W_l$, where each $V_i$ is a one-dimensional $T$-invariant subspace and each $W_j$ is a two-dimensional $T$-invariant subspace.

Each $V_i$ is spanned by an eigenvector of $T$, and since $T$ is orthogonal, the corresponding eigenvalue is $\pm 1$. Each $T|_{W_j}$ is a two-dimensional orthogonal transformation, so it is enough to show that every two-dimensional orthogonal transformation can be represented, with respect to a suitable orthonormal basis, by a matrix of one of the forms

$$ \begin{pmatrix} \cos \theta & \sin \theta \\ -\sin \theta & \cos \theta \end{pmatrix}, \begin{pmatrix} -1 & 0 \\ 0 & 1 \end{pmatrix}. $$
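The induction step can be checked numerically (assuming NumPy; the test matrix and names are illustrative): for an orthogonal $A$ with a non-real eigenvalue $e^{i\theta}$, the real and imaginary parts of the corresponding eigenvector span a two-dimensional $A$-invariant subspace, and $A$ restricted to that plane is a rotation by $\theta$.

```python
import numpy as np

theta = 1.1  # arbitrary angle for the test matrix
# Orthogonal test matrix: a rotation block and a -1, conjugated
# by a random orthogonal Q to hide the block structure.
B = np.diag([1.0, 1.0, -1.0])
B[:2, :2] = [[np.cos(theta),  np.sin(theta)],
             [-np.sin(theta), np.cos(theta)]]
rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
A = Q @ B @ Q.T

# Pick a non-real eigenvalue; Re(v) and Im(v) span a 2D A-invariant plane.
vals, vecs = np.linalg.eig(A)
k = np.argmax(np.abs(vals.imag))
v = vecs[:, k]
U, _ = np.linalg.qr(np.column_stack([v.real, v.imag]))  # orthonormal basis of the plane

M = U.T @ A @ U  # matrix of A restricted to the invariant plane
print(np.allclose(M @ M.T, np.eye(2)))    # M is orthogonal
print(np.isclose(np.linalg.det(M), 1.0))  # with det 1, i.e. a rotation
print(np.isclose(M[0, 0], np.cos(theta))) # diagonal entries are cos(theta)
```

The check on $M_{11}$ works because a $2 \times 2$ orthogonal matrix with determinant $1$ has the rotation form above, and the trace of the restriction, $2\cos\theta$, is basis-independent.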

Assume $T \colon V \rightarrow V$ is orthogonal and $\dim V = 2$. Let $\mathcal{B} = (v_1,v_2)$ be an orthonormal basis of $V$. Then we have

$$ [T]_{\mathcal{B}} = \begin{pmatrix} \left< Tv_1, v_1 \right> & \left< Tv_2, v_1 \right> \\ \left <Tv_1, v_2 \right> & \left <Tv_2, v_2 \right> \end{pmatrix}. $$

Since $T$ is orthogonal, we have

$$ 1 = ||v_2||^2 = ||Tv_2||^2 = \left< Tv_2, v_1 \right>^2 + \left< Tv_2, v_2 \right>^2 $$

and so we can write $\left< Tv_2, v_2 \right> = \cos \theta$, $\left< Tv_2, v_1 \right> = \sin \theta$ for a unique $\theta \in [0,2\pi)$. Since the first column is a unit vector orthogonal to the second column (with respect to the standard inner product), the matrix has one of the forms

$$ \begin{pmatrix} \cos \theta & \sin \theta \\ - \sin \theta & \cos \theta \end{pmatrix}, \begin{pmatrix} -\cos \theta & \sin \theta \\ \sin \theta & \cos \theta \end{pmatrix} $$

(depending on whether $\det T = 1$ or $\det T = -1$). In the second case, the characteristic polynomial of the operator is $x^2 - 1 = (x - 1)(x + 1)$, so $T$ is diagonalizable with eigenvalues $1, -1$, and the associated eigenvectors are orthogonal (the matrix above is symmetric). With respect to an orthonormal basis of eigenvectors, $T$ is then represented by $\operatorname{diag}(-1, 1)$, which completes the proof.
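A quick numerical check of the second case (assuming NumPy; the angle is an arbitrary illustrative choice): the $\det T = -1$ matrix is symmetric, so `eigh` returns real eigenvalues with an orthonormal eigenbasis, and the eigenvalues come out as $-1$ and $1$ independently of $\theta$.

```python
import numpy as np

theta = 0.4  # arbitrary angle for illustration
M = np.array([[-np.cos(theta), np.sin(theta)],
              [ np.sin(theta), np.cos(theta)]])  # the det = -1 case

# M is symmetric, so eigh gives real eigenvalues (in ascending order)
# and an orthonormal matrix of eigenvectors.
vals, vecs = np.linalg.eigh(M)
print(np.allclose(vals, [-1.0, 1.0]))         # eigenvalues are -1 and 1
print(np.allclose(vecs.T @ vecs, np.eye(2)))  # eigenvectors are orthonormal
```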