[Math] Orthogonal matrices, their determinant and eigenvalues

eigenvalues-eigenvectors, linear-algebra, proof-verification

Let

$$A = \begin{pmatrix} \cos(a) & -\sin(a) & 0 \\ \sin(a) & \cos(a) & 0 \\ 0 & 0 & 1 \end{pmatrix},\\
B = \begin{pmatrix} \cos(b) & 0 & -\sin(b) \\ 0 & 1 & 0 \\ \sin(b) & 0 & \cos(b) \end{pmatrix}.$$

The task: find the eigenvalues of $A$, $B$, $AB$, and $BA$.

How I want to do this:

(1) I calculated the eigenvalues of both $A$ and $B$ by solving $\det(A - \lambda E_n) = 0$.

If I have done it correctly, then $\operatorname{eig}(A)=\operatorname{eig}(B)= \{-1,1\}$.

(2) Both $A$ and $B$ are orthogonal, so

$A \cdot A^t = I = B \cdot B^t $.

Thus
$(AB)\cdot(AB)^t = A\cdot B \cdot B^t \cdot A^t = A \cdot I \cdot A^t = I$.

Thus $AB$ is orthogonal, and the same argument shows that $BA$ is orthogonal as well.
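This can also be confirmed symbolically, for example with SymPy (a quick sketch, separate from the hand calculation; the symbols and matrix names match the ones above):

```python
import sympy as sp

a, b = sp.symbols('a b', real=True)

# The matrices A and B from the question.
A = sp.Matrix([[sp.cos(a), -sp.sin(a), 0],
               [sp.sin(a),  sp.cos(a), 0],
               [0,          0,         1]])
B = sp.Matrix([[sp.cos(b), 0, -sp.sin(b)],
               [0,         1,  0],
               [sp.sin(b), 0,  sp.cos(b)]])

I = sp.eye(3)
# Both differences simplify to the zero matrix, so A and AB are orthogonal;
# the same computation with B*A in place of A*B works for BA.
print(sp.simplify(A * A.T - I))
print(sp.simplify((A * B) * (A * B).T - I))
```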

(3)
Furthermore, for any orthogonal matrix $R$ we have
$1 = \det(I) = \det(R\cdot R^t) = \det(R) \cdot \det(R^t) = \det(R)^2$, which gives $\det(R) = \pm 1$.

(4) Finally:
$\det(A) = \det(B) = \det(AB) = \det(BA) = \pm 1$ so all matrices $A$, $B$, $AB$ and $BA$ have eigenvalues $-1$, $1$.

Best Answer

You can do this two ways.

The first way is to compute the eigenvalues directly, by factoring the characteristic polynomial. As Dustan points out in his comment, you have made some error while doing this, as you should not get $-1, 1$ as two of your eigenvalues.

The second way is to observe some facts about orthogonal matrices $R$. Perhaps it's no easier than performing those $3\times 3$ determinants and factoring the cubics, but to me it still has more appeal.

  1. $\det(R) = \pm 1$, since $1 = \det(I) = \det(R^TR) = \det(R)^2$, and in fact $\det(A)=\det(B)=1$ for your matrices $A,B$. (Facts 1, 2 and 6 are illustrated numerically in the sketch after this list.)

  2. All eigenvalues of $R$ have modulus (complex absolute value) $1$. This is because for any eigenvector $v$ of $R$ with eigenvalue $\lambda$, $$|\lambda|\,\|v\| = \|\lambda v\| = \|Rv\| = \sqrt{v^*R^*Rv} = \sqrt{v^*R^TRv} = \sqrt{v^*v} = \|v\|.$$

  3. $R$ has a full set of orthogonal eigenvectors, since $R$ is normal.

  4. If $R$ has all real eigenvalues, $R$ is symmetric; equivalently if $R$ is not symmetric, $R$ must have some complex eigenvalues.

  5. Complex eigenvalues of real matrices must come in complex conjugate pairs.

  6. If $R$ is a rotation in $\mathbb{R}^3$ (i.e. $\det(R) = 1$), it has at least one eigenvector with eigenvalue 1: this is because every rotation matrix in 3D fixes one axis. It also follows algebraically from facts 1, 2, and 5.

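As promised, here is a small numerical illustration of facts 1, 2 and 6 for the matrices in the question (a sketch using NumPy with arbitrary sample angles; the helper names `rot_z` and `rot_y` are introduced only for this check):

```python
import numpy as np

def rot_z(a):
    # The matrix A from the question: rotation about the z-axis.
    return np.array([[np.cos(a), -np.sin(a), 0.0],
                     [np.sin(a),  np.cos(a), 0.0],
                     [0.0,        0.0,       1.0]])

def rot_y(b):
    # The matrix B from the question: rotation about the y-axis.
    return np.array([[np.cos(b), 0.0, -np.sin(b)],
                     [0.0,       1.0,  0.0],
                     [np.sin(b), 0.0,  np.cos(b)]])

a, b = 0.7, 1.3  # arbitrary sample angles
A, B = rot_z(a), rot_y(b)

for name, M in [("A", A), ("B", B), ("AB", A @ B), ("BA", B @ A)]:
    lam = np.linalg.eigvals(M)
    print(name,
          "det:", round(np.linalg.det(M), 6),                     # fact 1: det = 1
          "| moduli:", np.round(np.abs(lam), 6),                  # fact 2: all |lambda| = 1
          "| has eigenvalue 1:", bool(np.isclose(lam, 1).any()))  # fact 6
```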

So now for each of your matrices, you know one eigenvalue ($1$) and need to find the other two. None of $A$, $B$, $AB$, or $BA$ is symmetric, so the remaining two eigenvalues form a complex-conjugate pair $u\pm vi$ with modulus 1, i.e. $v = \sqrt{1-u^2}.$

The trace of each matrix $M$ must equal $1+(u+vi)+(u-vi) = 1+2u$, so $u = \frac{\operatorname{tr}(M)-1}{2}$. For each of your matrices, compute the trace, then $u$, then $v$, and you have all three eigenvalues. (Note that $\operatorname{tr}(AB) = \operatorname{tr}(BA)$, so $AB$ and $BA$ have the same eigenvalues; there are really only three distinct computations.)
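For example, $\operatorname{tr}(A) = 2\cos(a) + 1$, so $u = \cos(a)$ and $v = \sqrt{1-\cos^2(a)}$, giving the eigenvalues $1,\ \cos(a) \pm i\sin(a)$ for $A$. If you want to double-check the recipe numerically, here is a sketch with NumPy (arbitrary sample angles; `eig_from_trace` is a helper written only for this comparison):

```python
import numpy as np

def eig_from_trace(M):
    # Eigenvalues of a 3x3 rotation matrix from its trace: 1 and u +/- i*sqrt(1 - u^2).
    u = (np.trace(M) - 1.0) / 2.0
    v = np.sqrt(max(1.0 - u * u, 0.0))
    return np.array([1.0, u + 1j * v, u - 1j * v])

a, b = 0.7, 1.3  # arbitrary sample angles
A = np.array([[np.cos(a), -np.sin(a), 0], [np.sin(a), np.cos(a), 0], [0, 0, 1]])
B = np.array([[np.cos(b), 0, -np.sin(b)], [0, 1, 0], [np.sin(b), 0, np.cos(b)]])

for name, M in [("A", A), ("B", B), ("AB", A @ B), ("BA", B @ A)]:
    print(name,
          np.sort_complex(eig_from_trace(M)),
          "vs direct:", np.sort_complex(np.linalg.eigvals(M)))
```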
