[Math] How to show that if a 3×3 matrix A and its conjugate B are in SO(3), then their product is in SO(3)

linear-algebra, matrices

I can easily determine parts (a) and (c) but am stuck when it comes to parts (b) and (d). Any help would be greatly appreciated. I apologize for any formatting errors; I am new to math.stackexchange.com, and I promise I tried my best to make it readable for all you wonderful people! It looks like this was asked here: https://math.stackexchange.com/questions/577732/a-couple-questions-about-a-3-by-3-matrix-a but even with Any's answer I am unable to understand and complete parts (b) and (d). Without further dilly-dallying, here is the question:

Let $SO(3)$ denote the set of 3×3 matrices $A$ such that $A$ has real entries, $\det(A) = 1$, and $A^T = A^{-1}$.

(a) Show that the matrix A is in SO(3):
$$A = \begin{pmatrix}
1 & 0 & 0 \\
0 & \cos(x) & -\sin(x) \\
0 & \sin(x) & \cos(x)
\end{pmatrix}$$
This matrix can be described as a rotation of $R^3$ with axis $v = [1,0,0]^T$. Let $v$ be any unit-length vector in $R^3$ and let $F = (v, v', v'')$ be any ordered orthonormal basis for $R^3$ containing this vector $v$. Let $P = [v\ v'\ v'']$ be the transition matrix from $F$ to the standard basis. Show that for the matrix $A$ above, if we define $B = PAP^{-1}$, then $B$ is in $SO(3)$ and $Bv = v$.

(b) Show that if $A$ and $B$ are in $SO(3)$, then their product $AB$ is also in $SO(3)$

(c) Show that if A is a 3×3 matrix, then $det(-A) = -det(A)$

(d) Show that if $A$ is in $SO(3)$ then there is a vector $v$ in $R^3$ such that $Av = v$. To do this, you need to show that $A - I$ is singular. Consider the determinant of $A - I$. Use the fact $A^T = A^{-1}$.

Work attempted so far:

a) I've easily determined the determinant of A (hehe) to be 1 using the identity $\cos^2(\theta) + \sin^2(\theta) = 1$, and I have computed $A^T$ and verified that it is equal to $A^{-1}$.
Since $A$ contains all real entries, it meets the criteria for being in $SO(3)$, so part (a) is completed without issue.
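For anyone who wants to sanity-check part (a) numerically, here is a minimal sketch using hand-rolled 3×3 helpers (the function names `matmul`, `transpose`, `det3`, and `rot_x` are my own, written just for this check):

```python
import math

# Numeric sanity check of part (a): the rotation matrix about the x-axis
# has determinant 1 and satisfies A^T A = I (i.e. A^T = A^{-1}).

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def transpose(X):
    return [[X[j][i] for j in range(3)] for i in range(3)]

def det3(M):
    return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
          - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
          + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

def rot_x(t):
    c, s = math.cos(t), math.sin(t)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

A = rot_x(0.7)                       # an arbitrary angle
AtA = matmul(transpose(A), A)        # should be the identity

print(abs(det3(A) - 1) < 1e-12)
print(all(abs(AtA[i][j] - (1 if i == j else 0)) < 1e-12
          for i in range(3) for j in range(3)))
```

Both checks print `True`, matching the algebra above.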

b) I am not sure if I can do this… but since we are given $B = PAP^{-1}$, can $AB$ be rewritten as $APAP^{-1}$, and furthermore as $A^2PP^{-1}$, and since $PP^{-1} = I$, can $AB$ be rewritten as $A^2I$? If so, proving that $A^2I$ is in $SO(3)$ is rather trivial, and I don't need any further help with this part of the question; I just want to make sure I am approaching it correctly.

c) This is easily done by negating all the entries of the matrix $A$ and checking that the resulting determinant is equal to $-\det(A)$. As long as this is the correct approach, I don't need further help.
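Part (c) can also be spot-checked on a concrete integer matrix (the `det3` helper and the sample matrix below are my own, not from the question):

```python
# Spot check of part (c): det(-A) = -det(A) for a 3x3 matrix. Negating A
# negates each of its 3 rows, and each row negation flips the sign of the
# determinant once, giving an overall factor of (-1)^3 = -1.

def det3(M):
    return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
          - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
          + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

A = [[2, 1, 0], [0, 3, -1], [4, 0, 5]]            # arbitrary integer matrix
negA = [[-entry for entry in row] for row in A]

print(det3(A), det3(negA))   # 26 -26
```

Integer arithmetic keeps the check exact, so the two values are exactly negatives of each other.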

d) $$A-I = \begin{pmatrix}
0 & 0 & 0 \\
0 & \cos(x)-1 & -\sin(x) \\
0 & \sin(x) & \cos(x)-1
\end{pmatrix}$$
Since the top row is all $0$'s, $\det(A-I) = 0$, so we have shown the matrix $A-I$ is singular. We have already shown in part (a) that $A^T = A^{-1}$, but I get stuck trying to show there is a vector $v$ in $R^3$ such that $Av = v$. Any suggestions/starting points for this part would be a huge help.

Best Answer

$A, B \in SO(3) \implies A^T = A^{-1},\ \det(A) = 1$, etc. $\implies \det(AB) = \det(A)\det(B) = 1$ and $(AB)^{-1} = B^{-1} A^{-1} = B^T A^T = (AB)^T$. So you're done.
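This closure argument can be sanity-checked numerically with two concrete rotations; the hand-rolled helpers below (`matmul`, `transpose`, `det3`, `rot_x`, `rot_z`) are my own sketch, not part of the answer:

```python
import math

# Multiply a rotation about the x-axis by one about the z-axis and verify
# the product M still satisfies det(M) = 1 and M^T M = I.

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def transpose(X):
    return [[X[j][i] for j in range(3)] for i in range(3)]

def det3(M):
    return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
          - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
          + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

def rot_x(t):
    c, s = math.cos(t), math.sin(t)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_z(t):
    c, s = math.cos(t), math.sin(t)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

M = matmul(rot_x(0.7), rot_z(1.2))   # product of two SO(3) elements
MtM = matmul(transpose(M), M)

print(abs(det3(M) - 1) < 1e-12)
print(all(abs(MtM[i][j] - (1 if i == j else 0)) < 1e-12
          for i in range(3) for j in range(3)))
```

Both checks print `True`, as the two identities in the answer predict.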

You want to find $v$ such that $Av = v$. That's called an eigenvector of $A$ of eigenvalue $1$. And to find those you observe that

$Av = \lambda v \iff (A - \lambda I) v = 0$. Now if $(A - \lambda I)$ is invertible then the only solution is $v = 0$, so you want to find where it's non-invertible (singular), and that is precisely when $\det(A - \lambda I) = 0$. So you need to solve for the roots of a polynomial, called the characteristic polynomial, given by $\det(A - \lambda I) = 0$. Then you want to show that one of those roots is $1$.
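Concretely, you can check for a generic element of $SO(3)$ that $\lambda = 1$ is a root of the characteristic polynomial, i.e. that $\det(A - I) = 0$. The helpers and the sample matrix below are my own toy sketch:

```python
import math

# For a generic element of SO(3) -- here the product of an x-rotation and a
# z-rotation -- check that det(A - 1*I) = 0, i.e. lambda = 1 is a root of
# the characteristic polynomial det(A - lambda*I).

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def det3(M):
    return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
          - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
          + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

def rot_x(t):
    c, s = math.cos(t), math.sin(t)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_z(t):
    c, s = math.cos(t), math.sin(t)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

A = matmul(rot_x(0.7), rot_z(1.2))
A_minus_I = [[A[i][j] - (1 if i == j else 0) for j in range(3)]
             for i in range(3)]

print(abs(det3(A_minus_I)) < 1e-12)   # True: 1 is an eigenvalue of A
```

This is exactly part (d) in numeric form: $A - I$ is singular, so a fixed vector $v$ with $Av = v$ exists.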

Equivalently you can show that $(A-I)$ is non-invertible. Notice that the expression $(I-A)^{-1}$ looks a lot like the formula for summing a geometric series $\sum_{k\geq 0} r^k = (1-r)^{-1}$. Consider the geometric series $I + A + A^2 + \dots$. If it converges, then so does $C + CA + CA^2 + \dots$ for all square matrices $C$. Prove that.

Then in the same way you find the geometric series formula in a field, we have :

$$ S = \sum_{k \geq 0} A^{k} \\ \implies \\ AS = S - I \\ \implies \\ (I - A)S = I \\ \implies \\ S = (I - A)^{-1} $$

So if the summation converges, then $(I - A)$ is invertible, and hence so is $A - I = -(I - A)$. The contrapositive is that if $(A-I)$ is non-invertible, then the summation diverges. Unfortunately we need more than that: we need that if the summation diverges, then $(A-I)$ is non-invertible.
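Here is a minimal numeric sketch of the convergent direction, using a scaled rotation $A = rR$ with $r = 1/2$ (my own toy example; the scaling makes the entries of $A^k$ decay like $r^k$, so the series converges and, when it does, it inverts $I - A$, equivalently $A - I$ up to sign):

```python
import math

# Sum S_N = I + A + ... + A^{59} for A = 0.5 * (rotation about x) and check
# that (I - A) S_N is numerically the identity, i.e. S = (I - A)^{-1}.

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def rot_x(t):
    c, s = math.cos(t), math.sin(t)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

r = 0.5
A = [[r * entry for entry in row] for row in rot_x(0.7)]

S = [[0.0] * 3 for _ in range(3)]
P = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]   # A^0 = I
for _ in range(60):                   # 60 terms: r^60 ~ 1e-18, far below tol
    for i in range(3):
        for j in range(3):
            S[i][j] += P[i][j]
    P = matmul(P, A)

I_minus_A = [[(1 if i == j else 0) - A[i][j] for j in range(3)]
             for i in range(3)]
check = matmul(I_minus_A, S)          # equals I - A^60, essentially I

print(all(abs(check[i][j] - (1 if i == j else 0)) < 1e-12
          for i in range(3) for j in range(3)))
```

The telescoping $(I - A)S_N = I - A^N$ is exactly why the truncation error is of order $r^N$.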

Let $A = \begin{pmatrix} 1 & 0 & 0 \\ 0 & e^{i\theta} & \dots \\ 0 & \vdots & \ddots \end{pmatrix}$. That is, the vector $v$ fixed by $A$ is $\hat{x} = (1,0,0)^T$, so $A$ is a rotation around the $x$-axis.

If $\theta = \pi$ in the above then $A = \begin{pmatrix} 1 & 0 & 0 \\ 0 &-1 & 0 \\ 0 & 0 & -1 \end{pmatrix}$. Then $I + A + A^2 + \dots$ diverges: the $(1,1)$ entry of the partial sums grows without bound while the other diagonal entries oscillate. Let $S(X) = I + X + X^2 + \dots$. Show that $S(A^k)$ diverges for any $k \in \Bbb{Z}$.
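A tiny sketch of that divergence (my own code; since $A = \operatorname{diag}(1,-1,-1)$ is diagonal, it suffices to track the diagonal entries of the partial sums):

```python
# Partial sums of I + A + A^2 + ... for A = diag(1, -1, -1): the entry on
# the fixed axis grows linearly while the other two oscillate between 0 and
# 1, so the matrix series has no limit.

diag = (1.0, -1.0, -1.0)

def partial_sums(N):
    """Diagonal of I + A + ... + A^{N-1}."""
    sums = [0.0, 0.0, 0.0]
    powers = [1.0, 1.0, 1.0]      # diagonal of A^0 = I
    for _ in range(N):
        for i in range(3):
            sums[i] += powers[i]
            powers[i] *= diag[i]
    return sums

print(partial_sums(10))   # [10.0, 0.0, 0.0]
print(partial_sums(11))   # [11.0, 1.0, 1.0]
```

The first entry is just $N$, so no subsequence of partial sums can converge.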

Let $H = \langle A \rangle = \{\dots, A^{-2}, A^{-1}, I, A, A^2, \dots \}$ be the subgroup of $G = SO(3)$ generated by $A$. The conjugacy classes of a subgroup $H$ partition a group $G$, so for all $B \in SO(3)$ we can write $B = C A^k C^{-1}$ for some $k \in \Bbb{Z}$, $C \in SO(3)$. Then does $S(B)$ converge or diverge? If it converges, then $S(B) = S(C A^k C^{-1}) = I + C A^k C^{-1} + C(A^k)^2 C^{-1} + \dots$. But if that converges then so does $$ C^{-1}S(B)C = I + C^{-1}C A^k C^{-1} C + C^{-1}C (A^k)^2 C^{-1} C + \dots = S(A^k), $$ a contradiction.

Thus $S(B)$ diverges for all $B \in SO(3)$.

To finish the proof, you may want to check that since $S(B)$ diverges for each $B \in SO(3)$, the matrix $B - I$ is singular for each $B$, i.e. $1$ is an eigenvalue of each $B$.
