$A, B \in SO(3) \implies A^T = A^{-1}$ and $\det(A) = 1$ (likewise for $B$) $\implies \det(AB) = \det(A)\det(B) = 1$ and $(AB)^{-1} = B^{-1} A^{-1} = B^T A^T = (AB)^T$, so $AB \in SO(3)$ and you're done.
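If it helps to see this numerically, here is a minimal NumPy sketch (the two rotations and their angles are arbitrary illustrative choices, not part of the proof):

```python
import numpy as np

def rot_x(t):
    """Rotation about the x-axis by angle t."""
    c, s = np.cos(t), np.sin(t)
    return np.array([[1, 0, 0],
                     [0, c, -s],
                     [0, s,  c]])

def rot_z(t):
    """Rotation about the z-axis by angle t."""
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0],
                     [s,  c, 0],
                     [0,  0, 1]])

A, B = rot_x(0.7), rot_z(1.3)
AB = A @ B
# (AB)^T (AB) = I and det(AB) = 1, so AB is again in SO(3)
print(np.allclose(AB.T @ AB, np.eye(3)))   # True
print(np.isclose(np.linalg.det(AB), 1.0))  # True
```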
You want to find $v$ such that $Av = v$. That's called an eigenvector of $A$ of eigenvalue $1$. And to find those you observe that
$Av = \lambda v \iff (A - \lambda I) v = 0$. Now if $(A - \lambda I)$ is invertible then the only solution is $v = 0$, which doesn't count as an eigenvector, so you want to find where it is non-invertible (singular), and that is precisely when $\det(A - \lambda I) = 0$. So you need to solve for the roots of a polynomial called the characteristic polynomial, given by $\det(A - \lambda I) = 0$. Then you want to show that one of those roots is $1$.
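A quick NumPy sanity check of that claim (the rotation below, about the $z$-axis by an arbitrary angle, is just an illustrative example):

```python
import numpy as np

t = 1.3  # arbitrary illustrative angle
A = np.array([[np.cos(t), -np.sin(t), 0],
              [np.sin(t),  np.cos(t), 0],
              [0,          0,         1]])

# det(A - 1*I) = 0, so lambda = 1 is a root of the characteristic polynomial
print(np.isclose(np.linalg.det(A - np.eye(3)), 0.0))  # True

# and the eigensolver indeed reports 1 among the eigenvalues of A
print(np.any(np.isclose(np.linalg.eigvals(A), 1.0)))  # True
```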
Equivalently, you can show that $(A - I)$ is non-invertible. Notice that the expression $(I - A)^{-1}$ looks a lot like the formula for the sum of a geometric series $\sum_{k \geq 0} r^k = (1 - r)^{-1}$. Consider the geometric series $I + A + A^2 + \dots$. If it converges then so does $C + CA + CA^2 + \dots$ for any square matrix $C$. Prove that.
Then, in the same way you derive the geometric series formula in a field, we have:
$$
S = \sum_{k \geq 0} A^{k} \\
\implies \\
AS = S - I \\
\implies \\
S(I - A) = I \\
\implies \\
S = (I - A)^{-1}
$$
So if the summation converges, then $(I - A)$ is invertible, and hence so is $(A - I) = -(I - A)$. The contrapositive is that if $(A - I)$ is non-invertible, then the summation diverges. Unfortunately we need more than that: we need that if the summation diverges, then $(A - I)$ is non-invertible.
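To see the series formula in action where it does converge, here is a small NumPy sketch; the matrix $M$ is an arbitrary example whose spectral radius is below $1$, which is the condition for the series to converge:

```python
import numpy as np

# An arbitrary matrix with spectral radius < 1, so the series converges.
M = np.array([[0.2, 0.5],
              [0.1, 0.3]])

S = np.zeros_like(M)
term = np.eye(2)
for _ in range(200):   # partial sum I + M + M^2 + ...
    S += term
    term = term @ M

# The limit is (I - M)^{-1}, matching the scalar formula 1/(1 - r).
print(np.allclose(S, np.linalg.inv(np.eye(2) - M)))  # True
```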
Let $A = \begin{pmatrix} 1 & 0 & 0 \\ 0 & \cos\theta & -\sin\theta \\ 0 & \sin\theta & \cos\theta \end{pmatrix}$, whose eigenvalues are $1, e^{i\theta}, e^{-i\theta}$. The vector $v$ fixed by $A$ is $\hat{x} = (1,0,0)^T$, so it's a rotation about the $x$-axis.
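A numerical sanity check of the fixed vector (a NumPy sketch; the angle is an arbitrary choice):

```python
import numpy as np

theta = 0.9  # arbitrary illustrative angle
A = np.array([[1, 0, 0],
              [0, np.cos(theta), -np.sin(theta)],
              [0, np.sin(theta),  np.cos(theta)]])

x_hat = np.array([1.0, 0.0, 0.0])
print(np.allclose(A @ x_hat, x_hat))  # True: the x-axis is fixed by A
```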
If $\theta = \pi$ in the above, then $A = \begin{pmatrix} 1 & 0 & 0 \\ 0 &-1 & 0 \\ 0 & 0 & -1 \end{pmatrix}$, and $I + A + A^2 + \dots$ diverges, as it's composed of diverging entries: the $(1,1)$ entries sum to $1 + 1 + 1 + \dots$, and the other diagonal entries oscillate. Let $S(X) = I + X + X^2 + \dots$. Show that $S(A^k)$ diverges for any $k \in \Bbb{Z}$.
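A small NumPy sketch of that divergence for this particular $A$: the $(1,1)$ entry of the partial sums grows without bound, while the other diagonal entries oscillate between $1$ and $0$, so the partial sums never settle:

```python
import numpy as np

A = np.diag([1.0, -1.0, -1.0])  # the theta = pi rotation above

# Partial sums of I + A + A^2 + ...
S = np.zeros((3, 3))
term = np.eye(3)
for n in range(1, 7):
    S += term          # S is now the sum of the first n terms
    term = term @ A
    print(n, np.diag(S))  # (1,1) entry grows; the others oscillate
```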
Conjugation preserves convergence of these series: if $B = C D C^{-1}$ for some $C, D \in SO(3)$ and $S(B)$ converges, then so does
$$
C^{-1} S(B) C = I + C^{-1} C D C^{-1} C + C^{-1} C D^2 C^{-1} C + \dots = S(D),
$$
so $S(B)$ and $S(D)$ converge or diverge together. In fact, $S(B)$ diverges for every $B \in SO(3)$ for the same reason $S(A^k)$ does: $B$ is orthogonal, so $\|B^n v\| = \|v\|$ for every $v$ and every $n$, hence the terms $B^n$ do not tend to $0$ and the series cannot converge.
Thus $S(B)$ diverges for all $B \in SO(3)$.
To finish the proof, you still need the missing converse from above: that divergence of $S(B)$ forces $(B - I)$ to be non-invertible, i.e. that the eigenvalue $1$ exists for each $B \in SO(3)$.
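As a numerical illustration of the conclusion (a NumPy sketch; sampling a rotation via QR factorization is my own choice here, not part of the argument), a randomly sampled $B \in SO(3)$ does turn out to have $1$ as an eigenvalue:

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample an element of SO(3): orthogonalize a random matrix via QR,
# then flip the sign if needed so that det(B) = +1.
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
B = Q if np.linalg.det(Q) > 0 else -Q  # in odd dimension, det(-Q) = -det(Q)

# (B - I) is singular, i.e. 1 is an eigenvalue of B
print(np.isclose(np.linalg.det(B - np.eye(3)), 0.0))  # True
```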
Best Answer
For a matrix $M$, define $M_{ij}$ to be its $(i,j)^{th}$ entry, and $M_i$ to be its $i^{th}$ row.
What you want to show is
$$
(AB)_i = A_i B.
$$
And by definition of the regular matrix multiplication,
$$
(AB)_{ij} = \sum_k A_{ik} B_{kj} \tag{1}
$$
while
$$
(A_i B)_{1j} = \sum_k (A_i)_{1k} B_{kj} = \sum_k A_{ik} B_{kj}. \tag{2}
$$
Since $A_iB$ has only one row, you'll just have to show that $(1)$ and $(2)$ are equal for each $j$.
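A quick NumPy check of the row identity $(AB)_i = A_i B$ on arbitrary random matrices (the shapes are illustrative; any compatible shapes work):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((4, 2))

for i in range(A.shape[0]):
    # i-th row of AB versus (i-th row of A) times B
    assert np.allclose((A @ B)[i], A[i] @ B)
print("row identity holds")
```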