In two dimensions the result you quoted is false as stated: the matrix
$$ \begin{pmatrix} -1 & 0 \\ 0 & -1\end{pmatrix} = \begin{pmatrix} \cos \pi & \sin \pi \\ -\sin \pi & \cos \pi \end{pmatrix} $$
is a rotation matrix. And every vector is an eigenvector.
It is true, however, if you explicitly disallow this particular case.
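The counterexample above is easy to check numerically (a quick sketch using NumPy):

```python
import numpy as np

# Rotation by pi in the plane is -I, so A v = -v for *every* vector v:
A = np.array([[np.cos(np.pi), np.sin(np.pi)],
              [-np.sin(np.pi), np.cos(np.pi)]])
v = np.array([2.0, -3.0])   # an arbitrary vector

assert np.allclose(A, -np.eye(2))   # the rotation matrix equals -I
assert np.allclose(A @ v, -v)       # so v is an eigenvector (eigenvalue -1)
```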
In three dimensions, note that since rotations preserve vector norms, if $v$ is an eigenvector of a rotation $A$ you must have $Av = \pm v$. Now suppose you have two linearly independent eigenvectors $v$ and $w$. Let $u$ be a nonzero vector orthogonal to both $v$ and $w$ (unique up to scaling). Then you have
$$ (Au)^Tv = u^T A^T v = u^T A^{-1} v = u^T (\pm v) = 0 $$
(using that $A^T = A^{-1}$ for a rotation, and that $Av = \pm v$ implies $A^{-1}v = \pm v$),
and similarly
$$ (Au)^Tw = 0 $$
so $Au$ is orthogonal to both $v$ and $w$, hence proportional to $u$; thus $u$ is an eigenvector as well.
This means that if a rotation matrix has more than one eigendirection, it must have a set of three linearly independent eigendirections.
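This "one or three, never two" dichotomy can be checked numerically (a sketch using NumPy; `real_eigendirections` is just an illustrative helper that counts real eigenvalues, which for a rotation matrix matches the number of independent real eigendirections):

```python
import numpy as np

def real_eigendirections(R, tol=1e-9):
    """Count real eigenvalues of R; for a rotation (a normal matrix) this
    equals the number of linearly independent real eigendirections."""
    vals = np.linalg.eig(R)[0]
    return sum(abs(v.imag) < tol for v in vals)

# Generic rotation (angle strictly between 0 and pi, about the z-axis):
# exactly one real eigendirection, the axis itself.
c, s = np.cos(1.0), np.sin(1.0)
Rz = np.array([[c, -s, 0], [s, c, 0], [0, 0, 1.0]])
assert real_eigendirections(Rz) == 1

# Rotation by pi about the x-axis: eigenvalues 1, -1, -1, giving three
# independent real eigendirections -- never exactly two.
Rx = np.diag([1.0, -1.0, -1.0])
assert real_eigendirections(Rx) == 3
```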
The answer is yes, since the exponential map $\exp: \mathbf{so}(3) \rightarrow \mathbf{SO}(3)$ is surjective (=onto).
Long answer:
An axis-angle rotation can be represented by a $3$-vector $\omega$, where the magnitude $\theta=|\omega|$ is the rotation angle and $\mathbf{u}=\omega/\theta$ is the rotation axis. $3$-vectors are closed under the cross product:
$$\omega_1\in \mathbb{R}^3, \omega_2\in \mathbb{R}^3\Rightarrow (\omega_1\times \omega_2)\in\mathbb{R}^3.$$
Each such vector $\omega$ has an equivalent $3\times 3$ matrix representation
$\hat{\omega}$ (which is uniquely defined by $\hat{\omega}\cdot \mathbf{a} := \omega\times \mathbf{a}$ for $\mathbf{a}$ being a general 3-vector).
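Written out explicitly (a standard identity, with $\omega = (\omega_1, \omega_2, \omega_3)^\top$), this "hat" matrix is the skew-symmetric matrix
$$\hat{\omega} = \begin{pmatrix} 0 & -\omega_3 & \omega_2 \\ \omega_3 & 0 & -\omega_1 \\ -\omega_2 & \omega_1 & 0 \end{pmatrix},$$
and one can verify row by row that $\hat{\omega}\cdot\mathbf{a} = \omega\times\mathbf{a}$.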
The space of matrices of the form $\hat{\omega}$ is called the Lie algebra $\mathbf{so}(3)$. Moreover, one can show that matrices of the form $\hat{\omega}$ are closed under the Lie bracket $[A,B]=AB-BA$:
$$\hat{\omega}_1\in \mathbf{so}(3), \hat{\omega}_2\in \mathbf{so}(3)\Rightarrow [\hat{\omega}_1, \hat{\omega}_2]\in\mathbf{so}(3).$$
Now, let us consider the matrix exponential: $\exp(\mathtt{A}) = \sum_{i=0}^\infty \frac{\mathtt{A}^i}{i!}$. Two properties can be shown:
(1) If $\hat{\omega}\in\mathbf{so}(3)$, then $\exp(\hat{\omega})\in\mathbf{SO}(3)$.
$\mathbf{SO}(3)$ is the special orthogonal group in three dimensions. Thus, it consists of matrices which are orthogonal ($\mathtt{R}\cdot \mathtt{R}^\top=\mathtt{I}$) and whose determinant is 1. In other words, it is the group of pure rotations.
(2) The exponential map $\exp: \mathbf{so}(3) \rightarrow \mathbf{SO}(3)$ is surjective.
So, (1) says that every $\exp(\hat{\omega})$ is a rotation matrix, and (2) says that for each rotation matrix $\mathtt{R}$ there is at least one axis-angle representation $\omega$ such that $\exp(\hat{\omega})=\mathtt{R}$.
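Both properties can be checked numerically. The sketch below (with NumPy only; `exp_so3` and `log_so3` are illustrative helper names) uses Rodrigues' formula, which is the closed form of the matrix exponential on $\mathbf{so}(3)$, and the trace/skew-part formula for one axis-angle preimage:

```python
import numpy as np

def hat(w):
    """Skew-symmetric matrix with hat(w) @ a == np.cross(w, a)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def exp_so3(w):
    """exp(hat(w)) via Rodrigues' formula (closed form of the series)."""
    theta = np.linalg.norm(w)
    if theta < 1e-12:
        return np.eye(3)
    K = hat(w / theta)
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def log_so3(R):
    """One axis-angle preimage of R (assumes rotation angle < pi)."""
    theta = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if theta < 1e-12:
        return np.zeros(3)
    w_hat = (R - R.T) * (theta / (2.0 * np.sin(theta)))
    return np.array([w_hat[2, 1], w_hat[0, 2], w_hat[1, 0]])

# (1): exp(hat(w)) is orthogonal with determinant 1, i.e. in SO(3).
w = np.array([0.3, -0.5, 0.8])
R = exp_so3(w)
assert np.allclose(R @ R.T, np.eye(3)) and np.isclose(np.linalg.det(R), 1.0)

# (2): the recovered axis-angle vector maps back to R (surjectivity).
assert np.allclose(log_so3(R), w) and np.allclose(exp_so3(log_so3(R)), R)
```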
Proofs of (1) and (2) can be found in standard textbooks, e.g. [Gallier, page 24].
Leaving aside defective cases (generalized eigenvectors), the situation for real $3\times 3$ matrices is as follows.
Every such matrix has a characteristic polynomial of degree 3, which has 3 roots (eigenvalues, counted with multiplicity). By the fundamental theorem of algebra, either all three are real, or one is real and the other two form a complex-conjugate pair.
In the first case, each of the 3 real eigenvalues $\lambda_i$ has a corresponding eigenvector $v_i$ (direction). For an orthogonal matrix such as a rotation, these directions can be chosen mutually perpendicular, and the transformation can be seen as a scaling in each direction by the corresponding $\lambda_i$.
In the case of only one real eigenvalue $\lambda_1$, there is one corresponding eigendirection $v_1$, and the pair of complex eigenvalues $\lambda_{2,3}$ corresponds to a plane $p$ (for a rotation, perpendicular to $v_1$). The transformation is a scaling by $\lambda_1$ in the direction of $v_1$, a scaling by $|\lambda_{2,3}|$ in the plane $p$, and a rotation in $p$ by the angle $\arg \lambda_{2,3}$.
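For a rotation matrix specifically, this second case gives $\lambda_1 = 1$ (the axis, no scaling) and $|\lambda_{2,3}| = 1$ with $\arg\lambda_{2,3} = \pm\theta$, the rotation angle. A quick numerical illustration (a sketch using NumPy):

```python
import numpy as np

theta = 0.7
c, s = np.cos(theta), np.sin(theta)
R = np.array([[c, -s, 0], [s, c, 0], [0, 0, 1.0]])  # rotation by theta about z

vals = np.linalg.eig(R)[0]
real = vals[np.abs(vals.imag) < 1e-9]
cplx = vals[np.abs(vals.imag) >= 1e-9]

assert np.isclose(real[0].real, 1.0)        # lambda_1 = 1: the rotation axis
assert np.allclose(np.abs(cplx), 1.0)       # |lambda_{2,3}| = 1: no scaling
assert np.allclose(np.sort(np.angle(cplx)), [-theta, theta])  # arg = +-theta
```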