Others have raised some good points, and a definite answer really depends on what kind of linear transformation we want to call a rotation or a reflection.
For me a reflection (maybe I should call it a simple reflection?) is a reflection with respect to a subspace of codimension 1. So in $\mathbf{R}^n$ you get these by fixing a subspace $H$ of dimension $n-1$. The reflection $s_H$ w.r.t. $H$ keeps the vectors of $H$ fixed (pointwise) and multiplies a vector perpendicular to $H$ by $-1$. If $\vec{n}\perp H$, $\vec{n}\neq0$, then $s_H$ is given by the formula
$$\vec{x}\mapsto\vec{x}-2\,\frac{\langle \vec{x},\vec{n}\rangle}{\|\vec{n}\|^2}\,\vec{n}.$$
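As a concrete illustration, here is a minimal numerical sketch of this formula (the function name `reflect` and the use of NumPy are my own choices, not part of the original answer); it also checks that the determinant of $s_H$ is $-1$, as discussed below.

```python
import numpy as np

def reflect(x, n):
    """Reflect x in the hyperplane H with normal vector n (both 1-D arrays)."""
    return x - 2 * (x @ n) / (n @ n) * n

# Reflection in the plane z = 0 of R^3, i.e. normal vector n = e_z.
n = np.array([0.0, 0.0, 1.0])
x = np.array([1.0, 2.0, 3.0])
print(reflect(x, n))          # [ 1.  2. -3.]

# The matrix of s_H is I - 2 n n^T / |n|^2; its determinant is -1.
S = np.eye(3) - 2 * np.outer(n, n) / (n @ n)
print(np.linalg.det(S))       # -1.0
```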
The reflection $s_H$ has eigenvalue $1$ with multiplicity $n-1$ and eigenvalue $-1$ with multiplicity $1$ with respective eigenspaces $H$ and $\mathbf{R}\vec{n}$. Thus its determinant is $-1$. Therefore geometrically it reverses orientation (or handedness, if you prefer that term), and is not a rigid body motion in the sense that in order to apply that transformation to a rigid 3D body, you need to break it into atoms (caveat: I don't know if this is the standard definition of a rigid body motion?). It does preserve lengths and angles between vectors.
Rotations (by which I, too, mean simply orthogonal transformations with $\det=1$) have more variation. If $A$ is a rotation matrix, then Adam's calculation proving that lengths are preserved tells us that the eigenvalues must have absolute value $1$ (his calculation goes through for complex vectors and the Hermitian inner product). Therefore the complex eigenvalues are on the unit circle and come in complex conjugate pairs. If $\lambda=e^{i\varphi}$ is a non-real eigenvalue, and $\vec{v}$ is a corresponding eigenvector (in $\mathbf{C}^n$), then the vector $\vec{v}^*$ gotten by componentwise complex conjugation is an eigenvector of $A$ belonging to the eigenvalue $\lambda^*=e^{-i\varphi}$.
Consider the set $V_1$ of vectors of the form $z\vec{v}+z^*\vec{v}^*$. By the eigenvalue property this set is stable under $A$:
$$A(z\vec{v}+z^*\vec{v}^*)=(\lambda z)\vec{v}+(\lambda z)^*\vec{v}^*.$$ Its elements are also fixed by componentwise complex conjugation, so $V_1\subseteq\mathbf{R}^n$. It is obviously a 2-dimensional subspace, in other words a plane. It is easy to guess and not difficult to prove that the restriction of $A$ to the subspace $V_1$ is a rotation by the angle $\varphi_1=\pm\varphi$. Note that we cannot determine the sign of the rotation (clockwise/counterclockwise), because we don't have a preferred handedness on the subspace $V_1$.
The preservation of angles (see Adam's answer) shows that $A$ then maps the $(n-2)$-dimensional subspace $V_1^\perp$ also to itself. Furthermore, the determinant of $A$ restricted to $V_1$ is equal to one, so the same holds for the restriction of $A$ to $V_1^\perp$. Thus we can apply induction and keep splitting off 2-dimensional summands $V_i,i=2,3,\ldots,$ such that on each summand $A$ acts as a rotation by some angle $\varphi_i$ (usually distinct from the preceding ones). We can keep doing this until only real eigenvalues remain, and end with the situation:
$$
\mathbf{R}^n=V_1\oplus V_2\oplus\cdots\oplus V_m \oplus U,
$$
where the 2D-subspaces $V_i$ are orthogonal to each other, $A$ rotates a vector in $V_i$ by the angle $\varphi_i$, and $A$ restricted to $U$ has only real eigenvalues.
Computing the determinant then shows that the multiplicity of $-1$ as an eigenvalue of $A$ restricted to $U$ is always even. As a consequence we can also split that eigenspace into a sum of 2-dimensional planes, on which $A$ acts as a rotation by 180 degrees (or multiplication by $-1$). After that there remains the eigenspace belonging to the eigenvalue $+1$. The multiplicity of that eigenvalue is congruent to $n$ modulo $2$, so if $n$ is odd, then $\lambda=+1$ will necessarily be an eigenvalue. This is the ultimate reason why a rotation in 3D-space must have an axis = eigenspace belonging to the eigenvalue $+1$.
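To make this decomposition tangible, here is a short numerical sketch (my own illustration, not part of the argument): it builds a random orthogonal matrix with $\det=1$ and reads off the rotation angles $\varphi_i$ from the complex eigenvalues.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Random orthogonal matrix via QR; flip one column if needed to force det = +1.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
if np.linalg.det(Q) < 0:
    Q[:, 0] = -Q[:, 0]

eigvals = np.linalg.eigvals(Q)
print(np.abs(eigvals))              # all 1, as argued above
angles = np.angle(eigvals)
print(np.sort(np.abs(angles)))      # the angles come in pairs +/- phi_i;
                                    # for odd n one angle is 0 (eigenvalue +1, the "axis")
```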
From this we see:
- As Henning pointed out, we can continuously bring any rotation back to the identity mapping simply by scaling all the rotation angles $\varphi_i,i=1,\ldots,m,$ continuously to zero. The same can be done on those summands of $U$ where $A$ acts as a rotation by 180 degrees.
- If we want to define rotation in such a way that the set of rotations contains the elementary rotations described by Henning, and also insist that the set of rotations is closed under composition, then the set must consist of all orthogonal transformations with $\det=1$. As a corollary, rotations preserve handedness. This point is moot if we define a rotation by simply requiring the matrix $A$ to be orthogonal and have $\det=1$, but it does show the equivalence of the two alternative definitions.
- If $A$ is an orthogonal matrix with $\det=-1$, then composing $A$ with a reflection w.r.t. any subspace $H$ of codimension one gives a rotation in the sense of this (admittedly semi-private) definition of a rotation; see the sketch below.
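Here is a quick numerical check of that last point (again just an illustrative sketch of my own, assuming NumPy): composing an orthogonal matrix of determinant $-1$ with a hyperplane reflection flips the determinant back to $+1$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# Random orthogonal matrix forced to have det = -1.
A, _ = np.linalg.qr(rng.standard_normal((n, n)))
if np.linalg.det(A) > 0:
    A[:, 0] = -A[:, 0]

# Reflection in the hyperplane of codimension 1 with normal vector nrm.
nrm = rng.standard_normal(n)
S = np.eye(n) - 2 * np.outer(nrm, nrm) / (nrm @ nrm)

print(np.linalg.det(A))        # approximately -1
print(np.linalg.det(S @ A))    # approximately +1, i.e. a rotation in the above sense
```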
This is not a full answer in the sense that I can't give you an 'authoritative' definition of an $n$D-rotation. That is to some extent a matter of taste, and some might want to only include the simple rotations from Henning's answer that only "move" points of a 2D-subspace and keep its orthogonal complement pointwise fixed. Hopefully I managed to paint a coherent picture, though.
For the case of $\mathbb R^2$ it's quite simple. The images of $e_x$ and $e_y$ are the columns of the matrix. According to the requirement these have to be of unit length and orthogonal, therefore the columns of the matrix have to be orthonormal, i.e. the matrix has to be orthogonal.
In addition, to preserve orientation, the image of $e_y$ has to be the image of $e_x$ rotated by $\pi/2$ counterclockwise (given a positively oriented coordinate system). That is, if the image of $e_x$ is $(u,v)$, the image of $e_y$ has to be $(-v, u)$, which means that the matrix will have positive determinant.
The converse is just as easily seen to be true.
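Written out (my own summary of the above), the matrix is
$$
\begin{pmatrix} u & -v \\ v & u \end{pmatrix},\qquad u^2+v^2=1,
$$
so its determinant is $u^2+v^2=1>0$, and with $u=\cos\theta$, $v=\sin\theta$ this is exactly the familiar rotation matrix by the angle $\theta$.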
For higher dimensions you eventually run into the fact that the very definition of orientation depends on the determinant. However, one could, for example, define rotations as compositions of "trivial" rotations around the axes; in this case one can see in a similar way that we end up with exactly those matrices as well.
Best Answer
I had to work a bit on it, and with the help of other answers on this board I came up with some answers:
$I$ is symmetric. The outer product $vv^{\top}$ gives a symmetric matrix as a result. The factor $2$ in front is irrelevant: it is a scalar multiple, so the result still counts as symmetric. We have $(AB)^{\top} = B^{\top} A^{\top}$, so $I^{\top} = I$ and $(vv^{\top})^{\top} = vv^{\top}$. With that, $H^{\top} = (I- 2vv^{\top})^{\top} = I^{\top} - 2(vv^{\top})^{\top} = I - 2vv^{\top} = H$. $H$ is therefore symmetric.
Counterexample matrix:
$$\begin{pmatrix}-1 & 0 & 0\\ 0 & -1 & 0\\ 0 & 0 & -1\end{pmatrix}$$
Let $u,v \in \mathbb{R}^n$ be column vectors; their dot product is then $\langle u,v \rangle = u^{\top} v$. Let $A$ be an orthogonal matrix, then:
$$\langle A u, A v \rangle = (A u)^{\top} (A v) = u^{\top} A^{\top} A v = u^{\top} A^{-1} A v = u^{\top} v.$$ This shows that the transformation preserves the dot product, so no change of angles is happening. Such a transformation is in general a product of reflections. Bonus: every reflection can be built with a series of Householder reflections, so this is about Householder reflections and matrices, too.
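A small numerical sketch of these two facts (my own illustration, assuming NumPy): a Householder reflection is symmetric and orthogonal, and an orthogonal matrix preserves dot products.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4

# Householder reflection H = I - 2 v v^T / (v^T v) for a random v.
v = rng.standard_normal(n)
H = np.eye(n) - 2 * np.outer(v, v) / (v @ v)
print(np.allclose(H, H.T))              # True: H is symmetric
print(np.allclose(H.T @ H, np.eye(n)))  # True: H is orthogonal

# Any orthogonal matrix preserves dot products (and hence lengths and angles).
A, _ = np.linalg.qr(rng.standard_normal((n, n)))
u, w = rng.standard_normal(n), rng.standard_normal(n)
print(np.isclose((A @ u) @ (A @ w), u @ w))  # True
```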