Yes. Quoting Halmos's Linear Algebra Problem Book (Solution 160).
“If $A$ and $B$ are real, $U$ is unitary, and $U^*AU = B$, then there exists a real orthogonal $V$ such that $V^*AV = B$.
A surprisingly important tool in the proof is the observation that the unitary equivalence of $A$ and $B$ via $U$ implies the same result for $A^*$ and $B^*$. Indeed, the adjoint of the assumed equation is $U^*A^*U = B^*$.
Write $U$ in terms of its real and imaginary parts $U = E + i F$. It follows from $AU = UB$ that $AE = EB$ and $AF = FB$, and hence that $A(E+\lambda F) = (E+\lambda F)B$ for every scalar $\lambda$. If $\lambda$ is real and different from a finite number of troublesome scalars (the ones for which $\det(E+\lambda F) = 0$), the real matrix $S = E + \lambda F$ is invertible, and, of course, has the property that $AS=SB$.
Proceed in the same way from $U^*A^*U = B^*$: deduce that $A^*(E+\lambda F) = (E+\lambda F)B^*$ for all $\lambda$, and, in particular, for the ones for which $E+\lambda F$ is invertible, and infer that $A^*S = SB^*$ (and hence that $S^*A = BS^*$).
Let $S = VP$ be the polar decomposition of $S$ (that theorem works just as well in the real case as in the complex case, so that $V$ and $P$ are real). Since $$BP^2 = BS^*S = S^*AS = S^*SB = P^2B,$$ so that $P^2$ commutes with $B$, it follows that $P$ commutes with $B$. Since $$AVP = AS = SB = VPB = VBP$$ and $P$ is invertible, it follows that $AV=VB$, and the proof is complete.”
Needless to say, that isn't the shortest path to prove the reduction of antisymmetric matrices...
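Still, the quoted construction is easy to try numerically. Here is a small sketch of it in NumPy/SciPy (my own illustration, not from Halmos; the particular $A$, $B$, $U$ are just convenient test data): it builds real $A$ and $B$ that are unitarily equivalent via a genuinely complex unitary $U$, then recovers a real orthogonal $V$ with $V^TAV = B$ by forming $S = E + \lambda F$ and taking the orthogonal factor of its polar decomposition.

```python
import numpy as np
from scipy.linalg import expm, polar

rng = np.random.default_rng(0)
n = 4

# Test data: A real symmetric (so W = exp(iA) is unitary and commutes with A),
# Q real orthogonal, B = Q^T A Q, and U = W Q a genuinely complex unitary
# satisfying U* A U = B.
A = rng.standard_normal((n, n))
A = A + A.T
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
B = Q.T @ A @ Q
U = expm(1j * A) @ Q
assert np.allclose(U.conj().T @ U, np.eye(n))   # U is unitary
assert np.allclose(U.conj().T @ A @ U, B)       # U* A U = B

# Halmos's construction: S = E + lambda*F with E, F the real and imaginary
# parts of U; pick a real lambda that keeps S invertible (avoid det = 0).
E, F = U.real, U.imag
lam = max(np.linspace(0.0, 2.0, 21),
          key=lambda l: abs(np.linalg.det(E + l * F)))
S = E + lam * F                                 # real, invertible, AS = SB

# Polar decomposition S = V P; both factors are real since S is real.
V, P = polar(S)

print(np.allclose(V.T @ V, np.eye(n)))          # V is real orthogonal
print(np.allclose(V.T @ A @ V, B))              # and V^T A V = B
```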
I think it is more convenient to work with operators instead of matrices, since the induction requires restricting the associated map to an invariant subspace.
Let $(V,\left< \cdot, \cdot \right>)$ be a real finite-dimensional inner product space and let $T \colon V \rightarrow V$ be an orthogonal operator. As you suggested, show first that if $W \leq V$ is $T$-invariant, then so is $W^{\perp}$, and $T|_{W^{\perp}}$ (with the induced inner product) is also orthogonal. Since the characteristic polynomial of $T$ splits over $\mathbb{R}$ into linear and irreducible quadratic factors, $T$ has either a one-dimensional or a two-dimensional invariant subspace. By induction, you see that $V$ has an orthogonal direct sum decomposition $V = V_1 \oplus \dots \oplus V_k \oplus W_1 \oplus \dots \oplus W_l$ where each $V_i$ is a one-dimensional $T$-invariant subspace and each $W_j$ is a two-dimensional $T$-invariant subspace. Each $V_i$ is spanned by an eigenvector of $T$, and since $T$ is orthogonal, the corresponding eigenvalue is $\pm 1$. Each $T|_{W_j}$ is a two-dimensional orthogonal transformation, so it is enough to show that every two-dimensional orthogonal transformation can be represented with respect to a suitable orthonormal basis (for the first form below, in fact with respect to any orthonormal basis) by a matrix of one of the forms
$$ \begin{pmatrix} \cos \theta & \sin \theta \\ -\sin \theta & \cos \theta \end{pmatrix}, \begin{pmatrix} -1 & 0 \\ 0 & 1 \end{pmatrix}. $$
Assume $T \colon V \rightarrow V$ is orthogonal and $\dim V = 2$. Let $\mathcal{B} = (v_1,v_2)$ be an orthonormal basis of $V$. Then we have
$$ [T]_{\mathcal{B}} = \begin{pmatrix} \left< Tv_1, v_1 \right> & \left< Tv_2, v_1 \right> \\ \left <Tv_1, v_2 \right> & \left <Tv_2, v_2 \right> \end{pmatrix}. $$
Since $T$ is orthogonal, we have
$$ 1 = ||v_2||^2 = ||Tv_2||^2 = \left< Tv_2, v_1 \right>^2 + \left< Tv_2, v_2 \right>^2 $$
and so we can write $\left< Tv_2, v_2 \right> = \cos \theta, \left< Tv_2, v_1 \right> = \sin \theta$ for a unique $\theta \in [0,2\pi)$. Since the first column is orthogonal to the second column (with respect to the standard inner product) and of length one, the matrix has one of the forms
$$ \begin{pmatrix} \cos \theta & \sin \theta \\ - \sin \theta & \cos \theta \end{pmatrix}, \begin{pmatrix} -\cos \theta & \sin \theta \\ \sin \theta & \cos \theta \end{pmatrix} $$
(depending on whether $\det T = 1$ or $\det T = -1$). In the second case, the characteristic polynomial of the operator is $x^2 - 1 = (x - 1)(x + 1)$, so it is diagonalizable with eigenvalues $1, -1$, and the associated eigenvectors are orthogonal (as $T$ is orthogonal, hence normal). Normalizing them gives an orthonormal basis in which $T$ is represented by $\begin{pmatrix} -1 & 0 \\ 0 & 1 \end{pmatrix}$, the second of the two forms listed above.
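To see the decomposition $V = V_1 \oplus \dots \oplus V_k \oplus W_1 \oplus \dots \oplus W_l$ concretely, here is a small numerical sketch (my own illustration in NumPy, under the simplifying assumption that the eigenvalues are distinct, which holds for a generic orthogonal matrix): it pairs up the complex eigenvectors of a random real orthogonal matrix and assembles a real orthonormal basis in which the matrix becomes block diagonal with $\pm 1$ entries and $2 \times 2$ rotation blocks.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
T, _ = np.linalg.qr(rng.standard_normal((n, n)))   # a random real orthogonal matrix

# Eigen-decomposition of the (normal) matrix T: real eigenvalues are +-1,
# non-real eigenvalues come in conjugate pairs e^{±i theta}.
w, Z = np.linalg.eig(T)

cols, used = [], np.zeros(n, dtype=bool)
for k in range(n):
    if used[k]:
        continue
    lam, z = w[k], Z[:, k]
    used[k] = True
    if abs(lam.imag) < 1e-10:
        # real eigenvalue: a one-dimensional invariant subspace V_i
        cols.append(z.real / np.linalg.norm(z.real))
    else:
        # complex pair: the real and imaginary parts of z span a two-dimensional
        # invariant subspace W_j; they are orthogonal and have equal norms
        x, y = z.real, z.imag
        cols.append(x / np.linalg.norm(x))
        cols.append(y / np.linalg.norm(y))
        # mark the conjugate eigenvalue as consumed
        j = int(np.argmin(np.abs(w - lam.conjugate()) + 10.0 * used))
        used[j] = True

V = np.column_stack(cols)        # real orthonormal change of basis
print(np.allclose(V.T @ V, np.eye(n)))
print(np.round(V.T @ T @ V, 6))  # block diagonal: +-1 entries and rotation blocks
```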
Best Answer
You could go with a real Schur decomposition, which gives a constructive proof. There is one here: it constructs a single $1 \times 1$ or $2 \times 2$ diagonal block and then proceeds inductively.
After you do that, use the orthogonality to show that $T$ (the quasitriangular factor from the Schur decomposition) is orthogonal and quasidiagonal (block diagonal with diagonal blocks of order at most $2$).
Next, it is easy to show that the blocks themselves must be orthogonal. The blocks of order $1$ are trivial. As for the blocks of order $2$, go with the most general form: $$U = \begin{bmatrix} u_{11} & u_{12} \\ u_{21} & u_{22} \end{bmatrix}.$$ Now, use that $U^TU = {\rm I}_2$ and $UU^T = {\rm I}_2$, so \begin{align*} \begin{bmatrix} u_{11}^2 + u_{21}^2 & u_{11} u_{12} + u_{21} u_{22} \\ u_{11} u_{12} + u_{21} u_{22} & u_{12}^2 + u_{22}^2 \end{bmatrix} &= \begin{bmatrix} 1 \\ & 1 \end{bmatrix}, \\ \begin{bmatrix} u_{11}^2 + u_{12}^2 & u_{11} u_{21} + u_{12} u_{22} \\ u_{11} u_{21} + u_{12} u_{22} & u_{21}^2 + u_{22}^2 \end{bmatrix} &= \begin{bmatrix} 1 \\ & 1 \end{bmatrix}. \end{align*} From the diagonal elements, you see right away that $u_{12}^2 = u_{21}^2$. Since everything is real, $u_{12} = \pm u_{21}$. Then use the off-diagonal elements to get that $u_{11} = \mp u_{22}$. For now, assume that $u_{11} = u_{22} =: c$, so $u_{21} = -u_{12} =: s$; since $c^2 + s^2 = 1$, $c$ and $s$ are the cosine and sine of some angle $\varphi$.
What is left is the case $u_{11} = -u_{22}$ (so $u_{12} = u_{21}$). In this case, just compute the eigenvalues (you have the formulas here) and you'll see that they are $1$ and $-1$, which contradicts the fact (from the construction of the real Schur decomposition) that the blocks of order $2$ have non-real complex conjugate pairs of eigenvalues.
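As a numerical sanity check of this route (my own sketch, not the linked construction), SciPy's real Schur decomposition applied to a random real orthogonal matrix already exhibits the claimed structure: the quasitriangular factor $T$ comes out orthogonal and quasidiagonal, with $\pm 1$ entries and $2 \times 2$ rotation blocks on the diagonal.

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(2)
n = 6
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))   # a random real orthogonal matrix

# Real Schur decomposition: Q = Z T Z^T with Z real orthogonal and T
# quasi upper triangular (1x1 and 2x2 blocks on the diagonal).
T, Z = schur(Q, output='real')

print(np.allclose(Z @ T @ Z.T, Q))        # reconstruction
print(np.allclose(T.T @ T, np.eye(n)))    # T is itself orthogonal...
print(np.round(T, 6))                     # ...hence quasidiagonal: +-1's and rotation blocks
```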