It seems to me that this problem comes down to getting the definitions straight. So here they are:
Symmetric matrix: a matrix $A$ is called symmetric when $A=A^T$. If $A$ is a matrix with real entries, then we can say that $A$ is an instance of a "Hermitian" matrix.
Hermitian matrix: a matrix $A$ is Hermitian when $A=A^*$, that is, $A$ is equal to its own conjugate transpose.
Depending on the context, real symmetric matrices and complex Hermitian matrices might be referred to as self-adjoint.
Orthogonal matrix: a matrix $U$ is called orthogonal when $UU^T=I$. If $U$ is a matrix with real entries, then we can say that $U$ is an instance of a "unitary" matrix.
Unitary matrix: a matrix $U$ is called unitary when $UU^*=I$; that is, the inverse of $U$ is its conjugate transpose.
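As a quick sanity check, all four definitions can be verified numerically. A minimal numpy sketch; the specific matrices are just illustrative examples I chose:

```python
import numpy as np

# Real symmetric matrix: A == A.T
A = np.array([[2.0, 1.0], [1.0, 3.0]])
assert np.allclose(A, A.T)

# A real symmetric matrix is automatically Hermitian: A == conj(A).T
assert np.allclose(A, A.conj().T)

# Complex Hermitian matrix: H == conj(H).T (note it need not be symmetric)
H = np.array([[2.0, 1j], [-1j, 3.0]])
assert np.allclose(H, H.conj().T)

# Unitary matrix: U @ conj(U).T == I
U = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)
assert np.allclose(U @ U.conj().T, np.eye(2))
```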
The spectral theorem could then be understood as follows:
Suppose that $A$ is any normal matrix (that is, $AA^*=A^*A$). We can then find a unitary matrix $P$ such that $PAP^*$ is diagonal.
If $A$ happens to be a real and symmetric matrix, then we can find a $P$ that is not only unitary, but also itself real. That is, there is a real matrix $P$ such that $PAP^*$ is diagonal. Since $P$ is real, this is the same as saying $PAP^T$ is diagonal.
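This real special case can be seen numerically. A small sketch using numpy's `eigh`, which for a real symmetric input returns real eigenvalues and a real orthogonal eigenvector matrix (the random matrix `A` below is just an example):

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = M + M.T                      # a real symmetric matrix

w, Q = np.linalg.eigh(A)         # eigh: real eigenvalues, orthogonal Q
P = Q.T                          # with P = Q^T, P A P^T is diagonal

D = P @ A @ P.T
assert np.allclose(D, np.diag(w))         # diagonal, with the eigenvalues on it
assert np.allclose(P @ P.T, np.eye(4))    # P is orthogonal...
assert np.isrealobj(P)                    # ...and real, as the theorem promises
```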
There seems to be a little confusion in what you have written. A square matrix with real entries can have complex eigenvalues, for example. So you need to ask yourself, before you start solving your problem, which field you are working over. For example, consider
$\begin{pmatrix}-1&-2\\ 2&1\end{pmatrix}$. This matrix has two complex eigenvalues, $\lambda=\pm i\sqrt{3}$.
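This is easy to check numerically; the characteristic polynomial is $\lambda^2+3$, so the eigenvalues are $\pm i\sqrt 3$:

```python
import numpy as np

A = np.array([[-1.0, -2.0], [2.0, 1.0]])
w = np.linalg.eig(A)[0]          # eigenvalues of a real matrix may be complex

# trace 0, det 3  =>  characteristic polynomial is λ² + 3 = 0
assert np.allclose(sorted(w, key=lambda z: z.imag),
                   [-1j * np.sqrt(3), 1j * np.sqrt(3)])
```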
Over the complex numbers, every square matrix (real or complex) is similar to a triangular matrix (via the Jordan normal form). On the other hand, not every square matrix, real or complex, is diagonalisable, whether you work over the reals or the complex numbers.
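Both points are illustrated by the classic $2\times 2$ Jordan block. A minimal sketch, assuming sympy is available:

```python
import sympy as sp

J = sp.Matrix([[1, 1], [0, 1]])   # a Jordan block: eigenvalue 1 with multiplicity 2

# not diagonalisable, even over the complex numbers
assert not J.is_diagonalizable()

# but it is similar to a triangular matrix: its Jordan normal form is J itself
assert J == J.jordan_form()[1]
```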
And then, as you point out, a real orthogonal matrix is always also unitary (the converse is not true, and easy examples are available).
Best Answer
We can show by induction on the dimension that a lower triangular normal matrix must be diagonal.

For $n=2$, let $A=\begin{pmatrix}a&0\\ b&c\end{pmatrix}$ be such a matrix. Then \begin{align*}A^* A-AA^* &=\begin{pmatrix}\bar a&\bar b\\ 0&\bar c\end{pmatrix}\begin{pmatrix}a&0\\ b&c\end{pmatrix}-\begin{pmatrix}a&0\\ b&c\end{pmatrix}\begin{pmatrix}\bar a&\bar b\\ 0&\bar c\end{pmatrix}\\ &=\begin{pmatrix}|a|^2+|b|^2-|a|^2&\bar bc-a\bar b\\ \bar cb-b\bar a&|c|^2-(|b|^2+|c|^2)\end{pmatrix}\\ &=\begin{pmatrix}|b|^2&\bar bc-a\bar b\\ \bar cb-b\bar a&-|b|^2\end{pmatrix}, \end{align*} so normality forces $b=0$ and $A$ is diagonal.

Suppose the result holds in dimension $n\geq 2$, and let $A=\begin{pmatrix}T&0\\ v&a\end{pmatrix}$ be an $(n+1)\times(n+1)$ lower triangular normal matrix, where $T$ is an $n\times n$ lower triangular matrix, $v$ is a $1\times n$ row vector, and $a$ is a complex number. Since $A^*A=AA^*$, we have \begin{align*} \begin{pmatrix}0&0\\ 0&0\end{pmatrix}&=\begin{pmatrix}T^* &v^* \\ 0&\bar a\end{pmatrix}\begin{pmatrix}T&0\\ v&a\end{pmatrix}-\begin{pmatrix}T&0\\ v&a\end{pmatrix}\begin{pmatrix}T^* &v^* \\ 0&\bar a\end{pmatrix}\\ &=\begin{pmatrix}T^* T+v^* v &v^* a \\ \bar a v&|a|^2\end{pmatrix}- \begin{pmatrix}TT^* &Tv^* \\ vT^* &|v|^2+|a|^2\end{pmatrix}\\ &=\begin{pmatrix}T^* T-TT^* +v^* v &v^* a-Tv^* \\ \bar av-v T^* &-|v|^2\end{pmatrix}, \end{align*} hence the bottom-right entry gives $v=0$, and the top-left entry then gives $T^*T=TT^*$, i.e. $T$ is normal. Since $T$ is lower triangular and normal, the induction hypothesis shows that $T$ is diagonal. We conclude that $A$ is diagonal.
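The statement being proved (a triangular normal matrix is diagonal) is easy to spot-check numerically. A minimal numpy sketch; the test matrices are arbitrary examples:

```python
import numpy as np

def is_normal(A, tol=1e-12):
    """Check A A* == A* A up to a numerical tolerance."""
    return np.allclose(A @ A.conj().T, A.conj().T @ A, atol=tol)

# A lower triangular matrix with a nonzero off-diagonal entry is not normal...
T = np.array([[1.0, 0.0], [2.0, 3.0]])
assert not is_normal(T)

# ...while a diagonal (hence triangular) matrix always is.
D = np.diag([1.0, 2.0 + 1j, -3.0])
assert is_normal(D)
```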