The family of matrices $U^{T}BU$, where $B$ is a fixed positive definite matrix in $\mathbb{R}^{n\times n}$ and $U$ varies over the orthogonal group $O(n)$, is obtained by rigidly rotating and reflecting the eigenvectors of $B$. The matrix $B$ is invariant under such a transformation iff its eigenspaces are preserved. Even if there are $n$ distinct eigenvalues (so that all eigenspaces are $1$-dimensional), there are $2^n$ discrete choices of $U$ that preserve $B$: namely, reflections of any subset of the eigenvectors. Note that these form a discrete subgroup of $O(n)$ under matrix multiplication, which can be identified with $O(1)^n$. When eigenvalues are degenerate, additional orthogonal transformations of the higher-dimensional eigenspaces also preserve $B$. In general, if the eigenspaces of $B$ associated with the eigenvalues $\lambda_1 < \lambda_2 < \dots < \lambda_k$ have dimensions $d_1, d_2, \dots, d_k$, with $d_1 + d_2 + \dots + d_k = n$, then the subgroup of $O(n)$ that preserves $B$ is isomorphic to $O(d_1) \times O(d_2) \times \dots \times O(d_k)$.
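The stabilizer structure above can be checked numerically. Below is a small NumPy sketch (my own illustration; the $3 \times 3$ matrix, seed, and angle are arbitrary choices) for a $B$ with eigenvalues $1, 1, 4$, so the preserving subgroup is $O(2) \times O(1)$:

```python
import numpy as np

# Positive definite B with a repeated eigenvalue: eigenvalues 1, 1, 4.
Q, _ = np.linalg.qr(np.random.default_rng(0).standard_normal((3, 3)))
B = Q @ np.diag([1.0, 1.0, 4.0]) @ Q.T

# Sign flips of the eigenvectors (elements of O(1)^3) preserve B ...
for signs in ([1, 1, -1], [-1, 1, -1]):
    U = Q @ np.diag(signs) @ Q.T   # orthogonal; reflects the chosen eigenvectors
    assert np.allclose(U.T @ B @ U, B)

# ... and so does any rotation inside the 2-dimensional eigenspace of 1:
t = 0.7
R = np.array([[np.cos(t), -np.sin(t), 0.0],
              [np.sin(t),  np.cos(t), 0.0],
              [0.0,        0.0,       1.0]])
U = Q @ R @ Q.T                    # element of O(2) x O(1) embedded in O(3)
assert np.allclose(U.T @ B @ U, B)

# A generic orthogonal matrix does not preserve B:
G, _ = np.linalg.qr(np.random.default_rng(1).standard_normal((3, 3)))
assert not np.allclose(G.T @ B @ G, B)
```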
Yes. Quoting Halmos's *Linear Algebra Problem Book* (Solution 160):
“If $A$ and $B$ are real, $U$ is unitary, and $U^*AU = B$, then there exists a real orthogonal $V$ such that $V^*AV = B$.
A surprisingly important tool in the proof is the observation that the unitary equivalence of $A$ and $B$ via $U$ implies the same result for $A^*$ and $B^*$. Indeed, the adjoint of the assumed equation is $U^*A^*U = B^*$.
Write $U$ in terms of its real and imaginary parts $U = E + i F$. It follows from $AU = UB$ that $AE = EB$ and $AF = FB$, and hence that $A(E+\lambda F) = (E+\lambda F)B$ for every scalar $\lambda$. If $\lambda$ is real and different from a finite number of troublesome scalars (the ones for which $\det(E+\lambda F) = 0$), the real matrix $S = E + \lambda F$ is invertible, and, of course, has the property that $AS=SB$.
Proceed in the same way from $U^*A^*U = B^*$: deduce that $A^*(E+\lambda F) = (E+\lambda F)B^*$ for all $\lambda$, and, in particular, for the ones for which $E+\lambda F$ is invertible, and infer that $A^*S = SB^*$ (and hence that $S^*A = BS^*$).
Let $S = VP$ be the polar decomposition of $S$ (that theorem works just as well in the real case as in the complex case, so that $V$ and $P$ are real). Since $$BP^2 = BS^*S = S^*AS = S^*SB = P^2B,$$ the positive matrix $P^2$ commutes with $B$, and hence so does its square root $P$. Since $$AVP = AS = SB = VPB = VBP$$ and $P$ is invertible, it follows that $AV = VB$, and the proof is complete.”
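Halmos's argument is constructive, so it can be run numerically. The sketch below is my own illustration (the matrices, seed, and the phase $e^{0.9i}$ are arbitrary choices, not from the book): given a real symmetric $A$ and a real $B$ unitarily equivalent to it via a genuinely complex $U$, it recovers a real orthogonal $V$ with $V^{T}AV = B$, computing the polar decomposition of $S$ via the SVD.

```python
import numpy as np

rng = np.random.default_rng(0)

# Real symmetric A and a real orthogonal Q, so B = Q^T A Q is real.
A = rng.standard_normal((3, 3)); A = A + A.T
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
B = Q.T @ A @ Q

# A genuinely complex unitary realizing the equivalence: U = e^{0.9i} Q.
U = np.exp(0.9j) * Q
assert np.allclose(U.conj().T @ A @ U, B)

# Halmos's construction: write U = E + iF and set S = E + lam*F, with a
# real lam chosen to avoid the finitely many values where S is singular.
E, F = U.real, U.imag
lam = next(l for l in np.linspace(0, 2, 201)
           if abs(np.linalg.det(E + l * F)) > 1e-8)
S = E + lam * F

# Polar decomposition S = V P via the SVD: V = W X^T is the orthogonal factor.
W, s, Xt = np.linalg.svd(S)
V = W @ Xt

assert np.allclose(V @ V.T, np.eye(3))   # V is real orthogonal
assert np.allclose(V.T @ A @ V, B)       # and realizes the same equivalence
```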
Needless to say, that isn't the shortest path to prove the reduction of antisymmetric matrices...
Let $a, b \in \mathbb{C}\setminus\mathbb{R}$ such that $$ a^2+b^2 \in \mathbb{C}\setminus\mathbb{R},\ |a|^2+|b|^2=1, \ a\bar{b} \in \mathbb{R}. $$ Setting $$ A=\left[\begin{array}{cc}a&b\cr -b&a\end{array}\right], $$ we have $$ AA^*=\left[\begin{array}{cc}a&b\cr -b&a\end{array}\right]\cdot\left[\begin{array}{cc}\bar{a}&-\bar{b}\cr \bar{b}&\bar{a}\end{array}\right] =\left[\begin{array}{cc}1&0\cr 0&1\end{array}\right] , $$ but $$ AA^T=\left[\begin{array}{cc}a&b\cr -b&a\end{array}\right]\cdot\left[\begin{array}{cc}a&-b\cr b&a\end{array}\right] =\left[\begin{array}{cc}a^2+b^2&0\cr 0&a^2+b^2\end{array}\right] \ne \left[\begin{array}{cc}1&0\cr 0&1\end{array}\right]. $$
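A concrete witness (my choice; the original leaves $a, b$ abstract) is $a = b = (1+i)/2$: then $|a|^2 + |b|^2 = \tfrac12 + \tfrac12 = 1$, $a\bar{b} = \tfrac12 \in \mathbb{R}$, and $a^2 + b^2 = i \notin \mathbb{R}$, so $AA^T = iI \ne I$. A quick NumPy check:

```python
import numpy as np

a = b = (1 + 1j) / 2
assert abs(abs(a)**2 + abs(b)**2 - 1) < 1e-12   # |a|^2 + |b|^2 = 1
assert abs((a * np.conj(b)).imag) < 1e-12       # a * conj(b) is real
assert abs((a**2 + b**2).imag) > 1e-12          # a^2 + b^2 is not real

A = np.array([[a, b], [-b, a]])
assert np.allclose(A @ A.conj().T, np.eye(2))   # A is unitary ...
assert not np.allclose(A @ A.T, np.eye(2))      # ... but A A^T = diag(i, i) != I
```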