Normal form of a skew-symmetric matrix


I need to show that a non-degenerate skew-symmetric real matrix $A$ of even order $2n$ is similar to the block-diagonal matrix
$$\begin{pmatrix}
D_1 & 0 & \dots & 0\\
0 & D_2 & \ddots & \vdots\\
\vdots & \ddots & \ddots & 0\\
0 & \dots & 0 & D_n
\end{pmatrix},
\quad\text{where } D_i=\begin{pmatrix} 0 & -\lambda_i\\
\lambda_i & 0\end{pmatrix}.$$

So far, I have observed that the matrix $iA$ is Hermitian, so it can be diagonalized (by a unitary matrix), and hence $A$ can be diagonalized with purely imaginary eigenvalues. Since $A$ is real, these eigenvalues come in conjugate pairs $\pm i\lambda_i$.
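Indeed, since $A$ is real with $A^T = -A$, the Hermitian property is a one-line check:
$$(iA)^* = \overline{i}\,\overline{A}^T = -iA^T = iA.$$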
I guess that the $\lambda_i$ in the blocks $D_i$ coincide with those in the diagonal matrix, as each $D_i$ can be diagonalized by $S=\begin{pmatrix}i&-i\\1&1 \end{pmatrix}$, namely $D_i=S\begin{pmatrix}i\lambda_i&0\\0&-i\lambda_i \end{pmatrix}S^{-1}$.
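To double-check the order of the diagonal entries: the columns of $S$ are eigenvectors of $D_i$,
$$D_i\begin{pmatrix}i\\1\end{pmatrix}=\begin{pmatrix}-\lambda_i\\ i\lambda_i\end{pmatrix}=i\lambda_i\begin{pmatrix}i\\1\end{pmatrix},\qquad D_i\begin{pmatrix}-i\\1\end{pmatrix}=\begin{pmatrix}-\lambda_i\\ -i\lambda_i\end{pmatrix}=-i\lambda_i\begin{pmatrix}-i\\1\end{pmatrix},$$
so the first column belongs to the eigenvalue $i\lambda_i$ and the second to $-i\lambda_i$.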

It still remains to show that we can find real matrices realizing the similarity transformation, i.e. we need to find a real basis in which the bilinear form takes the block-diagonal form.

I read that this basis can even be chosen to be orthonormal. Why is that?

Edit: I already looked at the similar question Proof of the Wirtinger inequality, where it is stated that one has to "follow the usual linear algebra protocol (taking real and imaginary parts of the complex eigenvectors)". But I have no clue what that means.

Best Answer

Some comments and remarks towards the solution.

First, note that without the non-degeneracy assumption the statement is not correct: in general, a skew-symmetric matrix is only similar to a matrix of the form $$\begin{pmatrix} D_1 & 0 & \dots & 0 & & &\\ 0 & D_2 & \ddots & \vdots & & &\\ \vdots & \ddots & \ddots & 0 & & &\\ 0 & \dots & 0 & D_n & & &\\ & & & & 0 & &\\ & & & & & \ddots &\\ & & & & & & 0 \end{pmatrix},$$ where the trailing zero block corresponds to the kernel of $A$.
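For instance, a zero eigenvalue is unavoidable in odd dimension: for a skew-symmetric $A \in \mathbb{R}^{n\times n}$,
$$\det A = \det A^T = \det(-A) = (-1)^n \det A,$$
so $\det A = 0$ whenever $n$ is odd.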

Second, you can prove that any real eigenvalue of $A$ is zero.
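One way to see this: if $Av = \lambda v$ with $\lambda \in \mathbb{R}$ and $v \in \mathbb{R}^n$ nonzero, then
$$\lambda \lVert v\rVert^2 = v^T A v = \left(v^T A v\right)^T = v^T A^T v = -v^T A v = -\lambda \lVert v\rVert^2,$$
so $\lambda \lVert v\rVert^2 = 0$ and hence $\lambda = 0$.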

Third, consider the case where the dimension of the linear space is $\dim E = 2$. Then the minimal polynomial of $A$ is either $\mu_A(X)=X$ or a real quadratic $\mu_A(X)=X^2+bX+a$ with $a \neq 0$. If you suppose that you are in the second case, then you can prove that the matrix of $A$ in any orthonormal basis has the form $$D=\begin{pmatrix} 0&-\lambda\\ \lambda &0\end{pmatrix}.$$
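A sketch of this step, using that in an orthonormal basis the matrix of $A$ is itself skew-symmetric, so its diagonal entries vanish and its off-diagonal entries are opposite:
$$A = \begin{pmatrix} 0 & -\lambda\\ \lambda & 0\end{pmatrix} \quad\text{with } \lambda = a_{21}, \qquad \mu_A(X) = X^2 + \lambda^2,$$
which matches the second case with $b = 0$ and $a = \lambda^2 \neq 0$ (since $A \neq 0$).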

Fourth, to move to the general case, consider the minimal polynomial of $A$, which is a product of polynomials of degree $2$ (and possibly the polynomial $X$). For each degree-$2$ factor $p$, there is a nonzero vector $v$ with $p(A)v = 0$: otherwise $\mu_A/p$ would already annihilate the whole space, and the minimal polynomial would be of smaller degree. Based on that, you can proceed by induction, as sketched below.
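A sketch of the induction step: if $p(A)v = 0$ with $v \neq 0$, then $W = \operatorname{span}\{v, Av\}$ is a two-dimensional $A$-invariant subspace, and its orthogonal complement is $A$-invariant as well, since for $u \in W^\perp$ and $w \in W$
$$\langle Au, w\rangle = -\langle u, Aw\rangle = 0.$$
Applying the two-dimensional case to $W$ and the induction hypothesis to $W^\perp$ gives the block-diagonal form.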