[Math] Inequality of Frobenius norm for skew matrices

inequality, lie-algebras, matrices, normed-spaces

Let $A$ be a complex skew-symmetric $n \times n$ matrix, that is, $A^T = -A$. Denote by $\|\cdot\|_F$ the Frobenius norm, that is, $\|B\|_F^2 = \text{trace}(B^*B)$. I would like to prove that
$$
\big\|A^*A\big\|_F^2 \leq \frac{1}{2}\big\|A\big\|_F^4.
$$
Even better would be to prove a strict inequality, that is, to replace $\frac{1}{2}$ by a strictly smaller constant (possibly depending on $n$, possibly tending to $\frac{1}{2}$ as $n$ grows).
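
For concreteness, here is a quick numerical sanity check of the claimed inequality (a sketch assuming NumPy; it only samples random matrices, so of course it proves nothing):

```python
# Sanity check of ||A*A||_F^2 <= (1/2) ||A||_F^4 on random
# complex skew-symmetric matrices.  Random sampling only: not a proof.
import numpy as np

rng = np.random.default_rng(0)

def random_skew(n):
    """Random complex matrix with A^T = -A (skew-symmetric, not skew-Hermitian)."""
    M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return M - M.T

for n in range(2, 9):
    for _ in range(500):
        A = random_skew(n)
        lhs = np.linalg.norm(A.conj().T @ A, 'fro') ** 2
        rhs = 0.5 * np.linalg.norm(A, 'fro') ** 4
        assert lhs <= rhs * (1 + 1e-9), (n, lhs, rhs)
print("inequality held on every sampled matrix")
```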

The strange condition of $A$ being complex but skew-symmetric is unfortunately unavoidable, since this $A$ essentially comes from the Lie algebra of $SO(n, \mathbb{C})$. For reasons connected to this other point of view, I deduced from some "too-well-known-to-give-a-reference" facts in the literature that the inequality must hold, but I have never seen a proof of it.

For arbitrary matrices $A$ a weaker inequality holds, with $\frac{1}{2}$ replaced by $1$, by sub-multiplicativity of the Frobenius norm. However, equality is then attained by the all-ones matrix, which is as far from being skew-symmetric as possible. Motivated by this, I tried the computation for the skew matrix with $1$ everywhere above the diagonal (and $-1$ below), and the ratio one obtains in that case is
$$
\frac{\|A^*A\|_F^2}{\|A\|_F^4} = \frac{1}{3} \frac{n^2+n-3}{n^2-n},
$$
which tends to $\frac{1}{3}$ and does not look great for proving a general bound. If $\frac{1}{3}$ were a valid constant, however, I would be happy! But I have no reason to believe it is, other than that I have no better guesses…
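
For reference, a short script along the same lines (again only a sanity check, assuming NumPy) reproduces this ratio and compares it with the closed form above:

```python
# Ratio ||A*A||_F^2 / ||A||_F^4 for the skew matrix with 1 above the
# diagonal and -1 below, compared with (1/3)(n^2+n-3)/(n^2-n).
import numpy as np

for n in range(2, 10):
    A = np.triu(np.ones((n, n)), k=1)
    A = A - A.T                        # +1 above the diagonal, -1 below
    ratio = np.linalg.norm(A.T @ A, 'fro') ** 2 / np.linalg.norm(A, 'fro') ** 4
    closed_form = (n**2 + n - 3) / (3 * (n**2 - n))
    print(n, ratio, closed_form)       # the last two columns agree
```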

Thanks in advance for any help!

EDIT: I still do not know how to prove the inequality in general, but at least I can now show that the constant $\frac{1}{2}$ is optimal, since if one takes any (nonzero…) matrix of the form
$$
A = \begin{pmatrix}
0 & \dots & 0 & v_1\\
\vdots & \ddots & \vdots & \vdots\\
0 & \dots & 0 & v_{n-1}\\
-v_1 & \dots & -v_{n-1} & 0
\end{pmatrix}
$$
then $\text{trace}(A^*A) = 2\|v\|^2$ and $\text{trace}\big((A^*A)^2\big) = 2\|v\|^4$, so equality is attained. Since this already happens for a real $v$, restricting to real-valued matrices does not seem to make you lose anything.
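
A short numerical check of this equality case (a sketch assuming NumPy; the size $n$ and the complex vector $v$ below are arbitrary choices):

```python
# Equality case: only the last row and column of A are nonzero.
import numpy as np

rng = np.random.default_rng(1)
n = 6
v = rng.standard_normal(n - 1) + 1j * rng.standard_normal(n - 1)

A = np.zeros((n, n), dtype=complex)
A[:-1, -1] = v                                   # last column:  v
A[-1, :-1] = -v                                  # last row:    -v

AsA = A.conj().T @ A
nv2 = np.linalg.norm(v) ** 2
print(np.isclose(np.trace(AsA), 2 * nv2))            # trace(A*A)     = 2 ||v||^2
print(np.isclose(np.trace(AsA @ AsA), 2 * nv2**2))   # trace((A*A)^2) = 2 ||v||^4
print(np.isclose(np.linalg.norm(AsA, 'fro') ** 2,
                 0.5 * np.linalg.norm(A, 'fro') ** 4))  # equality is attained
```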

Best Answer

The displayed inequality is a direct consequence of the Youla decomposition of complex skew-symmetric matrices. More specifically, we can always decompose a complex skew-symmetric matrix $A$ as $U(D_1\oplus\cdots\oplus D_k)U^T$, where $U$ is unitary and every $D_j$ is either a $2\times2$ complex skew-symmetric block or a zero block.

As
$$
A^\ast A=\bar{U}\,(D_1\oplus\cdots\oplus D_k)^\ast(D_1\oplus\cdots\oplus D_k)\,U^T
$$
(a unitary conjugation, since $U^T=\bar{U}^\ast$) and
$$
\begin{pmatrix}
0 & \bar{\lambda}\\
-\bar{\lambda} & 0
\end{pmatrix}
\begin{pmatrix}
0 & -\lambda\\
\lambda & 0
\end{pmatrix}
=
\begin{pmatrix}
|\lambda|^2 & 0\\
0 & |\lambda|^2
\end{pmatrix},
$$
it follows that all nonzero (hence positive) eigenvalues of $A^\ast A$ occur in pairs. If we label them $s_1,s_1,s_2,s_2,\ldots,s_m,s_m$ (so that $A^\ast A$ has rank $2m$), then, since $\|A^\ast A\|_F^2=\text{trace}\big((A^\ast A)^2\big)$ and $\|A\|_F^4=\big(\text{trace}(A^\ast A)\big)^2$, the displayed inequality is equivalent to
$$
2\sum_{j=1}^m s_j^2\le\frac{1}{2}\left(2\sum_{j=1}^m s_j\right)^2,
$$
i.e. to $\sum_j s_j^2\le\big(\sum_j s_j\big)^2$, which is obviously true for nonnegative $s_j$. Equality holds iff $A^\ast A$ has at most one pair of nonzero eigenvalues, that is, iff $A$ has rank at most two.
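
As a quick numerical illustration (not needed for the proof, and assuming NumPy is available), one can check both the pairing of the singular values of a random complex skew-symmetric matrix and the reduced inequality:

```python
# For complex skew-symmetric A the singular values come in equal pairs,
# so the eigenvalues s_j, s_j of A*A reduce the inequality to
# sum(s_j^2) <= (sum(s_j))^2.  Sanity check only.
import numpy as np

rng = np.random.default_rng(2)
n = 7                                       # odd size forces a zero block
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = M - M.T                                 # complex skew-symmetric

sv = np.linalg.svd(A, compute_uv=False)     # sqrt of the eigenvalues of A*A
print(np.round(sv, 6))                      # equal pairs (plus one zero here)

s = sv[::2] ** 2                            # one eigenvalue s_j per pair
print(np.sum(s**2) <= np.sum(s)**2)         # the reduced inequality
```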