[Math] Characterizing orthogonally-invariant norms on the space of matrices

linear-algebra · matrices · normed-spaces · symmetry

Denote by $M_n$ the space of $n \times n$ real matrices. We say a norm on $M_n$ is orthogonally invariant if: $$\|OX \|=\| XO\|=\|X \| \, \, \forall O \in O_n,X \in M_n$$

I am trying to characterize all such norms. Let $\|\cdot \|$ be one.
Using the singular value decomposition, we get:

$\|X\|=\|U\Sigma V^T\|=\|\Sigma \|$ (where $U,V \in O_n$ and $\Sigma$ is diagonal with non-negative entries, i.e. $U \Sigma V^T$ is an SVD of $X$).

So, if we define $f:\mathbb{R}^n \to \mathbb{R}^{\ge 0}$ by $f(\sigma_1,\dots,\sigma_n)=\|\operatorname{diag}(\sigma_1,\dots,\sigma_n) \|$, we get that $\|\cdot\|$ is uniquely determined by $f$. Note that $f$ must be a norm which is invariant under signed permutations, i.e. $f(\pm x_{\tau(1)},\dots,\pm x_{\tau(n)})=f(x_1,\dots,x_n)$ for every $\tau \in S_n$. (The reason is that $\|\cdot\|$ is invariant under multiplication by orthogonal matrices, in particular by signed permutation matrices.)
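As a concrete sanity check (my own sketch, not part of the question): take the Frobenius norm, which is orthogonally invariant, and confirm numerically that the induced $f$ is just the Euclidean norm of the singular values, and that it is invariant under signed permutations.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 4
X = rng.standard_normal((n, n))
sigma = np.linalg.svd(X, compute_uv=False)  # singular values of X

# ||X||_F depends only on the singular values, via f = Euclidean norm:
frob_X = np.linalg.norm(X, "fro")
f_sigma = np.linalg.norm(sigma, 2)
assert np.isclose(frob_X, f_sigma)

# f is invariant under signed permutations of its argument:
perm = rng.permutation(n)
signs = rng.choice([-1.0, 1.0], size=n)
assert np.isclose(f_sigma, np.linalg.norm(signs * sigma[perm], 2))
```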

Question: Is it true that any norm on $\mathbb{R}^n$ which is invariant under signed permutations induces an orthogonally invariant norm on $M_n$ (in the obvious way described above)? If not, can we characterize which norms are possible?

Added Clarification: Let $f$ be a norm on $\mathbb{R}^n$. The candidate for a norm on $M_n$ induced by $f$ is $\| X\|=\|U\Sigma V^T \|=f(\sigma_1,\dots,\sigma_n)$, where $\Sigma=\operatorname{diag}(\sigma_1,\dots,\sigma_n)$.

The nontrivial part seems to be verifying the triangle inequality, since the SVD does not behave in any structured way (known to me) w.r.t. sums.
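While not a proof, the triangle inequality can at least be spot-checked numerically. Below is my own sketch using one symmetric gauge function not mentioned in the question, $f(x) = $ sum of the two largest $|x_i|$ (the Ky Fan 2 gauge), tested on random matrices:

```python
import numpy as np

rng = np.random.default_rng(1)

def induced_norm(X, k=2):
    # ||X|| = f(singular values), with f = sum of the k largest entries
    s = np.linalg.svd(X, compute_uv=False)  # sorted in descending order
    return s[:k].sum()

n = 5
for _ in range(1000):
    A = rng.standard_normal((n, n))
    B = rng.standard_normal((n, n))
    # triangle inequality, up to floating-point slack
    assert induced_norm(A + B) <= induced_norm(A) + induced_norm(B) + 1e-10
```

Of course, passing random trials only rules out easy counterexamples; the actual argument is the duality proof referenced in the answer below.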

Remarks and partial results:

1) Choosing $f$ to be the maximum norm induces the Euclidean operator norm.

2) Choosing $f$ to be the standard $p$-norm, one gets the Schatten $p$-norm.

3) Here is a sufficient condition (which is not necessary) for $f$ to induce a norm: for all vectors with non-negative entries,
$$\sum_{i=1}^n z_i \le \sum_{i=1}^n x_i + \sum_{i=1}^n y_i \Rightarrow f(z_1,\dots,z_n) \le f(x_1,\dots,x_n) + f(y_1,\dots,y_n)$$

Any $f$ which satisfies the above condition induces a norm. This follows from the Lidskii inequality, which says:

$$\sum_{i=1}^n \sigma_i(A+B) \le \sum_{i=1}^n \sigma_i(A) + \sum_{i=1}^n \sigma_i(B) $$

Note this condition is not necessary: the maximum norm does not satisfy it (take $x=(1,1,0,\dots,0)$, $y=0$, $z=(2,0,\dots,0)$; then $\sum z_i \le \sum x_i + \sum y_i$ but $\max_i z_i = 2 > 1 = \max_i x_i + \max_i y_i$), yet it induces a norm.
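Remark 2 can also be verified numerically. The sketch below (my own, with a random orthogonal matrix built via QR) checks that the Schatten $p$-norm, computed as the $p$-norm of the singular values, is unchanged under orthogonal multiplication on either side:

```python
import numpy as np

rng = np.random.default_rng(2)

def schatten(X, p):
    # Schatten p-norm = ell^p norm of the singular values
    s = np.linalg.svd(X, compute_uv=False)
    return (s ** p).sum() ** (1.0 / p)

n, p = 4, 3.0
X = rng.standard_normal((n, n))
O, _ = np.linalg.qr(rng.standard_normal((n, n)))  # random orthogonal matrix

assert np.isclose(schatten(O @ X, p), schatten(X, p))
assert np.isclose(schatten(X @ O, p), schatten(X, p))
```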

Best Answer

I think the answer is yes. A sign- and permutation-invariant norm defined on $\mathbb C^n$ is called a symmetric gauge function. It is known that every unitarily invariant norm on $M_n(\mathbb C)$ is induced by a symmetric gauge function. See, e.g., Theorem 7.4.24, pp. 438–440, of Horn and Johnson (1985), Matrix Analysis, 1st ed., Cambridge University Press.

To deal with the triangle inequality, Horn and Johnson consider the dual norm of $f$. Although the theorem is stated for a norm $f$ on $\mathbb C^n$, the dual norm of a norm on $\mathbb R^n$ is again a norm on $\mathbb R^n$, and every real matrix admits a real SVD, so their proof apparently adapts to the real case almost without modification. Still, you should double-check that the adaptation really works.
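To illustrate the duality at work in the real case (my own sketch, not from the book's proof): for Schatten norms the dual of $p$ is $q$ with $1/p + 1/q = 1$, and von Neumann's trace inequality combined with Hölder gives $|\operatorname{tr}(A^T B)| \le \|A\|_p \|B\|_q$. A numerical spot-check over random real matrices:

```python
import numpy as np

rng = np.random.default_rng(3)

def schatten(X, p):
    # Schatten p-norm = ell^p norm of the singular values
    s = np.linalg.svd(X, compute_uv=False)
    return (s ** p).sum() ** (1.0 / p)

n, p = 4, 3.0
q = p / (p - 1.0)  # Hoelder conjugate of p

for _ in range(500):
    A = rng.standard_normal((n, n))
    B = rng.standard_normal((n, n))
    # trace pairing is bounded by the product of dual Schatten norms
    assert abs(np.trace(A.T @ B)) <= schatten(A, p) * schatten(B, q) + 1e-10
```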