Convex Optimization – Deriving the Sub-Differential of the Nuclear Norm

convex-optimization, matrices, matrix-norms, non-smooth-analysis, nuclear-norm

Let $$f(K) = \| K \|_*$$ be the nuclear norm (sum of the singular values) of $K=U\Sigma V^T$. How can one compute the subdifferential $\partial f$?

This may be a basic question; I'm trying to work my way through a paper in which minimizing $f$ over a convex set of matrices plays a central role. For what it's worth, I have found papers that state the end result, but not the derivation.

EDIT: This paper by Tao and Candes states an expression, but refers the proof to "Characterization of the subdifferential of some matrix norms", which does not prove it as far as I can tell. I also found a class homework assignment posted online that said this was easy to "grind out" with matrix derivatives, but that there was another way via projections. Any guidance would be greatly appreciated.

Best Answer

Contrary to my first impression, this question is answered completely and thoroughly by G.A. Watson in Characterization of the Subdifferential of Some Matrix Norms. For the final derivation, see pg. 40.

The conclusion is that

$$\partial \|K\|_* = \left\{ UV^T + W \;:\; \|W\| \le 1,\ U^T W = 0,\ W V = 0 \right\},$$

where $\|\cdot\|$ is the spectral norm, which is dual to the nuclear norm, and $U$, $V$ are taken from the thin SVD $K = U\Sigma V^T$ restricted to the nonzero singular values. The conditions $U^T W = 0$ and $W V = 0$ say that the column space of $W$ is orthogonal to that of $U$ and its row space is orthogonal to that of $V$.
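As a sanity check, one can verify numerically that $G = UV^T$ (the $W = 0$ element of the set above) satisfies the defining subgradient inequality $\|Y\|_* \ge \|K\|_* + \langle G,\, Y - K\rangle$ for arbitrary $Y$. The sketch below uses NumPy; the function names and the rank tolerance are illustrative choices, not part of Watson's paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def nuclear_norm(A):
    """Sum of singular values."""
    return np.linalg.svd(A, compute_uv=False).sum()

# Thin SVD of a random test matrix K, restricted to nonzero singular values.
K = rng.standard_normal((5, 3))
U, s, Vt = np.linalg.svd(K, full_matrices=False)
r = int(np.sum(s > 1e-10))        # numerical rank (tolerance is a choice)
G = U[:, :r] @ Vt[:r, :]          # the W = 0 subgradient, G = U V^T

# G pairs with K to give exactly the nuclear norm: <UV^T, K> = trace(Sigma).
assert abs(np.sum(G * K) - nuclear_norm(K)) < 1e-8

# Subgradient inequality: ||Y||_* >= ||K||_* + <G, Y - K> for all Y.
for _ in range(100):
    Y = rng.standard_normal(K.shape)
    lhs = nuclear_norm(Y)
    rhs = nuclear_norm(K) + np.sum(G * (Y - K))
    assert lhs >= rhs - 1e-9
```

Adding any admissible $W$ (spectral norm at most 1, column space orthogonal to $U$, row space orthogonal to $V$) to $G$ preserves the inequality, which is what the characterization asserts.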
