[Math] Least Squares Minimization with Nuclear Norm Regularization (Proximal Operator)

linear-algebra, normed-spaces, nuclear-norm, optimization, proximal-operators

Let $L$ and $R$ be $n \times n$ matrices. I am trying to solve the following minimization problem

$$
L^{k+1} := \arg \min_{L} \lambda \|L\|_{\ast} + \frac{1}{2\mu}\|L-R^{k}\|_{F}^{2}
$$
$$
\mbox{s.t.}\ L \succeq 0
$$

where $\|\cdot\|_{\ast}$ and $\|\cdot\|_{F}$ denote the nuclear and Frobenius norms, respectively. The superscript $k$ indexes the $k$th iterate of an outer algorithm that solves this subproblem repeatedly.

I don't know how to solve this problem. Could someone show me how?

Best Answer

To flesh out the comment by p.s.

Assume that we know the SVD of $L$:
$$L=USV^T$$

Write down the objective function, then its differential and gradient. Here $A:B = \operatorname{tr}(A^TB)$ denotes the Frobenius inner product, and $UV^T$ is the gradient of the nuclear norm at a full-rank $L$:
$$\eqalign{
f &= \lambda\|L\|_* + \frac{1}{2\mu}\|L-R\|_F^2 \cr
df &= \Big(\lambda\,UV^T + \frac{1}{\mu}(L-R)\Big):dL \cr
\frac{\partial f}{\partial L} &= \lambda\,UV^T + \frac{1}{\mu}(USV^T-R) \cr
}$$
Set the gradient to zero and solve for $R$:
$$\eqalign{
R &= \mu\lambda\,UV^T + USV^T \cr
&= U(\mu\lambda\,I + S)V^T \cr
&= U\Sigma V^T \cr
}$$
So it appears that by shifting the singular values, we arrive at the SVD of $R$.
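As a quick sanity check of the stationarity condition, here is a small numerical sketch (assuming NumPy; the dimensions and values are illustrative). It builds $L$ with strictly positive singular values, forms $R = U(\mu\lambda\,I + S)V^T$, and confirms that the gradient above vanishes:

```python
import numpy as np

rng = np.random.default_rng(1)
lam, mu = 0.3, 1.0

# Orthogonal factors and positive singular values for L = U S V^T
U, _ = np.linalg.qr(rng.standard_normal((4, 4)))
V, _ = np.linalg.qr(rng.standard_normal((4, 4)))
S = np.diag([3.0, 2.0, 1.5, 1.0])

L = U @ S @ V.T
R = U @ (mu * lam * np.eye(4) + S) @ V.T     # R = U (mu*lam*I + S) V^T

grad = lam * U @ V.T + (L - R) / mu          # stationarity condition
print(np.linalg.norm(grad))                  # ~ 0 up to round-off
```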

Working backwards, start with the SVD of $R$ and recover $L$:
$$\eqalign{
\mu\lambda\,I + S &= \Sigma \implies S = \Sigma - \mu\lambda\,I \cr
L &= USV^T = U(\Sigma - \mu\lambda\,I)V^T \cr
}$$
One last detail is to ensure that the singular values remain non-negative, which gives the singular value thresholding solution
$$\eqalign{
L &= U\,(\Sigma - \mu\lambda\,I)_+\,V^T \cr
}$$
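In code, the resulting proximal step is just a soft-thresholding of the singular values of $R^k$. A minimal sketch assuming NumPy (the function name `nuclear_prox` and the test values are illustrative; the constraint $L \succeq 0$ is not enforced beyond keeping the singular values non-negative, matching the derivation above):

```python
import numpy as np

def nuclear_prox(R, lam, mu):
    """argmin_L  lam*||L||_* + (1/(2*mu))*||L - R||_F^2,
    computed by thresholding the singular values of R by mu*lam."""
    U, sigma, Vt = np.linalg.svd(R, full_matrices=False)
    shrunk = np.maximum(sigma - mu * lam, 0.0)   # (Sigma - mu*lam*I)_+
    return U @ np.diag(shrunk) @ Vt

# quick check on a random matrix
rng = np.random.default_rng(0)
R = rng.standard_normal((5, 5))
L = nuclear_prox(R, lam=0.5, mu=1.0)
print(np.linalg.svd(L, compute_uv=False))        # singular values of R, shrunk by mu*lam
```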