Solution to $\arg\min_{x} \left(\frac{1}{2\lambda}\|x - y\|^2_2 + \lambda_0\|Bx\|_1\right)$: Least Squares with $\ell_1$ Norm of a Linear Term (Proximal Operator)

convex-optimization · optimization · proximal-operators

$$\arg\min_x \left( \frac{1}{2\lambda} \|x - y\|^2_2 + \lambda_0 \|Bx\|_1 \right)$$

where $\lambda_0, B, \lambda, y$ are known and $x \in \mathbb{R}^{n\times 1}$. Is there a closed-form solution?

It looks like an $\ell_1$-regularized optimization problem, but there is an arbitrary matrix $B$; does the soft-thresholding method still work here?

(I'm trying to use the proximal operator to optimize this:)

$$\arg\min_{x} \frac{1}{2}\|Ax - b\|^2_2 + \lambda\|Bx\|_1$$

Best Answer

This would be very long as a comment, so I'm posting it here as an answer.

So, let $S := BB^T$. Using the dual representation of the $\ell_1$-norm, $\|Bx\|_1 = \sup_{\|z\|_\infty \le 1} z^T B x$, and then Sion's minimax theorem (the inner problem is strongly convex in $x$ and the constraint set for $z$ is compact), one computes
$$
\begin{split}
\min_x \frac{1}{2}\|x-a\|_2^2 + \lambda\|Bx\|_1 &= \min_x \sup_{\|z\|_\infty \le 1} \frac{1}{2}\|x-a\|_2^2 + \lambda z^T Bx = \sup_{\|z\|_\infty \le 1} \min_{x} \frac{1}{2}\|x-a\|_2^2 + \lambda z^T Bx\\
&= \sup_{\|z\|_\infty \le 1} \lambda z^T B a - \frac{\lambda^2}{2} z^T S z,
\end{split}
$$
where the last equality uses the inner minimizer $x = a - \lambda B^T z$. This dual problem can be solved effectively via a projected gradient algorithm. Note that the gradient of the dual objective is $\lambda Ba - \lambda^2 Sz$, while projection onto the $\ell_\infty$ unit ball is very easy: it reduces coordinate-wise to 1D minimization of a quadratic function on an interval, i.e., clipping each coordinate to $[-1, 1]$. Once a dual solution $z^\star$ is found, the primal solution is recovered as $x^\star = a - \lambda B^T z^\star$.