[Math] Lagrangian dual for the sum of norms

convex-optimization, duality-theorems, optimization

I would like some help deriving the Lagrangian dual function of a sum-of-norms minimization problem:

$\min_x \sum_i \|A_{i}x-b_{i}\|$, where the $A_{i}$ are matrices and $b_{i},x$ are vectors.

I understand I can rewrite the problem as a minimization over $(x,y)$ and define the new objective as:

$\sum_i \|y_{i}\|$ subject to the constraints $y_{i}=A_{i}x-b_{i}$ (where the $y_{i}$ are vectors, of course). So these are in fact linear constraints, right?

But what now? Can I use the conjugate of the $\ell_2$ norm to derive the Lagrangian dual function, or can I derive it directly? (I would prefer to see a direct derivation that computes the minimum of the Lagrangian over $x$ with a fixed multiplier.)

Thank you!

Best Answer

After introducing the linear constraints, you just form the Lagrangian and minimize it over the primal variables: $$\mathcal{L}(x,y_i,\lambda_i)=\sum_i \left(\|y_i\|+\lambda_i^T(y_i - A_i x+b_i)\right)$$
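To make the intermediate step explicit (this is one way to organize the computation, not the only one): the Lagrangian separates into a part that is linear in $x$ and a part for each $y_i$,

$$\mathcal{L}(x,y_i,\lambda_i) = \sum_i\left(\|y_i\| + \lambda_i^T y_i\right) - \Big(\sum_i A_i^T \lambda_i\Big)^T x + \sum_i \lambda_i^T b_i.$$

The infimum over $x$ of the linear term is $-\infty$ unless $\sum_i A_i^T\lambda_i = 0$, in which case that term vanishes. For each $i$,

$$\inf_{y_i}\left(\|y_i\| + \lambda_i^T y_i\right) = -\sup_{y_i}\left((-\lambda_i)^T y_i - \|y_i\|\right),$$

which is minus the conjugate of the norm evaluated at $-\lambda_i$: it equals $0$ if $\|\lambda_i\| \le 1$ and $-\infty$ otherwise (for a general norm the condition uses the dual norm; the Euclidean norm is its own dual). Combining the two cases gives the dual function below.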

$$\inf_{x, y_i} \mathcal{L}(x,y_i,\lambda_i) =\begin{cases} -\infty & \text{ if } \sum_i A_i^T\lambda_i \ne 0 \ \text{ or } \ \|\lambda_i\| > 1 \text{ for some } i \\ \sum_i b_i^T \lambda_i & \text{ otherwise } \end{cases} $$ So the dual problem is: $$ \begin{gathered}\max_{\lambda_i} \sum_i b_i^T \lambda_i\\ \text{s.t.}: \sum_i A_i^T\lambda_i = 0, \\ \quad\|\lambda_i\| \le 1, \ \forall \ i. \end{gathered} $$
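As a concrete sanity check, here is a small pure-Python sketch on a made-up scalar instance (the specific $A_i$, $b_i$ values are assumptions for illustration, not from the question): it solves both the primal and the dual above by brute-force grid search, and the optimal values agree, as strong duality predicts for this convex problem.

```python
# Assumed toy data: A_1 = [1], b_1 = [1]; A_2 = [2], b_2 = [-1], so the primal is
#   minimize_x |x - 1| + |2x + 1|
# and the dual derived above is
#   maximize 1*l1 + (-1)*l2  s.t.  1*l1 + 2*l2 = 0, |l1| <= 1, |l2| <= 1.

# Primal: brute-force grid search over x in [-2, 2].
primal = min(abs(x - 1) + abs(2 * x + 1)
             for x in (i / 1000 - 2 for i in range(4001)))

# Dual: eliminate l1 = -2*l2 via the equality constraint; the bound
# |l1| <= 1 then restricts l2 to [-1/2, 1/2].
dual = max((-2 * l2) * 1 + l2 * (-1)
           for l2 in (i / 1000 - 0.5 for i in range(1001)))

print(primal, dual)  # both 1.5: the primal optimum (at x = -1/2) equals the dual optimum
```

The equality constraint is handled here by substitution rather than by a solver, which keeps the check dependency-free; for non-scalar $A_i$ one would reach for a convex solver instead.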
