Computing the convex conjugate of this function

convex-optimization, convex-analysis, real-analysis

Let $\lambda \in \mathbb R^+$, $A \in \mathbb R^{m\times n}$, $x \in \mathbb R^n$, and $b, y \in \mathbb R^m$, and
$$
G(r) = \frac{1}{2} \| r\|^2_2 + \lambda \| x\|_1 + \langle y, Ax - b - r\rangle, \quad r \in \mathbb R^m.
$$

Compute $G^*$, the convex conjugate function of $G$, given by

$$
G^*(z) = \sup_{u} (\langle z, u\rangle - G(u))
$$

This is my attempt:

\begin{align*}
G^*(z) &= \sup_{u} (\langle z, u\rangle - G(u))\\
&= \sup_{u}(\langle z, u\rangle - \frac{1}{2}\| u\|^2_2 - \langle y, Ax - b - u\rangle - \lambda \| x\|_1)\\
&= \sup_{u}(\langle z, u\rangle - \frac{1}{2}\langle u, u\rangle - \langle y, Ax - b\rangle + \langle y, u\rangle - \lambda \| x\|_1)\\
&= \sup_{u}(\langle z, u\rangle - \frac{1}{2}\langle u, u\rangle + \langle y, u\rangle) - \langle y, Ax - b\rangle - \lambda \| x\|_1\\
&= \sup_{u}(\langle z + y , u\rangle - \frac{1}{2}\langle u, u\rangle) - \langle y, Ax - b\rangle - \lambda \| x\|_1\\
&= \frac{1}{2}\| z + y\|^2_2 - \langle y, Ax - b\rangle - \lambda \| x\|_1
\end{align*}
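
As a numerical sanity check on the last line, here is a quick sketch with arbitrary random data (the dimensions, seed, and use of `scipy.optimize.minimize` are my own choices, not part of the problem): it maximizes $\langle z, u\rangle - G(u)$ by minimizing its negation and compares the result with the claimed closed form.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
m, n = 4, 3          # arbitrary small dimensions
lam = 0.7            # arbitrary lambda > 0
A = rng.standard_normal((m, n))
x = rng.standard_normal(n)
b = rng.standard_normal(m)
y = rng.standard_normal(m)
z = rng.standard_normal(m)

def G(r):
    # G(r) = (1/2)||r||_2^2 + lam*||x||_1 + <y, Ax - b - r>
    return 0.5 * r @ r + lam * np.abs(x).sum() + y @ (A @ x - b - r)

# Numerical conjugate: sup_u <z, u> - G(u), computed by minimizing the negation
res = minimize(lambda u: -(z @ u - G(u)), x0=np.zeros(m))
numeric = -res.fun

# Claimed closed form: (1/2)||z + y||_2^2 - <y, Ax - b> - lam*||x||_1
closed = 0.5 * np.dot(z + y, z + y) - y @ (A @ x - b) - lam * np.abs(x).sum()

print(numeric, closed)
```

The two printed values should agree up to the optimizer's tolerance, since the objective is a smooth concave quadratic in $u$.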

Is this correct?

Best Answer

Yup, it is correct.

The last two terms of the solution are immediate, since neither $\langle y, Ax - b\rangle$ nor $\lambda \|x\|_1$ depends on $u$, so they pull straight out of the supremum.

To evaluate $\sup_u \langle z+y, u \rangle - \frac12 \langle u, u \rangle$, note that the objective is concave in $u$, so we can find the optimal $u$ by differentiating and setting the gradient to $0$:

$$z + y - u = 0,$$

hence the supremum is attained at $u = z + y$.

Substituting $u = z + y$ gives

$$\langle z+y, z+y\rangle-\frac12\langle z+y, z+y\rangle= \frac12\langle z+y, z+y\rangle=\frac12\|z+y\|^2_2.$$
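
To illustrate this maximization numerically (a sketch of my own; `c` stands in for $z + y$, and the data are random), the value $\frac12\|c\|_2^2$ is attained at $u = c$ and never exceeded at random test points:

```python
import numpy as np

rng = np.random.default_rng(1)
c = rng.standard_normal(5)   # plays the role of z + y

def f(u):
    # objective <c, u> - (1/2)<u, u>
    return c @ u - 0.5 * u @ u

best = 0.5 * c @ c           # claimed supremum, attained at u = c
assert np.isclose(f(c), best)

# random points never exceed the claimed maximum
assert all(f(rng.standard_normal(5)) <= best + 1e-12 for _ in range(1000))
print("supremum check passed")
```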