Convex conjugate of a function of sum of norms

convex-optimization, convex-analysis

I am trying to find the conjugate of the function $f(x) = \|x\|_2 + \frac{1}{2} \|x\|_2^2$, i.e.,
$f^*(v) = \sup_x (v^Tx - f(x))$, where $x \in \mathbb R^n$.

Although $f(x)$ is convex, I am stuck because the function is not differentiable at the origin. I know how to find the convex conjugate of a norm using the dual norm, but I cannot see how to apply that here. Any help will be appreciated.

Best Answer

Since the function is radially symmetric, so is its conjugate, so you may as well consider the one-dimensional problem with $f(x)=|x|+x^2/2$. Recall that the gradient of the conjugate function is the inverse of the gradient of $f$.
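One way to make this inversion fact precise (for a closed proper convex $f$, here in one dimension) is the equality case of the Fenchel–Young inequality:
$$ v \in \partial f(x) \iff f(x) + f^*(v) = vx \iff x \in \partial f^*(v), $$
so $\partial f^*$ is the set-valued inverse of $\partial f$, and wherever $f^*$ is differentiable, $(f^*)'(v)$ is the unique $x$ with $v \in \partial f(x)$.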

Again by symmetry, it suffices to consider $v>0$ only. Since $f'(x)=1+x$ for $x>0$, its inverse $v\mapsto v-1$ is defined only for $v\ge 1$. Imagine that the jump of $f'$ at the origin (the subdifferential there is $[-1,1]$) "stretches" the origin into $[-1,1]$, which the gradient of the conjugate $f^*$ collapses back into a point. So $(f^*)'(v)=(v-1)^+$ for $v\ge 0$, which, together with $f^*(0)=-\inf f=0$, integrates to
$$f^*(v) = \frac12\bigl((\|v\|_2-1)^+\bigr)^2 .$$ As usual, $a^+=\max(a,0)$.
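One can also verify this directly from the definition: by Cauchy–Schwarz the supremum in $\sup_x \bigl(v^Tx - \|x\|_2 - \tfrac12\|x\|_2^2\bigr)$ is attained at $x$ parallel to $v$, so writing $t=\|x\|_2$,
$$ f^*(v)=\sup_{t\ge 0}\Bigl(\|v\|_2\,t - t - \tfrac12 t^2\Bigr)=\tfrac12\bigl((\|v\|_2-1)^+\bigr)^2, $$
with the supremum attained at $t=(\|v\|_2-1)^+$, in agreement with the formula above.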