[Math] Convex Conjugate of the $\ell_1$ and $\ell_2$ Norms

convex-optimization, convex-analysis

Given a function $f$, we can define a function $f^{*}$, called the convex conjugate (also known as the Fenchel conjugate) of $f$ as follows:

$$ f^*(\vec{z})= \sup_{\vec x \in \mathbb R^d}\bigl(\vec{z}\cdot \vec{x}-f(\vec{x})\bigr)$$
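To make the definition concrete, here is a crude numerical sketch (assuming NumPy; the function and variable names are mine, and sampling over a growing box is only a cheap stand-in for the true supremum):

```python
import numpy as np

def conjugate_estimate(f, z, radius, n_samples=200_000, seed=0):
    """Crudely estimate f*(z) = sup_x (z . x - f(x)) by sampling x
    uniformly from the cube [-radius, radius]^d."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(-radius, radius, size=(n_samples, z.size))
    return np.max(X @ z - f(X))

# f is the l2 norm, applied row-wise to the samples.
f2 = lambda X: np.linalg.norm(X, axis=1)

z_in = np.array([0.6, 0.6])    # ||z||_2 ~ 0.85 <= 1
z_out = np.array([1.5, 0.0])   # ||z||_2 = 1.5  > 1

for r in (1, 10, 100):
    print(r, conjugate_estimate(f2, z_in, r), conjugate_estimate(f2, z_out, r))
# For z_in the estimate stays near 0 no matter how large the box is;
# for z_out it grows like 0.5*r, i.e. the supremum is +infinity.
```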

I am trying to find the convex conjugates of the $\ell_1$ and $\ell_2$ norms of $\vec{x}$.

I can find the convex conjugates using the dual norm formula. For example, the $\ell_2$ norm is its own dual, so according to this tutorial, $f^*(\vec{z})=0$ if $\|\vec{z}\|_{2} \le 1$, and $\infty$ otherwise.

Is there a direct way to solve this problem (not involving the dual norm)?

Best Answer

The function inside the supremum is positively homogeneous with respect to $x$ (because $f$, being a norm, satisfies $f(tx)=tf(x)$ for $t\ge 0$): that is, for any $t\ge 0$, $$z\cdot (tx)-f(tx) = t\bigl(z\cdot x -f(x)\bigr).$$ If there is an $x$ for which $z\cdot x -f(x)>0$, the supremum is infinite (multiply that $x$ by an arbitrarily large $t$). Otherwise, it is zero (take $t=0$).

If there is some $x$ for which $z\cdot x -f(x)>0$, then there is such an $x$ of unit norm (rescale $x$ by $1/f(x)$, using homogeneity again). So the question boils down to maximizing $z\cdot x$ over unit-norm vectors $x$: the conjugate is $0$ when this maximum is at most $1$, and $+\infty$ otherwise. (Of course, this maximum is exactly the dual norm.)
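In symbols: writing $\|z\|_* := \sup_{f(x)=1} z\cdot x$ for this maximum (the dual norm), the argument above gives

$$ f^*(z)=\begin{cases} 0, & \|z\|_*\le 1,\\ +\infty, & \|z\|_*>1, \end{cases} $$

that is, the conjugate of a norm is the indicator function of the dual-norm unit ball.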

For example, look at the $\ell_1$ norm. How do we maximize $\sum_i z_i x_i$ subject to $\sum_i |x_i| =1$? Pick an index $j$ for which $|z_j|$ is maximal, then set $x_j=\operatorname{sign} z_j$ and $x_i=0$ for $i\ne j$. This gives $\sum_i z_i x_i = |z_j| = \max_i |z_i|$. Then argue that one can't get more than that: $\sum_i z_i x_i \le \sum_i |z_i|\,|x_i| \le \max_i |z_i| \sum_i |x_i| = \max_i |z_i|$. If you know Hölder's inequality, this is exactly its case for the dual pair $(\ell_1,\ell_\infty)$, as the check below illustrates.
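A quick numerical check of this construction (a sketch assuming NumPy; the helper name is mine):

```python
import numpy as np

def argmax_on_l1_sphere(z):
    """Maximizer of z . x subject to ||x||_1 = 1: put all the weight on
    one coordinate where |z_i| is largest, with the matching sign."""
    x = np.zeros_like(z, dtype=float)
    j = np.argmax(np.abs(z))
    x[j] = np.sign(z[j])
    return x

z = np.array([0.3, -2.0, 1.1])
x = argmax_on_l1_sphere(z)
print(x, z @ x)            # [ 0. -1.  0.]  2.0, which is max_i |z_i|

# Sanity check: random points on the l1 unit sphere never beat it,
# in line with Holder's inequality z . x <= ||z||_inf ||x||_1.
rng = np.random.default_rng(0)
X = rng.standard_normal((100_000, z.size))
X /= np.abs(X).sum(axis=1, keepdims=True)
assert (X @ z).max() <= z @ x + 1e-12
```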
