[Math] Convex Conjugate of Log Sum Exp Function

convex optimization


In deriving the convex conjugate of the log-sum-exp function, $f(x) = \log\left(\sum_{i=1}^n e^{x_i}\right)$, Boyd argues that the domain of the convex conjugate,
$$f^*(y) = \sup_{x \in D} \{y^T x - f(x)\},$$
requires $y \succeq 0$, since the conjugate would be unbounded otherwise.
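
For reference, the full statement in Boyd and Vandenberghe is that the conjugate of log-sum-exp is the negative entropy restricted to the probability simplex,
$$f^*(y) = \begin{cases} \sum_{i=1}^n y_i \log y_i & y \succeq 0,\ \mathbf{1}^T y = 1 \\ \infty & \text{otherwise}, \end{cases}$$
with the convention $0 \log 0 = 0$, so the nonnegativity constraint is one part of this domain.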

Following along and setting the derivative of the expression inside the supremum to zero, we get:

$$y_i = \frac{e^{x_i}}{\sum_{j=1}^n e^{x_j}}$$
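
(Summing this condition over $i$ also shows that any stationary point forces $\mathbf{1}^T y = \sum_{i=1}^n y_i = 1$.)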

Applying this to the convex conjugate and working out the domain in the same manner as Boyd: supposing that some $y_k < 0$, the supremum is unbounded, which we see by letting the corresponding $x_k$ approach negative infinity while setting all other $x_i = 0$ for $i \neq k$.
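
Spelled out, with $x_k = -t$ and $x_i = 0$ for $i \neq k$,
$$y^T x - f(x) = -t\,y_k - \log\!\left(e^{-t} + n - 1\right) \longrightarrow \infty \quad \text{as } t \to \infty,$$
since $-t\,y_k \to \infty$ when $y_k < 0$ while the log term tends to $\log(n-1)$.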

But is this really the most general restriction? It seems we only need $y_k \geq -1$, since in that case $y_k x_k - x_k = (y_k - 1)x_k$ would bound the supremum as $x_k$ approaches negative infinity?

Best Answer

The function inside the supremum, $y^T x - f(x)$, is concave in $x$ on its domain, so its maximizer is characterized by setting the derivative to zero.
You can easily see that the derivative condition implies:

$$ {y}_{i} = \frac{ {e}^{ {x}_{i} } }{ \sum_{j = 1}^{n} {e}^{ {x}_{j} } } $$

This requires $ {y}_{i} > 0 $ (which can later be relaxed to $ {y}_{i} \geq 0 $ by the convention $ 0 \ln \left( 0 \right) = 0 $).
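
As a quick numerical sanity check (just a sketch, assuming NumPy/SciPy are available; the function name `conjugate_objective` is only for illustration), one can compare a numerically computed supremum against $\sum_i y_i \log y_i$ for a strictly positive probability vector $y$, and watch the objective grow without bound along $x = -t e_k$ when some $y_k < 0$:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import logsumexp

def conjugate_objective(x, y):
    """The expression inside the supremum: y^T x - log(sum_i exp(x_i))."""
    return y @ x - logsumexp(x)

n = 4
rng = np.random.default_rng(0)

# Case 1: y strictly positive with sum(y) = 1 -> the supremum is finite
# and should equal sum_i y_i log y_i.
y = rng.random(n)
y /= y.sum()
res = minimize(lambda x: -conjugate_objective(x, y), x0=np.zeros(n))
print("numerical supremum:", -res.fun)
print("sum_i y_i log y_i :", float(np.sum(y * np.log(y))))

# Case 2: one negative component -> the objective is unbounded above
# along x = -t * e_k, exactly as in the argument above.
y_bad = y.copy()
y_bad[0] = -0.5
for t in (1.0, 10.0, 100.0):
    x = np.zeros(n)
    x[0] = -t
    print(f"t = {t:6.1f}   objective = {conjugate_objective(x, y_bad):8.2f}")
```

The first case should print two values that agree to several decimal places, while the second case should show the objective growing roughly linearly in $t$.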
