An Example of a KKT Problem

Tags: karush-kuhn-tucker, nonlinear optimization, optimization

I tried using the KKT sufficient conditions on the problem $$\min_{x\in X} \langle g, x \rangle + \sum_{i=1}^n x_i \ln x_i \text{ s.t. } \sum_{i=1}^n x_i = 1,
\text{ where } X:= \{x\in\mathbb{R}^n \mid x_i >0, \ \forall i=1,\ldots, n\}.$$

Upon solving:

  • Primal feasibility: $\sum x_i = 1$

  • Stationarity: $g + \ln x + \mathbf{1} + u\,\mathbf{1} = 0$ (componentwise), where $u$ is the multiplier for the equality constraint; the Lagrangian is written out below.
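
For reference, the Lagrangian here (with $u$ the multiplier for the equality constraint, matching the notation above) is $$L(x,u) = \langle g, x \rangle + \sum_{i=1}^n x_i \ln x_i + u\Big(\sum_{i=1}^n x_i - 1\Big),$$ and the stationarity condition above is $\nabla_x L(x,u) = 0$.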

The textbook's solution is $x_i=e^{-g_i}/\sum_{k=1}^n e^{-g_k}$. Is there a better way to derive this?

Best Answer

With the KKT conditions, I would always recommend writing things out variable by variable; it often makes things much clearer.

So by the Lagrangian (stationarity) part of the KKT conditions we have, for each $i$, $$ g_{i} + \ln(x_{i}) + 1 + u = 0. $$ Solving for $\ln(x_{i})$ gives $$ \ln(x_{i}) = -g_{i} - 1 - u, $$ and hence $$ x_{i} = e^{-g_{i} - 1 - u}. $$

But remember that by the primal feasibility condition we must have $$ \sum_k x_{k} = \sum_k e^{-g_{k} - (1+u)} = \frac{\sum_k e^{-g_{k}}}{e^{1+u}} = 1. $$ I am just using standard properties of the exponential here, nothing special. Thus we conclude that $$ e^{1+u} = \sum_k e^{-g_{k}}. $$ Plugging this back into our formula for $x_{i}$, we find that $$ x_{i} = e^{-g_{i} - 1 - u} = \frac{e^{-g_{i}}}{e^{1+u}} = \frac{e^{-g_{i}}}{\sum_k e^{-g_{k}}}. $$

Of course, note that you actually have $n$ inequality constraints to consider when writing the Lagrangian, since you are optimising over the simplex. In this case it doesn't matter: the solution above has $x_i > 0$ for every $i$, so those constraints are inactive and complementary slackness forces their multipliers to be zero.
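
As a quick numerical sanity check (not part of the original answer), here is a minimal sketch comparing the closed-form solution above with a general-purpose constrained solver; the vector $g$ below is just an arbitrary example.

```python
import numpy as np
from scipy.optimize import minimize

# Arbitrary example cost vector g (any real values work).
g = np.array([0.5, -1.2, 3.0, 0.0])

# Closed-form KKT solution derived above: x_i = exp(-g_i) / sum_k exp(-g_k).
x_closed = np.exp(-g) / np.exp(-g).sum()

def f(x):
    # Objective <g, x> + sum_i x_i * ln(x_i).
    return g @ x + np.sum(x * np.log(x))

# Simplex equality constraint sum_i x_i = 1, with x_i kept strictly positive
# so that ln(x_i) stays defined.
constraint = {"type": "eq", "fun": lambda x: x.sum() - 1.0}
bounds = [(1e-12, None)] * g.size

x0 = np.full(g.size, 1.0 / g.size)  # start at the uniform distribution
res = minimize(f, x0, bounds=bounds, constraints=[constraint])

# Both checks are expected to hold up to solver tolerance.
print(np.allclose(res.x, x_closed, atol=1e-5))  # solver agrees with the formula
grad = g + np.log(x_closed) + 1.0               # gradient of the objective
print(np.allclose(grad, grad[0]))               # stationarity: constant across i (= -u)
```

The second check is just the stationarity condition in disguise: at the closed-form point, $g_i + \ln x_i + 1$ is the same constant for every $i$, and that constant is $-u$.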
