[Math] Sufficient condition for KKT problems

convex-optimization, nonlinear-optimization, optimization

For the Karush-Kuhn-Tucker optimisation problem, Wikipedia notes that:

"The necessary conditions are sufficient for optimality if the objective function f and the inequality constraints g_j are continuously differentiable convex functions and the equality constraints h_i are affine functions."

Could someone please show me how this result is derived? That is, given a convex objective function, convex inequality constraints and affine equality constraints, how can we show that any point in the feasible set that satisfies the KKT conditions must be a minimizer of the function over the feasible set?

Best Answer

The primal problem is \begin{align} \operatorname{minimize}_x & \quad f_0(x) \\ \text{subject to} & \quad f_i(x) \leq 0 \quad \text{for } i = 1,\ldots, m\\ & \quad Ax = b. \end{align} The functions $f_i, i = 0,\ldots,m$ are differentiable and convex.
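For concreteness, here is a minimal numerical sketch of a problem in exactly this form (the toy objective, constraints, and solver choice are my own for illustration, not part of the question or answer): minimize $f_0(x) = x_1^2 + x_2^2$ subject to $f_1(x) = 1 - x_1 \leq 0$ and the affine equality $x_2 = 1$. A general-purpose solver such as SciPy's SLSQP should recover the minimizer $x^* = (1, 1)$.

```python
import numpy as np
from scipy.optimize import minimize

# Toy instance of the primal problem (illustrative only):
#   minimize    f0(x) = x1^2 + x2^2
#   subject to  f1(x) = 1 - x1 <= 0
#               Ax = b  with  A = [0 1], b = [1]  (i.e. x2 = 1)
f0 = lambda x: x[0]**2 + x[1]**2

constraints = [
    # SLSQP expects inequality constraints as g(x) >= 0, so pass -f1.
    {"type": "ineq", "fun": lambda x: x[0] - 1.0},
    # Equality constraint Ax - b = 0.
    {"type": "eq", "fun": lambda x: x[1] - 1.0},
]

result = minimize(f0, x0=np.array([5.0, 5.0]), method="SLSQP",
                  constraints=constraints)
print(result.x)  # expected: approximately [1. 1.]
```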

Assume $x^*$ is feasible for the primal problem (so $f_i(x^*) \leq 0$ for $i = 1,\ldots,m$ and $A x^* = b$) and that there exist vectors $\lambda \geq 0$ and $\eta$ such that $$ \tag{$\spadesuit$}\nabla f_0(x^*) + \sum_{i=1}^m \lambda_i \nabla f_i(x^*) + A^T \eta = 0 $$ and $$ \lambda_i f_i(x^*) = 0 \quad \text{for } i = 1,\ldots,m. $$

Because the functions $f_i$ are convex and $\lambda \geq 0$, the Lagrangian $$ L(x,\lambda,\eta) = f_0(x) + \sum_{i=1}^m \lambda_i f_i(x) + \eta^T(Ax - b) $$ is a convex, differentiable function of $x$ (a nonnegative combination of convex functions plus an affine term). Equation ($\spadesuit$) says precisely that $\nabla_x L(x^*,\lambda,\eta) = 0$, and a stationary point of a differentiable convex function is a global minimizer, so $x^*$ minimizes $L(\cdot,\lambda,\eta)$ over all $x$.

Thus, if $x$ is feasible for the primal problem, then \begin{align*} f_0(x) & \geq f_0(x) + \sum_{i=1}^m \lambda_i f_i(x) + \eta^T(Ax - b) \\ & \geq f_0(x^*) + \sum_{i=1}^m \lambda_i f_i(x^*) + \eta^T(Ax^* - b) \\ & = f_0(x^*). \end{align*} The first inequality holds because feasibility gives $f_i(x) \leq 0$ (so $\sum_i \lambda_i f_i(x) \leq 0$, since $\lambda \geq 0$) and $Ax = b$ (so the affine term vanishes); the second holds because $x^*$ minimizes the Lagrangian; and the final equality follows from complementary slackness ($\lambda_i f_i(x^*) = 0$) together with $Ax^* = b$. This shows that $x^*$ is a minimizer for the primal problem.
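As a sanity check on this argument, the following sketch verifies the certificate numerically on the toy instance above. The multipliers $\lambda = 2$ and $\eta = -2$ are computed by hand from ($\spadesuit$): $\nabla f_0(x^*) = (2,2)$, $\nabla f_1(x^*) = (-1,0)$, and $A^T\eta = (0,\eta)$ force $\lambda = 2$ and $\eta = -2$. The code checks stationarity and complementary slackness at $x^* = (1,1)$, then confirms $f_0(x) \geq f_0(x^*)$ on randomly sampled feasible points.

```python
import numpy as np

# Toy instance from the sketch above: f0(x) = x1^2 + x2^2,
# f1(x) = 1 - x1 <= 0, equality x2 = 1 (A = [0 1], b = [1]).
f0 = lambda x: x[0]**2 + x[1]**2
grad_f0 = lambda x: np.array([2*x[0], 2*x[1]])
f1 = lambda x: 1.0 - x[0]
grad_f1 = lambda x: np.array([-1.0, 0.0])
A, b = np.array([[0.0, 1.0]]), np.array([1.0])

x_star = np.array([1.0, 1.0])      # candidate minimizer
lam, eta = 2.0, np.array([-2.0])   # multipliers solved by hand from (spade)

# Stationarity (spade): grad f0 + lam * grad f1 + A^T eta = 0.
residual = grad_f0(x_star) + lam * grad_f1(x_star) + A.T @ eta
assert np.allclose(residual, 0.0)

# Complementary slackness and primal feasibility.
assert np.isclose(lam * f1(x_star), 0.0)
assert f1(x_star) <= 0 and np.allclose(A @ x_star, b)

# Spot-check global optimality: feasible points are (1 + t, 1), t >= 0.
rng = np.random.default_rng(0)
for t in rng.uniform(0.0, 10.0, size=1000):
    assert f0(np.array([1.0 + t, 1.0])) >= f0(x_star)
print("KKT certificate verified on the toy instance")
```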