Solve linear program $\min \langle c, x \rangle$ using Lagrangian

Tags: convex-optimization, karush-kuhn-tucker, linear-programming, optimization

Given the following linear programming problem:

$$
\begin{align}
\min_{x} \quad & \langle c, x \rangle\\
\text{s.t.} \quad & \sum_{i=1}^{n}x_i=1\\
& x\geq 0
\end{align}
$$

where $x \in \mathbb{R}^n$.
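For concreteness, here is a small numerical instance that can be used to sanity-check any proposed solution. This is only an illustrative sketch, assuming `numpy` and `scipy` are available; the cost vector $c$ below is an arbitrary example, not part of the question:

```python
import numpy as np
from scipy.optimize import linprog  # off-the-shelf LP solver, used only as a reference check

c = np.array([3.0, 1.0, 2.0])            # example cost vector (assumption)
res = linprog(
    c,
    A_eq=np.ones((1, len(c))),           # sum_i x_i = 1
    b_eq=[1.0],
    bounds=[(0, None)] * len(c),         # x >= 0
)
print(res.x, res.fun)                    # expect ~[0., 1., 0.] and 1.0 = min(c)
```

The reported minimizer puts all its mass on an index of the smallest $c_i$, which is what the KKT analysis below confirms.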

My try:
Since the objective function is affine, it is convex. The first constraint can be written as $e^{\top}x = 1$, where $e = [1, \cdots, 1]^{\top}$, so it is affine. In addition, $-x\leq 0$ is a convex inequality constraint. Therefore the problem is convex, and any KKT point is a global minimizer. The Lagrangian is as follows:
$$
L(x, \lambda, \mu)= \langle c, x \rangle - \lambda^{\top}x + \mu\,(e^{\top}x - 1)
$$

where $\lambda\geq 0$. Setting the derivative to zero and imposing complementary slackness:
$$
\begin{align}
\frac{\partial L}{\partial x_i} &= c_i – \lambda_i + \mu = 0\\
\lambda_i x_i &= 0
\end{align}
$$

There are two cases: either $\lambda_i> 0$ or $\lambda_i = 0$. If $\lambda_i> 0$, then $x_i = 0$ and $\mu = -c_i$, but this cannot hold because $\mu$ is a scalar and the $c_i$'s are different. Also, I have no idea how to handle the case $\lambda_i = 0$.

Please complete my solution, without using methods specific to solving linear programs.

Best Answer

There's a typo in your expression for the derivative of the Lagrangian. It should be $$ \frac{\partial L}{\partial x_i}=c_i{\color{red}-}\lambda_i+\mu\ . $$ It's probably easier to recognise the solution of this problem by guesswork than by trying to solve the Karush-Kuhn-Tucker conditions, but here's one way of doing the latter.
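For reference, the complete KKT system for this problem, with the corrected sign, reads
$$
\begin{align}
c_i - \lambda_i + \mu &= 0,\quad i=1,\dots,n && \text{(stationarity)}\\
e^{\top}x &= 1,\quad x\geq 0 && \text{(primal feasibility)}\\
\lambda &\geq 0 && \text{(dual feasibility)}\\
\lambda_i x_i &= 0,\quad i=1,\dots,n && \text{(complementary slackness)}
\end{align}
$$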

Since $\ \sum\limits_{i=1}^n x_i=1\ $, we have $\ x_i>0\ $ for at least one $\ i\in\{1,2,\dots,n\}\ $. Let $\ i_1, i_2, \dots, i_r\ $ be those indices for which $\ x_{i_j}>0\ $. From the condition $\ \lambda_i x_i=0\ $, we then have $\ \lambda_{i_j}=0\ $ for $\ j=1,2,\dots,r\ $. Then from the conditions $\ \frac{\partial L}{\partial x_{i_j}}=0\ $, we get $\ \mu=-c_{i_j}\ $. Thus, for all the indices $\ i_j\ $, $\ \mu=-c_{i_j}\ $ must have the same value. So if the $\ c_i\ $ are all different, we must have $\ r=1\ $, and $\ x_{i_1}=1\ $ will be the only variable with a positive value.

In any case, for any index $\ i \not\in\{i_1,i_2,\dots,i_r\}\ $, the conditions $\ \frac{\partial L}{\partial x_i}=0\ $ give $$ \lambda_i =c_i+\mu= c_i-c_{i_j}\ge 0\ . $$ That is, $\ c_{i_j}\le c_i\ $ for all $\ j=1,2,\dots,r\ $ and $\ i \not\in\{i_1,i_2,\dots,i_r\}\ $. In other words, $\ c_{i_j}=\min(c_1,c_2,\dots,c_n)\ $ for all $\ j=1,2,\dots,r\ $.
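To make the argument concrete, take $n=3$ and $c=(3,1,2)$ (an illustrative choice). The minimum cost is $c_2=1$, so $r=1$ and $i_1=2$, and the KKT conditions are satisfied by $x=(0,1,0)$, $\mu=-c_2=-1$, and $\lambda=(c_1+\mu,\,0,\,c_3+\mu)=(2,0,1)\geq 0$, with $\lambda_i x_i=0$ for every $i$ and objective value $\langle c, x\rangle = 1 = \min_i c_i$.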

If $\ i_1\ $ is the only value of $\ i\ $ for which $\ c_i=\min(c_1,c_2,\dots,c_n)\ $, then the solution is unique: $\ x_{i_1}=1\ $, and $\ x_i=0\ $ for $\ i\ne i_1\ $. If $\ r>1\ $, however, then any assignment of non-negative values to $\ x_{i_1}, x_{i_2}, \dots, x_{i_r}\ $ such that $\ \sum\limits_{j=1}^r x_{i_j}=1\ $, together with $\ x_i=0\ $ for $\ i \not\in\{i_1,i_2,\dots,i_r\}\ $, will achieve the minimum, $\ \min(c_1,c_2,\dots,c_n)\ $, of the objective.
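The whole argument can be packaged as a short numerical check. The sketch below is my own illustrative code, assuming `numpy` is available (the helper names `kkt_solution` and `check_kkt` are made up for this example); it builds the primal/dual pair derived above, spreading the unit mass over all indices attaining $\min_i c_i$, and verifies each KKT condition:

```python
import numpy as np

def kkt_solution(c, tol=1e-12):
    """Build the primal/dual pair derived above: put the unit mass on the
    indices attaining min(c), set mu = -min(c) and lambda = c - min(c)."""
    c = np.asarray(c, dtype=float)
    cmin = c.min()
    support = np.flatnonzero(c <= cmin + tol)  # the indices i_1, ..., i_r
    x = np.zeros_like(c)
    x[support] = 1.0 / len(support)            # any convex combination over the support works
    mu = -cmin                                 # mu = -c_{i_j}
    lam = c - cmin                             # lambda_i = c_i + mu >= 0
    return x, lam, mu

def check_kkt(c, x, lam, mu, tol=1e-9):
    c = np.asarray(c, dtype=float)
    assert abs(x.sum() - 1.0) < tol            # primal feasibility: e^T x = 1
    assert (x >= -tol).all()                   # primal feasibility: x >= 0
    assert (lam >= -tol).all()                 # dual feasibility: lambda >= 0
    assert (np.abs(lam * x) < tol).all()       # complementary slackness: lambda_i x_i = 0
    assert (np.abs(c - lam + mu) < tol).all()  # stationarity: c_i - lambda_i + mu = 0
    return True

c = [3.0, 1.0, 2.0]
x, lam, mu = kkt_solution(c)
print(x, lam, mu)                # [0. 1. 0.] [2. 0. 1.] -1.0
print(check_kkt(c, x, lam, mu))  # True
print(float(np.dot(c, x)))       # 1.0, which equals min(c)
```

With a tie, e.g. $c=(1,1,2)$, the same construction returns $x=(0.5, 0.5, 0)$, one of the infinitely many optimal points described above.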