Linear programming – optimality conditions

linear programming, optimization

From Bertsimas's Introduction to Linear Optimization, Exercise 3.7:

Consider a feasible solution $x$ of a linear program in standard form:

\begin{align*}
\min\quad & c'x \\
\text{subject to}\quad & Ax = b \\
& x \geq 0,
\end{align*}

and let $Z = \{\, i \mid x_i = 0 \,\}$.

Show that $x$ is an optimal solution if and only if the linear program

\begin{align*}
\min\quad & c'd \\
\text{subject to}\quad & Ad = 0 \\
& d_i \geq 0, \quad i \in Z
\end{align*}

has an optimal value of zero.

I understand that for two feasible solutions $x$ and $y$ of the original program we can define $d = y - x$; then $Ad = 0$, and this presumably relates to the optimality condition $c'd \geq 0$. But I still don't understand how the set $Z$ enters, or how to prove the direction from the second statement to the first.
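As a quick numerical sanity check of the claimed equivalence, here is a small sketch using `scipy.optimize.linprog`; the data `c`, `A`, `b` below are made up, and the tolerance used to build $Z$ is only illustrative:

```python
# Sketch: solve a small standard-form LP, build Z from the zero components
# of the optimizer, and check that the auxiliary LP has optimal value 0.
import numpy as np
from scipy.optimize import linprog

c = np.array([1.0, 2.0, 0.0])
A = np.array([[1.0, 1.0, 1.0]])
b = np.array([1.0])
n = len(c)

# Original LP:  min c'x  s.t.  Ax = b, x >= 0
res = linprog(c, A_eq=A, b_eq=b, bounds=[(0, None)] * n)
x = res.x
Z = [i for i in range(n) if np.isclose(x[i], 0.0)]

# Auxiliary LP:  min c'd  s.t.  Ad = 0,  d_i >= 0 for i in Z  (d_i free otherwise)
bounds = [(0, None) if i in Z else (None, None) for i in range(n)]
aux = linprog(c, A_eq=A, b_eq=np.zeros(len(b)), bounds=bounds)

print("optimal x:", x)                       # (0, 0, 1) for this data
print("auxiliary optimal value:", aux.fun)   # 0, since x is optimal
```

If one instead plugs in a non-optimal feasible $x$, the auxiliary LP is unbounded below (any improving $d$ can be scaled), so its value is not zero, which matches the other direction of the statement.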

Best Answer

Write $$c'y = c'x + c'(y-x),$$ where $y$ is any other feasible solution of the LP, and let $$d = y - x.$$ Clearly $Ad = Ay - Ax = b - b = 0$, and for $i \in Z$ we have $d_i = y_i - x_i = y_i \geq 0$ (since $x_i = 0$ and $y_i \geq 0$), so $d$ is feasible for the second LP.

Now, if the second LP has an optimal value of $0$, this means that $$c'd \geq 0.$$ Hence, $$c'y \geq c'x,$$ and since $y$ was an arbitrary feasible solution, $x$ is optimal.
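Spelled out, this direction is the one-line chain, valid for every feasible $y$:
$$c'y \;=\; c'x + c'(y - x) \;=\; c'x + c'd \;\geq\; c'x.$$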

Conversely, suppose $x$ is optimal and take any $d$ that is feasible for the second LP. Define $$y = x + \epsilon d$$ with $\epsilon > 0$. Clearly $Ay = b$, and for $\epsilon$ small enough we also have $y \geq 0$: for $i \in Z$, $y_i = \epsilon d_i \geq 0$, while for $i \notin Z$, $x_i > 0$, so $y_i = x_i + \epsilon d_i \geq 0$ once $\epsilon$ is small. Optimality of $x$ then gives $c'y \geq c'x$, that is, $\epsilon\, c'd \geq 0$, hence $c'd \geq 0$. Since $d = 0$ is feasible for the second LP with cost $0$, its optimal value is exactly $0$.
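For concreteness, one valid choice of the step size (an explicit bound, not spelled out above) is
$$\epsilon \;=\; \begin{cases} \displaystyle\min_{i \notin Z,\; d_i < 0} \frac{x_i}{-d_i}, & \text{if } d_i < 0 \text{ for some } i \notin Z,\\[1ex] 1, & \text{otherwise,} \end{cases}$$
which is strictly positive and guarantees $x_i + \epsilon d_i \geq 0$ for every $i$.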