[Math] General solution of linear programming is a convex combination of basic solutions

linear programming

Consider a linear programming problem of the form:

$$ \max \vec c \cdot \vec x \quad \text{s.t.}$$
$$\mathbf{A} \cdot \vec x = \vec b,\quad \vec x \ge 0$$

It is known that this problem is either infeasible, unbounded, or has an optimal basic solution (one where the columns of $\mathbf{A}$ corresponding to the non-zero entries of $\vec x$ are linearly independent). However, there may be more than one optimal solution, and in that case I haven't found a proof (or disproof) of the following statement:

Any optimal solution of a feasible linear programming problem is a convex combination of basic optimal solutions.

Is this true? What is the proof / counterexample? The converse is clearly true: A convex combination of two or more optimal basic solutions is also optimal.
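To make the basic-solution condition above concrete, here is a small sketch in plain Python (the instance `A`, `b` and the helper names are made up for illustration, not part of the question): a feasible point is basic exactly when the columns of $\mathbf{A}$ hit by its non-zero entries are linearly independent.

```python
# Hypothetical tiny instance of A x = b, x >= 0, with A a 2x3 matrix.
A = [[1.0, 0.0, 1.0],
     [0.0, 1.0, 1.0]]
b = [1.0, 1.0]

def satisfies(x, tol=1e-9):
    """Feasibility: A x = b and x >= 0."""
    return all(
        abs(sum(A[i][j] * x[j] for j in range(3)) - b[i]) < tol
        for i in range(2)
    ) and all(xj >= -tol for xj in x)

def is_basic(x, tol=1e-9):
    """Columns of A at the nonzero entries of x must be independent.
    In R^2 that means at most 2 such columns and, if exactly 2,
    a nonzero 2x2 determinant."""
    cols = [j for j in range(3) if abs(x[j]) > tol]
    if len(cols) > 2:          # more than 2 vectors in R^2: dependent
        return False
    if len(cols) == 2:
        j, k = cols
        det = A[0][j] * A[1][k] - A[0][k] * A[1][j]
        return abs(det) > tol
    return True

# (1, 1, 0) is feasible and basic; (0.5, 0.5, 0.5) is feasible
# but uses all three columns, so it is not basic.
assert satisfies((1.0, 1.0, 0.0)) and is_basic((1.0, 1.0, 0.0))
assert satisfies((0.5, 0.5, 0.5)) and not is_basic((0.5, 0.5, 0.5))
```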

Best Answer

Your conjecture is true when the set $P$ of points $x$ satisfying $Ax = b,\ x \geq 0$ is nonempty and bounded: then the set $Q \subseteq P$ of optimal points is itself a polytope, and by a standard theorem a polytope is the convex hull of its vertices, which here are exactly the basic optimal solutions of the original problem. (For a proof, let $Q'$ be the convex hull of the vertices of $Q$ and suppose $Q' \neq Q$. By a separation argument there is a face of $Q$ that has empty intersection with $Q'$; but every nonempty face of a polytope contains a vertex of $Q$, a contradiction.)
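If it helps to see the bounded case concretely, here is a sketch in plain Python (the instance is mine, not from the question): for $A = (1\ 1\ 1)$, $b = (1)$, $c = (1, 1, 0)$, the optimal face is the segment between the two basic optimal solutions $(1,0,0)$ and $(0,1,0)$, and every optimal point decomposes as a convex combination of them.

```python
# Toy LP in standard form: maximize c·x subject to A x = b, x >= 0,
# with A = [1 1 1], b = [1], c = (1, 1, 0). The optimal face is
# {(t, 1-t, 0) : 0 <= t <= 1}; its vertices (basic optimal solutions)
# are v1 and v2 below.
v1 = (1.0, 0.0, 0.0)
v2 = (0.0, 1.0, 0.0)

def convex_combination(lam, u, w):
    """Return lam*u + (1-lam)*w componentwise, for 0 <= lam <= 1."""
    return tuple(lam * ui + (1 - lam) * wi for ui, wi in zip(u, w))

def is_optimal(x, tol=1e-9):
    """Feasibility (x1 + x2 + x3 = 1, x >= 0) and optimality (c·x = 1)."""
    feasible = abs(sum(x) - 1.0) < tol and all(xi >= -tol for xi in x)
    objective = x[0] + x[1]  # c = (1, 1, 0)
    return feasible and abs(objective - 1.0) < tol

# An arbitrary optimal point on the face, here with t = 0.25 ...
x = (0.25, 0.75, 0.0)
assert is_optimal(x)
# ... is exactly the convex combination of v1 and v2 with lambda = 0.25:
assert convex_combination(0.25, v1, v2) == x
```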

If $P$ is allowed to be unbounded, the statement can fail: there may be, for example, just one basic optimal solution (vertex) while the set of optimal points of $P$ is unbounded. Take $P=\{ x\in \mathbb{R}^2\, |\, x\geq 0\}$ and $c=(-1,0)$, so the problem is to minimize the first component. Every vector of the form $(0,y)$ with $y \geq 0$ is optimal. This set contains the only vertex $(0,0)$, but also other optimal solutions such as $(0,1)$, which cannot be written as a convex combination of basic optimal solutions.
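A hedged numerical companion to this counterexample (variable names are mine):

```python
# P = {x in R^2 : x >= 0}, c = (-1, 0): maximize -x1,
# i.e. minimize the first component.

def objective(x):
    """c·x with c = (-1, 0)."""
    return -x[0]

def feasible(x):
    """Membership in P = {x >= 0}."""
    return all(xi >= 0 for xi in x)

vertex = (0.0, 0.0)   # the only basic optimal solution
other = (0.0, 1.0)    # another optimal point on the ray {(0, y) : y >= 0}

# Both points are feasible and attain the optimal value 0 ...
assert feasible(vertex) and feasible(other)
assert objective(vertex) == objective(other) == 0.0

# ... but a convex combination of the single vertex (0, 0) is always
# (0, 0) itself, so (0, 1) is optimal yet not a convex combination
# of basic optimal solutions.
assert other != vertex
```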