Suppose $A, B, C$ are sets such that $B$ and $C$ are disjoint.
Let $f : A\times (B\cup C) \rightarrow \mathbb{R}$ be a function such that, for each fixed $y\in B\cup C$, $f(a, y)$ is linear in $a$.
Then is it true that the linear program
$\min\limits_{a\in A} \left( \max\limits_{b\in B} f(a, b) + \max\limits_{c\in C} f(a, c) \right) $
is equivalent to the linear program
$\begin{align} \min\limits_{a\in A} & \quad (z_1 + z_2) \\
\text{subject to}& \quad z_1\geq f(a,b) &\forall b\in B\\
& \quad z_2 \geq f(a,c) &\forall c\in C \end{align}$
My belief is that they are, but I just wanted to verify this.
Best Answer
Yes, this works. If you want to check this, the key facts are:

Fact 1: For any $a \in A$, setting $z_1 = \max\limits_{b\in B} f(a,b)$ and $z_2 = \max\limits_{c\in C} f(a,c)$ gives a feasible point $(a, z_1, z_2)$ of the second program whose objective value $z_1 + z_2$ equals the first program's objective value at $a$.

Fact 2: For any feasible point $(a, z_1, z_2)$ of the second program, the constraints force $z_1 \geq \max\limits_{b\in B} f(a,b)$ and $z_2 \geq \max\limits_{c\in C} f(a,c)$, so the first program's objective value at $a$ is at most $z_1 + z_2$.
Fact 1 tells us that the optimal value of the second problem is at most the optimal value of the first problem. Fact 2 tells us that the optimal value of the first problem is at most the optimal value of the second problem. What's more, if you solve one of the problems, you can use one of the facts to get an optimal solution to the other, as well.
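If you want to see this concretely, here is a minimal numerical sanity check. The instance is entirely made up for illustration, not from the question: I take $A = [0,1]^2$, small finite sets $B$ and $C$ of vectors, and $f(a, y) = y \cdot a$. The sketch solves the epigraph LP with `scipy.optimize.linprog` and compares its optimal value against a brute-force grid evaluation of the original min-of-sums-of-maxima objective.

```python
# Sanity check of the epigraph reformulation on made-up data:
# A = [0,1]^2, f(a, y) = y . a, with small finite sets B and C.
import numpy as np
from scipy.optimize import linprog

B = np.array([[-1.0, 0.5], [0.5, -1.0]])  # each row is one b in B
C = np.array([[0.2, 0.1], [-0.3, 0.2]])   # each row is one c in C

# Second program: variables x = (a1, a2, z1, z2), minimize z1 + z2
# subject to b.a - z1 <= 0 for all b in B, c.a - z2 <= 0 for all c in C.
obj = np.array([0.0, 0.0, 1.0, 1.0])
A_ub = np.vstack([
    np.hstack([B, -np.ones((len(B), 1)), np.zeros((len(B), 1))]),
    np.hstack([C, np.zeros((len(C), 1)), -np.ones((len(C), 1))]),
])
b_ub = np.zeros(len(B) + len(C))
bounds = [(0, 1), (0, 1), (None, None), (None, None)]  # a in A, z1, z2 free
res = linprog(obj, A_ub=A_ub, b_ub=b_ub, bounds=bounds)

# First program, approximated by brute force over a fine grid of A.
grid = np.linspace(0, 1, 201)
a1, a2 = np.meshgrid(grid, grid)
pts = np.column_stack([a1.ravel(), a2.ravel()])
g = (pts @ B.T).max(axis=1) + (pts @ C.T).max(axis=1)  # max_b f + max_c f

print(res.fun, g.min())  # both should print approximately -0.2
```

On this instance both computations return $-0.2$ (attained at the corner $a = (1,1)$, which the grid happens to contain); in general the grid value only approximates the first program's optimum from above, while the LP value is exact.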
So in this sense, the two programs are equivalent. (They don't have the same feasible region, though: not only is the second program higher-dimensional, it also has feasible points, namely those where $z_1$ or $z_2$ strictly exceeds the corresponding maximum, that don't arise from points of the first program via Fact 1.)
Some nitpicks:
Even if you are already aware of these, I want to mention them for any future readers.

- As written, the first problem is not literally a linear program, since its objective contains maxima; it is the reformulation that is an LP, and only when $B$ and $C$ are finite, $A$ is a polyhedron, and $f(\cdot, y)$ is affine for each $y$.
- In the second program, the minimization is over $a \in A$ and $z_1, z_2 \in \mathbb{R}$ jointly, not over $a$ alone.
- The disjointness of $B$ and $C$ is never used; the equivalence holds without it.