[Math] Optimization: show that $\min \sum_{i \in I}\min f_i(y_i) = \min \sum_{i \in I}f_i(y_i)$

linear-programming, maxima-minima, optimization

I am working on an optimization problem where I need to minimize the sum of a collection of minimization problems. The problem currently has this form:

$$\min_{x \in \mathbb{R}^m} \sum_{i \in I} g_i(x),$$

where

$$g_i(x)=\min_{y_i \in D_i(x)} f_i(y_i).$$

In this problem, I am trying to choose a vector $x$ which in turn helps determine the feasible regions $D_i(x)$ for a number of subproblems indexed by $i$. The decision variables $y_i$ for each subproblem are vectors of real numbers. Combining these problems gives a single, two-level "min min" problem:

$$\min_{x \in \mathbb{R}^m} \sum_{i \in I}\min_{y_i \in D_i(x)} f_i(y_i).$$

It seems clear that this is equivalent to the following single-level problem:

$$\min_{\substack{x \in \mathbb{R}^m \\ y_i \in D_i(x), \space \forall i \in I}} \sum_{i \in I}f_i(y_i).$$

The intuition behind the equivalence is that, when solving the single-level problem, one would choose the $y_i$ vectors that minimize each $f_i(y_i)$ individually, since those choices in turn minimize the overall objective. But that's not a very satisfying "proof." Is there a more explicit way to prove the equivalence?
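As a sanity check (not a proof), the equivalence can be verified by brute force on a tiny finite instance. Everything below — the grid for $x$, the sets $D_i(x)$, and the functions $f_i$ — is made up purely for illustration:

```python
import itertools

# Toy instance (all choices below are illustrative, not from the question):
# x ranges over a small finite grid, each D_i(x) is a small finite set,
# and the f_i are simple shifted quadratics.
X = [0, 1, 2]          # candidate values of x
I = [0, 1]             # subproblem indices

def D(i, x):
    """Feasible set of subproblem i, which depends on x."""
    return [x + i, x + i + 1, x + i + 2]

def f(i, y):
    """Objective of subproblem i."""
    return (y - 2) ** 2 + i

# Two-level "min min" value: for each x, sum the inner minima, then minimize over x.
two_level = min(sum(min(f(i, y) for y in D(i, x)) for i in I) for x in X)

# Single-level value: jointly choose x and one y_i from each D_i(x).
single_level = min(
    sum(f(i, y) for i, y in zip(I, ys))
    for x in X
    for ys in itertools.product(*(D(i, x) for i in I))
)

print(two_level, single_level)  # the two values coincide
```

Of course this only checks one finite instance; the proof below is what justifies the equivalence in general.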

A Stack Exchange answer at https://math.stackexchange.com/a/231880/425405 says "extremizations of the same type … can be combined into a single quantifier/extremization." Is this a standard theorem that I can quote?

Best Answer

Note first that nested minimizations of the same type collapse into a single joint minimization: $$\min_{x\in\mathbb R^m}\;\min_{y_i\in D_i(x)\,\forall i\in I} \;=\; \min_{\substack{x\in\mathbb R^m \\ y_i\in D_i(x)\,\forall i\in I}}$$ Given that, it suffices to show that for every fixed $x$, $$\sum_{i\in I}\min_{y_i\in D_i(x)} f_i(y_i) = \min\left\{\sum_{i\in I}f_i(y_i):y_i\in D_i(x),\,\forall i\in I\right\}.$$ In other words, suppressing the dependence on $x$ and writing $D_i$ for $D_i(x)$, we want to show: $$\sum_{i\in I}\min_{y_i\in D_i} f_i(y_i) = \min\left\{\sum_{i\in I}f_i(y_i):y_i\in D_i,\,\forall i\in I\right\}.$$

The point is then that the minimum of a sum of functions of distinct variables is the sum of the minima. Let $$F(u,v)=g(u)+h(v).$$ If $g(u_0)\le g(u)$ for all $u$ and $h(v_0)\le h(v)$ for all $v$, then $$F(u_0,v_0)=g(u_0)+h(v_0)\le g(u)+h(v)= F(u,v)$$ for all $(u,v)$, so $\min F=\min g+\min h$.

More generally, if $f(y_1,\dots,y_k)=\sum_i g_i(y_i)$ and $g_i(y_{i,0})\le g_i(y_i)$ for all $i$ and all $y_i$, then $$f(y_{1,0},\dots,y_{k,0})\le f(y_1,\dots,y_k)$$ for all $(y_1,\dots,y_k)$. Applying this with $g_i=f_i$ and restricting each $y_i$ to $D_i$ gives exactly the identity above.
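The separable-minimum lemma can likewise be checked numerically; the domains and coefficients below are random and purely illustrative:

```python
import itertools
import random

# Numerical check (illustrative): for a separable function
# f(y_1, ..., y_k) = sum_i g_i(y_i) over a product of finite domains,
# the joint minimum equals the sum of the per-coordinate minima.
random.seed(0)
k = 3
domains = [[random.uniform(-5, 5) for _ in range(4)] for _ in range(k)]
coeffs = [random.uniform(1, 3) for _ in range(k)]
gs = [lambda y, a=a: a * (y - 1) ** 2 for a in coeffs]  # g_i(y) = a_i (y-1)^2

# Minimize jointly over the product domain ...
joint_min = min(
    sum(g(y) for g, y in zip(gs, ys))
    for ys in itertools.product(*domains)
)
# ... versus minimizing each coordinate separately and summing.
separate_min = sum(min(g(y) for y in dom) for g, dom in zip(gs, domains))

print(joint_min, separate_min)  # the two minima agree
```

The `a=a` default-argument idiom just freezes each coefficient into its lambda; without it, every `g_i` would close over the same loop variable.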