Nonlinear global optimization: sum of minima vs minimum of sum

nonlinear optimization, optimization

Consider $N$ disjoint nonlinear minimization problems with scalar cost functions $\{f_1,\dots, f_N\}$. Assume the elements of the vector $X_i$ are the decision variables of problem $f_i$, and let $C_i$ denote the minimum value of $f_i$, for $i\in \{1,\dots,N\}$.

I am under the impression that the following statement is correct.

If the vectors $X_i$ and $X_j$ are pairwise independent and disjoint for $i \neq j$ (no two problems share a decision variable), then

$$
\min_{X_1,\dots,X_N}\left(
\sum_{i=1}^N f_i(X_i)
\right) = \sum_{i=1}^N \min_{X_i} f_i(X_i) = \sum_{i=1}^N C_i
$$

However, I cannot find a formal proof.

I have the following questions.

  1. Is the statement correct?
  2. If the answer to Q.1 is yes, where can I find a proof?

Thank you.

Best Answer

  1. Yes, it is correct.
  2. You can prove it easily. Evaluating $\sum_i f_i$ at the individual minimizers shows $\min\left(\sum_i f_i\right) \le \sum_i C_i$; this is where disjointness matters, since it lets each $X_i$ be set to its own minimizer independently. Conversely, if $\min\left(\sum_i f_i\right) < \sum_i C_i$, then at the joint minimizer some $f_i$ would take a value below $C_i$, contradicting the minimality of $C_i$. Hence the two expressions are equal.
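For a quick numerical sanity check, here is a small Python sketch using `scipy.optimize.minimize`. The two cost functions and starting points are made up for illustration, and a local solver is used, so this only confirms the identity for the minima the solver actually finds, not a proof of the global statement.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative, separable nonlinear costs with disjoint decision variables
# (these specific functions are chosen only for the demonstration).
def f1(x):  # decision variables X_1 = (x[0], x[1])
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2  # Rosenbrock, C_1 = 0

def f2(y):  # decision variables X_2 = (y[0], y[1])
    return (y[0] - 2.0) ** 2 + np.exp(y[1]) - y[1]              # minimum C_2 = 1 at (2, 0)

# Solve the two problems separately: C_1 and C_2.
C1 = minimize(f1, x0=np.zeros(2)).fun
C2 = minimize(f2, x0=np.zeros(2)).fun

# Solve the joint problem over the stacked vector (X_1, X_2).
joint = minimize(lambda z: f1(z[:2]) + f2(z[2:]), x0=np.zeros(4)).fun

print(C1 + C2, joint)  # both are approximately 1.0, equal up to solver tolerance
```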