[Math] Proving that maximizing a sum of functions of different independent variables is equivalent to maximizing each function

optimization, proof-writing, summation

Let
$$
\pi = f_1(x_1) + f_2(x_2) + f_3(x_3) + \dots + f_n(x_n) = \sum_{i=1}^n f_i(x_i)
$$
where the $f_i$ denote different functions and the $x_i$ denote distinct independent variables.

Would proving that $\pi$ is maximized by maximizing $f_i(x_i)$ for all $i$ be as simple as assuming, by way of contradiction, that $\pi$ is not maximized by doing so, and then noting that this contradicts each $f_i(x_i)$ being maximized, since some term would have to exceed its maximum?
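To spell the argument out: if each $f_i$ attains its maximum at some point $x_i^*$, then for any choice of the variables,
$$
\pi(x_1, \dots, x_n) = \sum_{i=1}^n f_i(x_i) \le \sum_{i=1}^n f_i(x_i^*) = \pi(x_1^*, \dots, x_n^*),
$$
since the inequality holds term by term, so a choice of the $x_i$ that beats $(x_1^*, \dots, x_n^*)$ would force $f_i(x_i) > f_i(x_i^*)$ for some $i$.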

Note that if this were written as an optimization problem, it would be unconstrained (I believe constraints that couple the variables would break this result).
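For instance (a toy example of my own), take $f_1(x_1) = -x_1^2$ and $f_2(x_2) = -x_2^2$: each term is individually maximized at $x_1 = x_2 = 0$, but under the coupling constraint $x_1 + x_2 = 1$ that point is infeasible, and the constrained maximum of $\pi$ is instead
$$
x_1 = x_2 = \tfrac{1}{2}, \qquad \pi = -\tfrac{1}{2},
$$
so the term-by-term argument no longer goes through.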

It is pretty much a trivial property, but I see it used here and there and have not seen a discussion of why it is true.

Edit: Additionally, what if each term was instead the composition of functions? That is, what if we had
$$
\pi = g(f_1(x_1)) + g(f_2(x_2)) + g(f_3(x_3)) + \dots + g(f_n(x_n)) = \sum_{i=1}^n g(f_i(x_i))
$$
I believe the same argument applies, correct? Once again, if maximizing each term does not maximize the sum, then some term $g(f_i(x_i))$ must exceed its maximum, which is a contradiction.
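Indeed, each composite term still depends only on its own variable, so writing $h_i := g \circ f_i$ reduces this to the original case:
$$
\pi = \sum_{i=1}^n h_i(x_i), \qquad h_i(x_i) = g(f_i(x_i)).
$$
(If instead I wanted maximizing each $f_i$ itself to maximize $\pi$, I suppose $g$ would additionally need to be nondecreasing.)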

Edit 2: It would also then apply to multivariable functions, so long as each function depends on its own disjoint set of independent variables.
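Concretely, with disjoint blocks of variables $\mathbf{x}_i \in \mathbb{R}^{k_i}$, the same term-by-term bound gives
$$
\pi = \sum_{i=1}^n f_i(\mathbf{x}_i) \le \sum_{i=1}^n \max_{\mathbf{x}_i} f_i(\mathbf{x}_i),
$$
with equality exactly when each block $\mathbf{x}_i$ is chosen at a maximizer of its own $f_i$ (assuming those maxima exist).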

Thanks.

Best Answer

Hint:

$$\frac{\partial \pi}{\partial f_i} = 1 > 0$$
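One way to unpack the hint (assuming each $f_i$ is differentiable): $\pi$ is strictly increasing in the value of each summand, and because the $f_i$ depend on disjoint variables, the first-order conditions decouple by the chain rule,
$$
\frac{\partial \pi}{\partial x_i} = \frac{\partial \pi}{\partial f_i}\, f_i'(x_i) = f_i'(x_i) = 0,
$$
which is exactly the first-order condition for maximizing $f_i$ on its own.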