[Math] Minimizing a linear combination of convex functions

calculus, functions, optimization

Suppose a sequence of convex functions $f_1(x), f_2(x), f_3(x), \dots$ is given (expressions for their derivatives $\nabla f_1, \nabla f_2, \dots$ are also known). Now suppose a function $g(x)$ is a linear combination of these functions, e.g., $$g(x)=a_1\cdot f_1(x)+a_2\cdot f_2(x)+a_3\cdot f_3(x)+\dots$$ My question is: can $g(x)$ be minimized by first minimizing each $f_i(x)$ separately to find the corresponding $x_i$ at which $f_i(x)$ attains its minimum, and then claiming that a minimizer of $g(x)$ is $y=a_1x_1+a_2x_2+a_3x_3+\dots$? I tried to argue via the derivatives of each $f_i(x)$, but could not proceed.
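To sketch the derivative attempt (assuming each $f_i$ is differentiable): each separate minimizer satisfies $\nabla f_i(x_i)=0$, whereas a minimizer $x^\ast$ of $g$ must satisfy $$\nabla g(x^\ast)=\sum_i a_i\,\nabla f_i(x^\ast)=0,$$ and I do not see why this condition should hold at $y=\sum_i a_i x_i$.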

Best Answer

I think the answer is no. Consider two V-shaped convex functions with distinct minima, for instance $f_1(x)=|x|$ and $f_2(x)=|x-1|$.

With equal weights $a_1=a_2=1$, the set of minimizers of the sum is the entire interval between the two individual minimizers: $|x|+|x-1|=1$ everywhere on $[0,1]$. And if you perturb the weights even slightly, so that $a_1\neq a_2$, the minimizer suddenly snaps to one endpoint or the other, which in general is not the weighted combination of the individual minimizers.
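To spell out one concrete instance (numbers of my own choosing, with $a_1=2$ and $a_2=1$): $$g(x)=2|x|+|x-1|=\begin{cases}1-3x, & x\le 0,\\ 1+x, & 0\le x\le 1,\\ 3x-1, & x\ge 1,\end{cases}$$ so the unique minimizer of $g$ is $x^\ast=0$, while the proposed recipe gives $y=a_1x_1+a_2x_2=2\cdot 0+1\cdot 1=1$, and $g(1)=2>g(0)=1$.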

These functions are not differentiable, but that is not really an issue: just imagine smoothing out the tips of the functions a little, and the same basic argument applies.
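If you want to verify the smoothed version numerically, here is a quick sketch (the smoothing $\sqrt{x^2+\varepsilon}$, the coefficients, and all names below are my own choices, not anything from the question):

```python
import numpy as np
from scipy.optimize import minimize_scalar

eps = 1e-6  # smoothing parameter: sqrt(x^2 + eps) is a differentiable stand-in for |x|

def f1(x):
    return np.sqrt(x**2 + eps)          # smoothed |x|, minimizer near 0

def f2(x):
    return np.sqrt((x - 1)**2 + eps)    # smoothed |x - 1|, minimizer near 1

a1, a2 = 2.0, 1.0                       # unequal weights, as in the counterexample

def g(x):
    return a1 * f1(x) + a2 * f2(x)

# Minimize g directly on a bracket containing both individual minimizers.
res = minimize_scalar(g, bounds=(-1.0, 2.0), method="bounded")

x1, x2 = 0.0, 1.0                       # (approximate) minimizers of f1 and f2
y = a1 * x1 + a2 * x2                   # the proposed weighted combination

print(f"argmin g  ~ {res.x:.4f}")       # ~ 0, not 1
print(f"proposed y = {y:.4f},  g(y) = {g(y):.4f},  g(argmin) = {res.fun:.4f}")
```

Running this shows the minimizer of $g$ sitting essentially at $0$ (shifted by a tiny amount of order $\sqrt{\varepsilon}$ due to the smoothing), nowhere near the predicted $y=1$.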
