[Math] Does minimizing a sum of functions also minimize the sum of their squares, or maximize the sum of their inverses?

optimization

I have $n$ functions (say $f_1, \dots, f_n$) of $k$ variables (say $x_1, \dots, x_k$) each. The functions are all positive, as are the variables $x_i$. I do not have explicit expressions for these functions.
The objective is to minimize
$$\sum_{i=1}^n f_i \quad \text{subject to} \quad \sum_{i=1}^k x_i^2 = P,$$
where $P$ is a known constant.

In which case(s) can we assume that this is equivalent to minimizing the sum of the squares of these functions? That is,
$$\min \sum_{i=1}^n f_i^2 \quad \text{subject to} \quad \sum_{i=1}^k x_i^2 = P.$$
Also, in which case(s) can we assume that this is equivalent to maximizing the sum of the inverses of these functions? That is,
$$\max \sum_{i=1}^n \frac{1}{f_i} \quad \text{subject to} \quad \sum_{i=1}^k x_i^2 = P.$$
Assume the functions are nicely behaved: continuous, differentiable, and so on.
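(A quick remark: for $n=1$ the three problems trivially coincide, since $t \mapsto t^2$ is strictly increasing and $t \mapsto 1/t$ is strictly decreasing on $(0,\infty)$, so for any feasible points $x^*$ and $x$
$$f_1(x^*) \le f_1(x) \iff f_1(x^*)^2 \le f_1(x)^2 \iff \frac{1}{f_1(x^*)} \ge \frac{1}{f_1(x)}.$$
The question is therefore about $n \ge 2$, where no single monotone transformation links the three sums.)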

Best Answer

$x^{4/3}+y^{4/3}$ subject to $x^2+y^2=2$ is minimized at $(0,\sqrt2)$.

$x^{8/3}+y^{8/3}$ subject to $x^2+y^2=2$ is minimized at $(1,1)$.

So, in answer to the question in the title, minimizing the sum of functions does not necessarily minimize the sum of their squares.
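To verify this concretely, evaluate both objectives at the two points (restricting to $x, y \ge 0$, as in the question):
$$\text{at } (0,\sqrt 2):\quad x^{4/3}+y^{4/3}=2^{2/3}\approx 1.59, \qquad x^{8/3}+y^{8/3}=2^{4/3}\approx 2.52,$$
$$\text{at } (1,1):\quad x^{4/3}+y^{4/3}=2, \qquad x^{8/3}+y^{8/3}=2.$$
So the first objective is smaller at $(0,\sqrt2)$ while the second is smaller at $(1,1)$. Indeed, substituting $u=x^2$, $v=y^2$ with $u+v=2$ turns the first objective into the concave function $u^{2/3}+v^{2/3}$ (minimized at an endpoint of the segment) and the second into the convex function $u^{4/3}+v^{4/3}$ (minimized at the midpoint $u=v=1$), so the stated points are indeed the global minimizers.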
