[Math] Prove convexity/concavity of a complicated function

convex-optimization, convex-analysis

Can anyone help me prove the convexity/concavity of the following complicated function? I have tried several methods (the definition, first and second derivatives, etc.), but the function is so complicated that I could not complete a proof. However, when I plot it for many different parameter values, it always appears concave in $\rho$.

$$ f(\rho) = \frac{1}{\lambda}\Bigl( M\lambda\phi - \rho(\phi - \Phi)\ln(\rho + M\lambda) + \frac{1}{e^{(\rho + M\lambda)t}\rho + M\lambda}\cdot\Bigl( -(\rho + M\lambda)\bigl(e^{(\rho + M\lambda)t}\rho^2 t(\phi - \Phi)\bigr) $$
$$ + M\lambda(\phi + \rho t\phi - \rho t\Phi)\Bigr) + \rho\bigl(e^{(\rho + M\lambda)t}\rho + M\rho\bigr)(\phi - \Phi)\ln\bigl(e^{(\rho + M\lambda)t}\rho + M\lambda\bigr)\Bigr) $$

Note that $\rho > 0$ is the variable, and $M>0,\ \lambda>0,\ t>0,\ \phi>0,\ \Phi>0$ are constants that may take any positive values.
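For anyone who wants to reproduce the plots, here is a direct Python transcription of the formula, assuming the parenthesization exactly as displayed (the grouping in the post is hard to be certain of, so double-check it against your own source); the sample constants $M=\lambda=t=1$, $\phi=2$, $\Phi=1$ are arbitrary placeholders:

```python
import math

def f(rho, M=1.0, lam=1.0, t=1.0, phi=2.0, Phi=1.0):
    # Transcribed term by term from the displayed formula.
    E = math.exp((rho + M * lam) * t)      # e^{(rho + M*lam) t}
    D = E * rho + M * lam                  # e^{(rho + M*lam) t} rho + M*lam
    return (1.0 / lam) * (
        M * lam * phi
        - rho * (phi - Phi) * math.log(rho + M * lam)
        + (1.0 / D) * (
            -(rho + M * lam) * (E * rho**2 * t * (phi - Phi))
            + M * lam * (phi + rho * t * phi - rho * t * Phi)
        )
        # the post writes M*rho in this factor (it may be a typo for M*lam)
        + rho * (E * rho + M * rho) * (phi - Phi) * math.log(D)
    )

# Evaluate on a few points to compare against your own plots.
print([f(r) for r in (0.5, 1.0, 2.0)])
```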

Best Answer

I am a newcomer and would prefer to just leave a comment, but I see no "comment" button, so I will leave my suggestion here in the answer box.

I have often used the Nelder-Mead "derivative-free" algorithm (fminsearch in MATLAB) to minimize long, convoluted expressions like this one. If you can substitute the constraint equation $g(\rho)$ into $f(\rho)$, you can pass the result to the algorithm as the objective function and obtain the minimum, or at least a local minimum.
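In Python, SciPy's scipy.optimize.minimize plays the role of fminsearch; a minimal sketch (the objective below is just a stand-in with no connection to the posted formula; replace it with your own $f(\rho)$ after substituting $g(\rho)$):

```python
import math
from scipy.optimize import minimize

# Stand-in objective: replace with your own f(rho) after substituting
# the constraint g(rho) into it.
def objective(x):
    rho = x[0]
    return (rho - 2.0) ** 2 + math.log(1.0 + rho ** 2)

# Nelder-Mead needs only function values, no derivatives.
res = minimize(objective, x0=[1.0], method="Nelder-Mead")
print(res.x[0], res.fun)
```

Because the method never evaluates a derivative, it does not care how messy the closed form is, only that the function can be evaluated.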

You could also try heuristic methods such as simulated annealing or the great deluge algorithm. Heuristics give you a better chance of finding the global minimum when the solution space has multiple local minima, and in spite of their scary names they are actually quite simple algorithms.
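To back up the "quite simple" claim, here is a bare-bones simulated-annealing loop, again on a stand-in objective with several local minima; the bounds, step size, and cooling rate are arbitrary choices you would tune for your own problem:

```python
import math, random

random.seed(0)

# Stand-in objective with multiple local minima on [0, 5].
def objective(rho):
    return (rho - 2.0) ** 2 + math.sin(5.0 * rho)

lo, hi = 0.0, 5.0
rho = random.uniform(lo, hi)
best_rho, best_val = rho, objective(rho)
T = 1.0                                     # initial temperature
for step in range(20000):
    # Propose a random neighbour, clipped to the bounds.
    cand = min(hi, max(lo, rho + random.gauss(0.0, 0.2)))
    delta = objective(cand) - objective(rho)
    # Accept downhill moves always, uphill moves with probability e^{-delta/T}.
    if delta < 0 or random.random() < math.exp(-delta / T):
        rho = cand
        if objective(rho) < best_val:
            best_rho, best_val = rho, objective(rho)
    T *= 0.9995                             # geometric cooling
print(best_rho, best_val)
```

The occasional uphill acceptance is what lets the walk escape local minima while the temperature is still high.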

As for proving the concavity, I don't see the problem: you mention in your other post that both $g(\rho)$ and $f(\rho)$ have first and second derivatives, so checking the sign of the second derivative should be straightforward, right?
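Even when the closed-form second derivative is too messy to sign by hand, a central second difference gives a quick numerical check (a function is concave on an interval iff its second derivative is $\le 0$ there). Illustrated on $\ln\rho$, which is known to be concave; swap in your own $f$ once you trust its transcription:

```python
import math

# Central second-difference approximation of f''(rho).
def second_diff(f, rho, h=1e-4):
    return (f(rho + h) - 2.0 * f(rho) + f(rho - h)) / h**2

grid = [0.1 * k for k in range(1, 101)]     # rho in (0, 10]
vals = [second_diff(math.log, r) for r in grid]
print(all(v < 0 for v in vals))             # True: ln is concave on (0, inf)
```

This is of course only evidence, not a proof, but it will quickly expose a region where the claimed concavity fails.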
