Converting this non-linear minimization problem to linear

Tags: calculus, linear programming, linearization, nonlinear optimization

My Problem: I would like to convert the following non-linear minimization problem into a linear programming problem, so that I can solve it with the simplex method.

The non-linear function may have any shape but has only one stationary point (the global minimum); take, for example, $f(x)=(x+3)^2$.
My minimization problem is like the following (but can potentially have more levels of nested functions):

$$
\min_{x_1, x_2} \quad f\bigl(3f(-x_1 + x_2) + f(2x_1 - x_2)\bigr) \\
\text{s.t.} \quad 0 < x_1 < 1, \quad -1 < x_2 < 1
$$
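For concreteness, with the example $f(x)=(x+3)^2$ the nested objective can be written out as a plain function (a minimal sketch; the names `f` and `objective` are illustrative):

```python
# Illustrative sketch of the nested objective with f(x) = (x + 3)**2.
def f(x):
    return (x + 3) ** 2

def objective(x1, x2):
    # f(3*f(-x1 + x2) + f(2*x1 - x2))
    return f(3 * f(-x1 + x2) + f(2 * x1 - x2))

print(objective(0.5, -0.5))  # → 1242.5625
```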


What I tried: I thought of iteratively reducing my problem to another minimization problem over the argument of the outermost function, trying to keep that argument as close as possible to the minimizer of $f$. Something like this:

$$
\min_{x_1, x_2} \quad \bigl(3f(-x_1 + x_2) + f(2x_1 - x_2) - \operatorname{arg\,min}_x f(x)\bigr)^2 \\
\text{s.t.} \quad 0 < x_1 < 1, \quad -1 < x_2 < 1
$$

But I'm not sure whether this approach generalizes and, moreover, I'm not even sure that a solution to my problem exists. Thanks in advance for any insight!

Best Answer

This is not possible in general, because the simplex method only works for a linear objective and linear constraints. However, for the special case of a quadratic objective function a variant of the simplex method can be applied (see Philip Wolfe's simplex method for quadratic programming for details).

For your problem, can you compute gradients of your objective? If so, I suggest you use L-BFGS-B, a variant of the well-known L-BFGS method for nonlinear optimization that can also accommodate variable bounds. You can find this method implemented, e.g., in Python's scipy.optimize library.
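A minimal sketch of this suggestion for the example problem, assuming $f(x)=(x+3)^2$ and relaxing the strict inequalities to the closed box $0 \le x_1 \le 1$, $-1 \le x_2 \le 1$ (SciPy's `bounds` are closed intervals); if no gradient is supplied, `minimize` falls back to finite-difference approximations:

```python
# Sketch: minimizing the nested objective with L-BFGS-B via SciPy.
from scipy.optimize import minimize

def f(x):
    return (x + 3) ** 2

def objective(x):
    x1, x2 = x
    # f(3*f(-x1 + x2) + f(2*x1 - x2))
    return f(3 * f(-x1 + x2) + f(2 * x1 - x2))

res = minimize(objective,
               x0=[0.5, 0.0],              # arbitrary feasible start
               method="L-BFGS-B",
               bounds=[(0, 1), (-1, 1)])    # box constraints on x1, x2
print(res.x, res.fun)
```

Note that L-BFGS-B only guarantees a local minimum; for this particular objective, which has a single stationary point inside the box in each composition level, that is what you want.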
