[Math] Maximizing the minimum of piecewise linear functions in high dimensional space

global-optimization, oc.optimization-and-control

I'd like to compute

$\max_{x,t} t$ such that $\forall i$, $t \le a_i + \|x - b_i\|_\infty$,

where $a_1,\ldots, a_n \in \mathbb R$ and $b_1,\ldots,b_n \in [0,1]^{21}$ are fixed, $x \in [0,1]^{21}$, $\|\cdot\|_\infty$ is the $\sup$-norm, and $n$ is roughly $1000$.
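For concreteness, eliminating $t$ gives the equivalent problem of maximizing $f(x) = \min_i \left(a_i + \|x - b_i\|_\infty\right)$ over $x \in [0,1]^{21}$. A minimal NumPy sketch of that objective (the array names `a` and `b` are just placeholders for the fixed data):

```python
import numpy as np

def objective(x, a, b):
    """f(x) = min_i (a_i + ||x - b_i||_inf); a has shape (n,), b has shape (n, 21), x has shape (21,)."""
    return np.min(a + np.max(np.abs(x - b), axis=1))
```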

I understand this problem cannot be solved with a linear program since the feasible set is not convex. Also, because the dimensionality of $x$ is high, it seems impractical to partition the domain into regions where the constraints are linear, solve each region separately, and then take the maximum over all regions.

Is there any efficient way to get an exact solution? If not, is there any way to get a good upper bound?

Thanks much.

Best Answer

As in your previous question, this is a nonconvex optimization problem, so it won't be LP, SOCP, or SDP representable.

You've only got a 21-dimensional problem, and the constraint functions have an easy Lipschitz constant: each $a_i + \|x - b_i\|_\infty$ is 1-Lipschitz in the sup norm, and so is their minimum. If you've got the time (hours or days of computation) and really need a provably accurate solution (say, within 1% of optimal), then a branch and bound approach to this global optimization problem might be appropriate; see the sketch below. If you need a quick solution, then some kind of stochastic heuristic might yield a reasonable answer even though you can't prove that it's within some percentage of optimal.
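To make the branch and bound suggestion concrete, here is a rough sketch (not a tuned implementation) that uses the Lipschitz bound above: on a box with center $c$ and side widths $w_j$, $f(x) \le f(c) + \tfrac12 \max_j w_j$ for every $x$ in the box. The function name, node budget, and tolerance below are all placeholders of my choosing:

```python
import heapq
import numpy as np

def objective(x, a, b):
    # f(x) = min_i (a_i + ||x - b_i||_inf), with a of shape (n,) and b of shape (n, d)
    return np.min(a + np.max(np.abs(x - b), axis=1))

def branch_and_bound(a, b, dim=21, tol=1e-2, max_nodes=200_000):
    """Maximize objective over [0,1]^dim; returns (incumbent value, incumbent x, certified upper bound)."""
    lo0, hi0 = np.zeros(dim), np.ones(dim)
    c0 = 0.5 * (lo0 + hi0)
    best_val, best_x = objective(c0, a, b), c0
    root_ub = best_val + 0.5 * np.max(hi0 - lo0)   # Lipschitz bound: f(x) <= f(c) + max_j w_j / 2
    heap = [(-root_ub, 0, lo0, hi0)]               # max-heap on box upper bounds (negated for heapq)
    tie, nodes = 1, 0
    while heap and nodes < max_nodes:
        neg_ub, _, lo, hi = heapq.heappop(heap)
        if -neg_ub <= best_val + tol:              # no remaining box can beat the incumbent by > tol
            heapq.heappush(heap, (neg_ub, tie, lo, hi))
            break
        nodes += 1
        j = int(np.argmax(hi - lo))                # bisect the widest coordinate
        mid = 0.5 * (lo[j] + hi[j])
        for lo_j, hi_j in ((lo[j], mid), (mid, hi[j])):
            clo, chi = lo.copy(), hi.copy()
            clo[j], chi[j] = lo_j, hi_j
            c = 0.5 * (clo + chi)
            val = objective(c, a, b)               # box centers are feasible points in [0,1]^dim
            if val > best_val:
                best_val, best_x = val, c
            ub = val + 0.5 * np.max(chi - clo)     # Lipschitz upper bound over this child box
            if ub > best_val + tol:                # prune boxes that cannot improve by more than tol
                heapq.heappush(heap, (-ub, tie, clo, chi))
                tie += 1
    # Every pruned box had ub <= best_val + tol; surviving boxes keep their own upper bounds.
    remaining = max((-item[0] for item in heap), default=best_val)
    return best_val, best_x, max(remaining, best_val + tol)
```

The best-first order (always split the box with the largest upper bound) tends to close the gap fastest, but with $d = 21$ don't expect a tight certificate cheaply. For the quick route, a multistart heuristic (random starting points plus local search on $-f$, or an off-the-shelf method such as `scipy.optimize.differential_evolution` on $-f$ with bounds $[0,1]^{21}$) is the kind of stochastic approach meant above; it gives no bound on the gap to optimal.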