[Math] Removing a max function in the constraints

Tags: linear-programming, nonlinear-optimization, operations-research

Can the following problem be transformed into a linear programming problem:

Find $x_1,..,x_N$ which maximizes the objective function

$$\sum_{i=1}^{N}x_{i}\sum_{j=1}^{n_{i}}c_{ij}$$

subject to the constraints

$$\sum_{i=1}^{N}\sum_{j=1}^{n_{i}}b_{ij}\max\left(x_{i}c_{ij}-p_{ij},0\right)+x_{i}f_{i}\sum_{j=1}^{n_{i}}c_{ij}\leq C$$

$$x_{i}\geq0$$
where $c_{ij}$, $b_{ij}$, $p_{ij}$, $n_i$, $f_i$, and $C$ are known.

Is there a way to remove the max function from the constraints, perhaps by adding additional linear constraints? If not, what would be a good approach to solving this problem?

Best Answer

If you don't mind adding $\sum_i n_i$ new variables (and assuming $b$'s are non-negative), then you could introduce $y_{ij}$ such that \begin{align} y_{ij} &\geq x_ic_{ij}-p_{ij}\\ y_{ij} &\geq 0 \end{align}

Then modify the constraints on the $x$'s by substituting the $y$'s: \begin{align} \sum_{i=1}^{N}\sum_{j=1}^{n_{i}}b_{ij}y_{ij}+x_{i}f_{i}\sum_{j=1}^{n_{i}}c_{ij}&\leq C \tag{$\spadesuit$}\\ x_i &\geq 0 \end{align}

The objective function stays the same as before, that is, $$\sum_{i=1}^{N}x_{i}\sum_{j=1}^{n_{i}}c_{ij}.$$

Why does this work? Any feasible solution of your original constraints yields a feasible solution of the new ones: just set each $y_{ij}$ equal to the corresponding maximum $\max(x_ic_{ij}-p_{ij},0)$. Conversely, since the $b_{ij}$ are non-negative, any feasible solution of the new constraints can be turned into a feasible solution of the original ones: if some $y_{ij}$ exceeds that maximum, decrease it down to the maximum; the left-hand side of $(\spadesuit)$ does not increase because $b_{ij}\geq 0$, so the inequality still holds. Hence the optimum computed with the new constraints is neither better nor worse than the optimum of the original problem.
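For concreteness, here is a minimal sketch of the linearized LP solved with `scipy.optimize.linprog`. All the data ($c$, $b$, $p$, $f$, $C$) are made-up illustrative numbers, and I am interpreting the $x_if_i\sum_j c_{ij}$ term in the budget constraint as summed over $i$; adjust to your actual problem.

```python
# Linearized LP: variables z = (x_1..x_N, y_11..y_Nn), where each
# y_ij replaces max(x_i*c_ij - p_ij, 0) via y_ij >= x_i*c_ij - p_ij, y_ij >= 0.
import numpy as np
from scipy.optimize import linprog

# Hypothetical problem data: N = 2 groups, n_i = 2 items each.
c = np.array([[3.0, 2.0], [1.0, 4.0]])  # c[i][j]
b = np.array([[1.0, 1.0], [2.0, 1.0]])  # b[i][j] >= 0 (required for the trick)
p = np.array([[1.0, 1.0], [0.5, 2.0]])  # p[i][j]
f = np.array([0.1, 0.2])                # f[i]
C = 10.0                                # budget

N, n = c.shape
num_vars = N + N * n  # x's followed by y's (row-major in i, j)

# Objective: maximize sum_i x_i * sum_j c_ij  ->  minimize the negative.
obj = np.zeros(num_vars)
obj[:N] = -c.sum(axis=1)

A_ub, b_ub = [], []

# Budget constraint: sum_ij b_ij*y_ij + sum_i x_i*f_i*sum_j c_ij <= C.
row = np.zeros(num_vars)
row[:N] = f * c.sum(axis=1)
row[N:] = b.ravel()
A_ub.append(row)
b_ub.append(C)

# Linearization of the max: x_i*c_ij - y_ij <= p_ij for each (i, j).
for i in range(N):
    for j in range(n):
        row = np.zeros(num_vars)
        row[i] = c[i, j]
        row[N + i * n + j] = -1.0
        A_ub.append(row)
        b_ub.append(p[i, j])

# linprog's default bounds (0, None) already enforce x_i >= 0 and y_ij >= 0.
res = linprog(obj, A_ub=np.array(A_ub), b_ub=np.array(b_ub))
print(res.status, -res.fun)  # 0 means success; -res.fun is the maximized objective
```

Note that $x = 0$, $y = 0$ is always feasible here, so the solver should return a non-negative optimum; with $f_i > 0$ the budget term keeps the problem bounded.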

I hope this helps $\ddot\smile$