Converting a supermodular optimization problem to submodular optimization

convex optimization, discrete mathematics, discrete optimization, optimization

A constrained concave maximization problem can be converted to a constrained convex minimization problem by negating the objective function and keeping the constraints intact. But when converting constrained supermodular maximization to submodular minimization, it seems we also have to change the constraints.

For example, if I have a supermodular optimization problem,

\begin{equation}
\max_{S \subseteq V}\, f(S)\quad \text{s.t.}\ \lvert S \rvert \leq k
\end{equation}

to convert it to a submodular minimization problem, it does not make sense to keep the constraint $\lvert S \rvert \leq k$. The only constraint that makes sense here is $\lvert S \rvert \geq k$. Otherwise, we can always return a non-empty solution by setting $k=1$. Does that mean we need to reverse the inequality constraints in the case of discrete optimization?

EDIT: Assume the function is normalized ($f(\emptyset) = 0$) and monotonically increasing. In this case I think maximization of $f$ is equivalent to minimization of $-f$.

Best Answer

Let $f:2^{\{1,2\}} \rightarrow \mathbb{R}$ with

$$f(\{1\})=100$$ $$f(\{2\})=\frac{1}{2}$$ $$f(\{1,2\})= 101$$ $$f(\emptyset)=0$$

$f$ is supermodular. The solution of \begin{equation} \max_{S \subseteq V}\, f(S)\quad \text{s.t.}\ \lvert S \rvert \leq k \end{equation}

for $k=1$ is $\{1\}$ with $f(\{1\})=100$.
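As a sanity check, the supermodularity of this $f$ can be verified by brute force: a set function is supermodular iff $f(A \cup B) + f(A \cap B) \geq f(A) + f(B)$ for all $A, B$. A small Python sketch (the `frozenset` keys stand in for subsets of $V = \{1,2\}$):

```python
# The example function from the answer, keyed by frozenset.
f = {
    frozenset(): 0,
    frozenset({1}): 100,
    frozenset({2}): 0.5,
    frozenset({1, 2}): 101,
}

# Supermodularity: f(A | B) + f(A & B) >= f(A) + f(B) for all pairs A, B.
is_supermodular = all(
    f[A | B] + f[A & B] >= f[A] + f[B]
    for A in f
    for B in f
)
print(is_supermodular)  # True
```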

If we now minimize $-f$ and turn around the constraint direction, we obtain the following problem:

\begin{equation} \min_{S \subseteq V}\, -f(S)\quad \text{s.t.}\ \lvert S \rvert \geq 1 \end{equation}

The optimum is now at $S=\{1,2\}$, since $-f(\{1,2\}) = -101$ is the smallest attainable value. Thus the two problems are not equivalent, and merely switching the constraint direction is not enough.
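The discrepancy is easy to reproduce by brute-force enumeration over all subsets of $V=\{1,2\}$. A small Python sketch of the two problems above, using the same example $f$:

```python
from itertools import combinations

V = [1, 2]
k = 1

# The supermodular function from the counterexample.
f = {
    frozenset(): 0,
    frozenset({1}): 100,
    frozenset({2}): 0.5,
    frozenset({1, 2}): 101,
}

def subsets(ground):
    """Yield every subset of `ground` as a frozenset."""
    for r in range(len(ground) + 1):
        for combo in combinations(ground, r):
            yield frozenset(combo)

# Original problem: maximize f(S) subject to |S| <= k.
max_sol = max((S for S in subsets(V) if len(S) <= k), key=lambda S: f[S])

# Candidate reformulation: minimize -f(S) subject to |S| >= k.
min_sol = min((S for S in subsets(V) if len(S) >= k), key=lambda S: -f[S])

print(sorted(max_sol), f[max_sol])  # [1] 100
print(sorted(min_sol), f[min_sol])  # [1, 2] 101
```

The maximization problem returns $\{1\}$, while the proposed minimization returns $\{1,2\}$, confirming that the two feasible regions select different optima.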
