Constrained Optimization problem to unconstrained problem

convex-analysis, nonlinear-optimization, optimization

I have a constrained optimization problem that I would like to convert to an unconstrained problem. Specifically, the problem is constrained to a convex set:
$$ A \equiv \min_{x \in \mathbb{X}} \ f(x) \mbox{ where } \mathbb{X} \mbox{ is a convex set.}$$
Furthermore, $f(x)$ is Lipschitz continuous: there exists some $L$ such that
$$|f(x) - f(y)| \le L\|x-y\|\ \ \ \forall x,y \in\mathbb{R}^n.$$
I would like to show the above constrained problem is equivalent to the unconstrained problem $$ B \equiv\min_{x \in \mathbb{R}^n}\ f(x) \ +\ c\cdot\operatorname{dist}(x,\mathbb{X}),$$ where $\operatorname{dist}(x,\mathbb{X}) = \inf_{y \in \mathbb{X}} \|x - y\|$.

I would like to show that $A \equiv B$ and find the constant $c$.

Best Answer

Take $c > L$ and assume $\mathbb X$ is closed. If $x \notin \mathbb X$, then $\operatorname{dist}(x,\mathbb X) > 0$, and since $c/L > 1$ there is $y \in \mathbb X$ with $\|x - y\| < (c/L) \operatorname{dist}(x,\mathbb X)$. By the Lipschitz condition, $$f(x) + c\cdot \operatorname{dist}(x,\mathbb X) \ge f(y) - L \|x-y\| + c \cdot \operatorname{dist}(x,\mathbb X) > f(y) - c\cdot\operatorname{dist}(x,\mathbb X) + c\cdot\operatorname{dist}(x,\mathbb X) = f(y).$$ Thus a global minimum of $f + c\cdot\operatorname{dist}(\cdot,\mathbb X)$ can only occur in $\mathbb X$, where the penalty term vanishes, so the two problems have the same minimum value and the same minimizers.
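As a sanity check, here is a small numerical sketch of the exact-penalty claim on a one-dimensional example of my own choosing (not from the question): $f(x) = |x - 2|$ is Lipschitz with $L = 1$, $\mathbb X = [-1, 1]$ is convex and closed, and the penalty weight is $c = 2 > L$. On a fine grid, the penalized minimum over $\mathbb R$ should agree with the constrained minimum over $\mathbb X$, attained at the same point.

```python
import numpy as np

L_const = 1.0  # Lipschitz constant of f below
c = 2.0        # penalty weight, chosen with c > L_const

def f(x):
    # f(x) = |x - 2|, Lipschitz with constant 1
    return np.abs(x - 2.0)

def dist(x, lo=-1.0, hi=1.0):
    # distance from x to the interval X = [lo, hi]
    return np.maximum(0.0, np.maximum(lo - x, x - hi))

xs = np.linspace(-5.0, 5.0, 100001)   # grid standing in for "all of R"
in_X = (xs >= -1.0) & (xs <= 1.0)

constrained_min = f(xs[in_X]).min()    # A: min of f over X
penalized = f(xs) + c * dist(xs)
penalized_min = penalized.min()        # B: min of penalized f over R
argmin = xs[penalized.argmin()]        # where the penalized minimum occurs

print(constrained_min, penalized_min, argmin)
```

Both minima come out equal (here both are $1$, attained at $x = 1 \in \mathbb X$), illustrating that the penalized minimizer lands inside $\mathbb X$ when $c > L$.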
