Convexification of Difference of Convex Functions

Tags: approximation-theory, convexity, real-analysis

I am looking for a reference/a hint to the following problem:

We are given convex functions $f_1(x),f_2(x)$ (say, on $\mathbb R^d$) such that $f_1(x) \to\infty$ as $\|x\|\to\infty$. Also, there is an $\alpha > 0$ such that $f_1(x) - \alpha f_2(x) \geq 0$ for all $x$.

A first consequence is that for any $\epsilon > 0$,
$$f_1(x) - (\alpha - \epsilon)f_2(x) \to \infty \quad \text{as } \|x\|\to \infty,$$
because $f_1(x) - (\alpha - \epsilon)f_2(x) = \frac{\epsilon}{\alpha }f_1(x) + \frac{\alpha-\epsilon}{\alpha }\bigl(f_1(x) - \alpha f_2(x)\bigr) \geq \frac{\epsilon}{\alpha}f_1(x) \to \infty$ as $\|x\|\to \infty$.

Now my question is whether (and under which conditions) it is possible to find an asymptotic approximation $g_2(x)$ such that $|g_2(x)-f_2(x)| \to 0$ as $\|x\|\to \infty$ and

$$f_1(x) – \beta g_2(x)$$ is convex for some $\beta \in (0,\alpha]$.


The concrete example I have in mind is the following: $X = \mathbb R^d$ and $f_1(x) = \sum_{k=1}^d \frac{x_k^2}{\sigma_k^2}$ (a weighted squared $l^2$-norm) and $f_2(x) = \|x\|_1^2$ (the squared $l^1$-norm).

It can be shown that $f_1(x) – \alpha f_2(x) \geq 0$ for $\alpha \leq (\sum_{k=1}^d \sigma_k^2)^{-1}$.
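(A short derivation of this bound, which I believe is just the Cauchy–Schwarz inequality applied to the vectors $(\sigma_k)_k$ and $(|x_k|/\sigma_k)_k$:
$$\|x\|_1^2 = \left(\sum_{k=1}^d \sigma_k \cdot \frac{|x_k|}{\sigma_k}\right)^2 \leq \left(\sum_{k=1}^d \sigma_k^2\right)\left(\sum_{k=1}^d \frac{x_k^2}{\sigma_k^2}\right) = \left(\sum_{k=1}^d \sigma_k^2\right) f_1(x),$$
so $f_1(x) - \alpha f_2(x) \geq 0$ whenever $\alpha \leq (\sum_{k=1}^d \sigma_k^2)^{-1}$.)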

Unfortunately, $f_1(x) - \beta f_2(x)$ is not convex for any $\beta\in (0,\alpha]$. My question in this specific example corresponds to finding a function $g_2(x)$ such that $|g_2(x)-f_2(x)|$ is "asymptotically small", e.g. $g_2(x) \geq f_2(x)$ and $|g_2(x)-f_2(x)|\to 0$ as $\|x\|_1\to\infty$. In addition, for some $\beta \in (0,\alpha]$, I want

$$f_1(x) -\beta g_2(x)$$

to be convex. Note again that $f_1(x)$ is convex, and I only need convexity of $f_1 - \beta g_2$ for an arbitrarily small $\beta \in (0,\alpha)$.
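The non-convexity of $f_1 - \beta f_2$ can be checked numerically via a midpoint-convexity violation. A minimal sketch (my own, not from the question), taking $d=2$ and $\sigma_k=1$ so that $\alpha = 1/2$, with points mirrored in the first coordinate far out along the second axis:

```python
import numpy as np

def f1(x):                      # weighted squared l2 norm (all weights 1 here)
    return float(np.sum(x ** 2))

def f2(x):                      # squared l1 norm
    return float(np.sum(np.abs(x)) ** 2)

def midpoint_gap(h, x, y):
    """h(x) + h(y) - 2*h((x+y)/2); a negative value shows h is not convex."""
    return h(x) + h(y) - 2.0 * h((x + y) / 2.0)

beta = 0.01                     # arbitrarily small beta in (0, alpha)
h = lambda v: f1(v) - beta * f2(v)

u = 1000.0                      # how far out along the second axis we go
x, y = np.array([-1.0, u]), np.array([1.0, u])
gap = midpoint_gap(h, x, y)
print(gap)                      # negative: h violates midpoint convexity
```

Here the $f_1$ part of the gap is constantly $2$, while the $f_2$ part is $2 + 4u$, so the gap is $2 - \beta(2+4u)$, which is negative once $u$ is large, for any fixed $\beta > 0$.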


PS: I have thought a bit more about this problem and it looks like the following might work (at least in this specific setting):

$g_2(x) := \left(\sum_{k=1}^d \left(\sqrt{\sigma_k^2 + |x_k|^2} - \sigma_k\right)\right)^2$

Best Answer

$\newcommand\R{\mathbb R}$In general, the answer is no.

E.g., let $d=1$, \begin{equation} f_1(x):=\sum_{j=1}^\infty(|x|-2j)_+,\quad f_2(x):=\sum_{j=1}^\infty(|x|-2j-1)_+, \end{equation} where $u_+:=\max(0,u)$ -- so that $f_1$ and $f_2$ are convex functions, $f_1(x)\to\infty$ as $|x|\to\infty$, and $f_1\ge f_2$.

However, for any $b\in(0,1)$, for the function $$h:=f_1-bf_2,$$ and for any natural $n$, we have \begin{equation} h(2n)-2h(2n+1)+h(2n+2)=-b<0. \end{equation} So, if a function $g_2$ is such that $g_2(x)-f_2(x)\to0$ as $|x|\to\infty$, then for $g:=f_1-bg_2$ we have $g(x)-h(x)\to0$ as $|x|\to\infty$, whence \begin{equation} g(2n)-2g(2n+1)+g(2n+2)<0 \end{equation} for all large enough natural $n$. So, $g$ is not convex for any $b\in(0,1)$.
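A quick numerical sanity check of this counterexample (my own sketch, truncating the infinite sums at a level $J$ large enough that all remaining terms vanish):

```python
import numpy as np

J = 10_000                      # truncation level; exact once 2*J > |x|

def f1(x):
    j = np.arange(1, J + 1)
    return float(np.sum(np.maximum(abs(x) - 2 * j, 0.0)))

def f2(x):
    j = np.arange(1, J + 1)
    return float(np.sum(np.maximum(abs(x) - 2 * j - 1, 0.0)))

b = 0.5
h = lambda x: f1(x) - b * f2(x)

# The second difference of h at even integers equals -b for every n.
for n in (1, 5, 50):
    d2 = h(2 * n) - 2 * h(2 * n + 1) + h(2 * n + 2)
    print(n, d2)                # each value is -b
```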


In your concrete example, the answer is no as well. Indeed, let $\sigma_k=1$ for all $k$, and let $x:=(-1,u,0,\dots,0)\in\R^d$, $y:=(1,u,0,\dots,0)\in\R^d$, and $z:=(x+y)/2=(0,u,0,\dots,0)$, with $u\to\infty$. Then \begin{equation} f_1(x)-2f_1(z)+f_1(y)=2(1+u^2)-2u^2=2, \end{equation} whereas \begin{equation} f_2(x)-2f_2(z)+f_2(y)=2(1+u)^2-2u^2\to\infty. \end{equation} So, if $g_2(v)-f_2(v)\to0$ as $\|v\|\to\infty$, then for any $b\in(0,1)$ and for the function $g:=f_1-bg_2$, we have $g(x)-2g(z)+g(y)\to-\infty$, so that $g$ is not convex for any $b\in(0,1)$.
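The second differences in this argument are easy to confirm numerically; a minimal sketch (my own, using $d=2$ since the remaining coordinates are zero):

```python
import numpy as np

f1 = lambda v: float(np.sum(v ** 2))           # sum of x_k^2 / sigma_k^2, sigma_k = 1
f2 = lambda v: float(np.sum(np.abs(v)) ** 2)   # squared l1 norm

for u in (10.0, 100.0, 1000.0):
    x, y = np.array([-1.0, u]), np.array([1.0, u])
    z = (x + y) / 2
    print(f1(x) - 2 * f1(z) + f1(y))   # constantly 2
    print(f2(x) - 2 * f2(z) + f2(y))   # equals 2 + 4u, diverging with u
```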
