Do increasing differences imply convexity of a function?

convex-analysis, examples-counterexamples, real-analysis

If a function $f: \mathbb{R} \to \mathbb{R}$ is convex, then it has increasing differences: for all $y \geq x$ and $t \geq 0$,
$$
f(y+t) - f(y) \geq f(x + t) - f(x).
$$

This follows easily by writing both $x+t$ and $y$ as convex combinations of $x$ and $y+t$, using convexity of $f$ and adding the corresponding inequalities.
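For reference, here is one way to spell that argument out (with $\lambda := t/(y+t-x)$, a symbol introduced only for this sketch). Assume $x < y + t$; otherwise $x = y$ and $t = 0$, and the inequality is trivial. Then $\lambda \in [0,1]$ and
$$
y = \lambda x + (1 - \lambda)(y + t), \qquad x + t = (1 - \lambda) x + \lambda (y + t),
$$
so convexity of $f$ gives
$$
f(y) \leq \lambda f(x) + (1 - \lambda) f(y + t), \qquad f(x + t) \leq (1 - \lambda) f(x) + \lambda f(y + t).
$$
Adding these two inequalities and rearranging yields $f(y+t) - f(y) \geq f(x+t) - f(x)$.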

But what about the converse: does having increasing differences imply that $f$ is a convex function?

What I tried: I can establish midpoint convexity. Let $a, b \in \mathbb{R}$. W.l.o.g. $a \leq b$. Increasing differences with $x = a$, $y = (a+b)/2$, and $t = y-a = (b-a)/2$ gives
$$
f(b)-f\left(\frac{a+b}{2}\right) \geq f\left(\frac{a+b}{2}\right) - f(a),
$$

so that
$$f\left(\frac{a+b}{2}\right)\leq \frac{1}{2}f(a) + \frac{1}{2} f(b),
$$

the requirement for midpoint convexity. From this, a standard argument extends the inequality to convex combinations $\lambda a + (1 - \lambda) b$ with rational $\lambda$ in the unit interval, but I don't see whether it holds for all real $\lambda$.
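For concreteness, here is one instance of that standard argument, my write-up of the dyadic induction step (the weights $\lambda_1, \lambda_2$ with denominator $2^n$ are notation introduced here): if the inequality holds for $\lambda_1$ and $\lambda_2$, then midpoint convexity gives it for $\lambda = \frac{\lambda_1 + \lambda_2}{2}$, since
$$
f\bigl(\lambda a + (1 - \lambda) b\bigr)
= f\left(\frac{\bigl(\lambda_1 a + (1-\lambda_1) b\bigr) + \bigl(\lambda_2 a + (1-\lambda_2) b\bigr)}{2}\right)
\leq \frac{1}{2} f\bigl(\lambda_1 a + (1-\lambda_1) b\bigr) + \frac{1}{2} f\bigl(\lambda_2 a + (1-\lambda_2) b\bigr)
\leq \lambda f(a) + (1 - \lambda) f(b).
$$
This covers all dyadic $\lambda \in [0,1]$; general rational $\lambda$ follow from the same idea via the usual Cauchy-induction argument.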

Best Answer

There exist discontinuous additive functions $f:\mathbb{R} \to \mathbb{R}$. Since convex functions on $\mathbb{R}$ are necessarily continuous, these functions are not convex. But they obviously satisfy the hypothesis.
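To spell out the last claim (my elaboration, not part of the original answer): if $f$ is additive, i.e. $f(u + v) = f(u) + f(v)$ for all $u, v \in \mathbb{R}$, then for every $x$, $y$ and $t$,
$$
f(y + t) - f(y) = f(t) = f(x + t) - f(x),
$$
so the increasing-differences condition holds with equality. Discontinuous additive functions, which are not of the form $f(u) = cu$ and hence not convex, can be constructed from a Hamel basis of $\mathbb{R}$ over $\mathbb{Q}$.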