[Math] Local minima are global for convex functions

convex-optimization, functional-analysis, optimization, topological-vector-spaces

Let $f:\mathbb R^n\to \mathbb R$ be a convex function.

Show that any local minimum point is also a global minimum point.

I've seen a proof by contradiction here (end of p. 2), but I'd like to prove statements directly whenever I can.

The idea is simple: because $x_0$ is a local minimum point, there is a neighborhood $V$ of $x_0$ satisfying
$$\forall x\in V, f(x_0)\leq f(x) $$
Pick $x\in\mathbb R^n$ and choose $\lambda\in(0,1)$ such that the convex combination satisfies $\lambda x_0 + (1-\lambda) x\in V$; then
$$f(x_0) \leq f(\lambda x_0 + (1-\lambda) x) \leq \lambda f(x_0) + (1-\lambda)f(x)\xrightarrow[\lambda\to 0]{}f(x) $$
The limit exists because vector addition and scalar multiplication are continuous operations. But surely, as $\lambda \to 0$, we potentially exit the set $V$. Are we allowed to conclude $f(x_0)\leq f(x)$? If so, why isn't it a problem that the convex combinations along the path $\lambda \to 0$ aren't all in $V$?
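To make the worry concrete, here is a minimal numerical illustration. The function, the minimizer $x_0 = 0$, the point $x$, and the ball radius below are illustrative choices, not part of the question: for a fixed neighborhood $V$, the points $\lambda x_0 + (1-\lambda)x$ leave $V$ once $\lambda$ gets small, so the first inequality in the chain is no longer backed by locality.

```python
import numpy as np

# Fix a ball V = B(x0, r) around the local minimizer and watch
# y(lam) = lam*x0 + (1-lam)*x as lam -> 0.
# (x0, x, r are illustrative choices, not from the question.)
x0 = np.zeros(2)
x = np.array([5.0, 0.0])
r = 1.0  # V = closed ball of radius r around x0

for lam in (0.9, 0.5, 0.1, 0.01):
    y = lam * x0 + (1 - lam) * x
    in_V = np.linalg.norm(y - x0) <= r
    print(f"lam={lam:<5} y={y}  in V: {in_V}")

# Only lam close to 1 keeps y inside V; as lam -> 0 the point y
# tends to x and exits V, so f(x0) <= f(y) is not justified there.
```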

Best Answer

Take any $x \in \mathbb{R}^n$ and choose $\lambda \in (0,1)$ such that $y:= \lambda x_0 + (1-\lambda)x \in V$. Then $f(y) \leqslant \lambda f(x_0) +(1-\lambda)f(x)$ by convexity. But $f(x_0) \leqslant f(y)$, so we have $f(y) \leqslant \lambda f(y) + (1-\lambda)f(x)$, i.e. $(1-\lambda)f(y) \leqslant (1-\lambda)f(x)$. Since $1-\lambda > 0$, this gives $f(y) \leqslant f(x)$, and therefore $f(x_0) \leqslant f(y) \leqslant f(x)$. No limit is needed: a single $\lambda$ close enough to $1$ suffices.
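As a sanity check of this argument, here is a small numerical sketch using the illustrative convex function $f(x)=\|x\|^2$ with local (hence global) minimizer $x_0=0$; neither appears in the answer itself. It verifies the chain $f(x_0) \leqslant f(y) \leqslant \lambda f(x_0) + (1-\lambda)f(x)$ and the conclusion $f(x_0) \leqslant f(x)$ for one fixed $\lambda$ close to $1$, with no limit taken.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    return np.dot(x, x)  # convex, minimized at the origin

x0 = np.zeros(3)  # local minimizer (illustrative choice)
lam = 0.99        # a single lambda close to 1, so y lands near x0

for _ in range(1000):
    x = rng.normal(size=3)
    y = lam * x0 + (1 - lam) * x
    # chain of inequalities from the answer (small tolerance for
    # floating-point roundoff)
    assert f(x0) <= f(y) + 1e-12
    assert f(y) <= lam * f(x0) + (1 - lam) * f(x) + 1e-12
    # the conclusion: the local minimum value is a global lower bound
    assert f(x0) <= f(x) + 1e-12

print("chain of inequalities verified on 1000 random points")
```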
