Show that for strictly convex functions the subgradients $v$ satisfy $f(x)>f(x_0)+v(x-x_0)$

Tags: convex-analysis, non-smooth-analysis, subgradient

I am beginning to study convex optimization and I found the following problem.

Let $E \subset \mathbb{R}^n$ be a convex set and $f : E \to \mathbb{R}$ a convex, not necessarily differentiable, function. We defined a subgradient of $f$ at $x_0$ to be a vector $v\in \mathbb{R}^n$ such that $$f(x)\geq f(x_0)+v^t(x-x_0), \quad\forall x\in E.$$
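As a concrete illustration of this definition (my own example, not from the question): for the convex but non-differentiable function $f(x)=|x|$ on $\mathbb{R}$, every $v\in[-1,1]$ satisfies the subgradient inequality at $x_0=0$, since $|x|\geq vx$ whenever $|v|\leq 1$. A quick numeric check:

```python
import numpy as np

# f(x) = |x| is convex but not differentiable at 0.
# Any v with |v| <= 1 is a subgradient at x0 = 0: |x| >= 0 + v*(x - 0).
f = np.abs
x0, v = 0.0, 0.5          # v = 0.5 lies in the subdifferential [-1, 1] at 0
xs = np.linspace(-2.0, 2.0, 401)

# Subgradient inequality holds on the whole grid.
assert np.all(f(xs) >= f(x0) + v * (xs - x0))
```

Note that the inequality here is an equality along the ray $x\geq 0$ when $v=1$, which is why plain convexity only gives $\geq$, not $>$.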

Later in a proof the professor uses the fact that if $f$ is strictly convex, then the previous inequality is strict for all $x\neq x_0$, which I am failing to see. Any ideas on how to prove this?

Best Answer

This is proved in Optimierungsmethoden (Jungnickel, 2nd ed., 2008), Theorem 3.1.12 (page 86):

By definition of the subgradient at the point $x_0\in \mathbb{R}^n$, it holds for all $x\neq x_0$ that $f(x)\geq f(x_0)+v^t(x-x_0)$. Suppose, for the sake of contradiction, that there is an $x\neq x_0$ with equality, $f(x)=f(x_0)+v^t(x-x_0)$. Once we derive a contradiction, it follows that $f(x)>f(x_0)+v^t(x-x_0)$ must hold for all $x\neq x_0$.

By strict convexity we get, for all $t\in\,]0,1[$, $$f(tx_0+(1-t)x)<tf(x_0)+(1-t)f(x)=f(x_0)+(1-t)v^t(x-x_0),$$ where the last equality uses the assumed equality $f(x)=f(x_0)+v^t(x-x_0)$. On the other hand, for the point $tx_0+(1-t)x\in E$, the subgradient condition at $x_0$ gives $$f(tx_0+(1-t)x)\geq f(x_0)+v^t(tx_0+(1-t)x-x_0)=f(x_0)+(1-t)v^t(x-x_0).$$ This contradicts the strict inequality above.
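The strict inequality can also be seen numerically in a simple strictly convex case (my own example, with my own choice of $f$, $x_0$, and $v$): for $f(x)=x^2$, $x_0=1$, and the unique subgradient $v=f'(x_0)=2$, the gap $f(x)-f(x_0)-v(x-x_0)$ equals $(x-1)^2$, which is strictly positive for every $x\neq x_0$:

```python
import numpy as np

# Strictly convex example: f(x) = x^2, x0 = 1, v = f'(x0) = 2.
# The subgradient gap f(x) - f(x0) - v*(x - x0) simplifies to (x - 1)^2,
# so it is zero only at x0 and strictly positive everywhere else.
x0, v = 1.0, 2.0
xs = np.linspace(-3.0, 3.0, 13)   # coarse grid containing x0 = 1.0 exactly
gap = xs**2 - (x0**2 + v * (xs - x0))

assert np.all(gap >= 0)           # plain subgradient inequality
assert np.all(gap[xs != x0] > 0)  # strict away from x0, as the theorem states
```

The grid is deliberately coarse so that $x_0$ is hit exactly and excluded by the mask; the point of the check is only that the gap vanishes at $x_0$ and is strictly positive elsewhere.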
