[Math] Is there a Lipschitz property for the subdifferential?

convex-optimization, convex-analysis, optimization

I'm trying to bound the quantity
$\langle \nabla \Psi(x),\bar{x}-x \rangle$ from above, with the bound depending on $\|x-\bar{x}\|$ and perhaps also on $\|x-y\|$ for some fixed (not varying) points $y$. Here $\Psi:X\to \mathbb{R}$, with $X$ a finite-dimensional Banach space (or simply $\mathbb{R}^n$).
And $\Psi$ is a $\mu$-strongly convex function (with $\mu>0$) that can be written as $\Psi=f+g$, where $f$ is convex and differentiable with $\nabla f$ $L$-Lipschitz continuous, and $g$ is $\mu$-strongly convex.
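For a concrete instance of this structure (just an illustration, not the exact problem I have in mind), one can keep in mind

$f(x)=\frac{1}{2}\|Ax-b\|_2^2, \qquad g(x)=\|x\|_1+\frac{\mu}{2}\|x\|_2^2,$

so that $\nabla f(x)=A^\top(Ax-b)$ is Lipschitz with $L=\|A^\top A\|$, $g$ is $\mu$-strongly convex, and $\Psi=f+g$ is $\mu$-strongly convex but not differentiable.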

I know that if $\Psi$ were differentiable with $\nabla\Psi$ $L$-Lipschitz continuous, one could fix some point $x^*$ in the optimal set (so that $\nabla\Psi(x^*)=0$) and bound

$\langle \nabla \Psi(x), \bar{x}-x \rangle \leq \|\nabla \Psi(x)\|\|\bar{x}-x\| = \|\nabla \Psi(x)-\nabla \Psi(x^*)\|\|\bar{x}-x\| \leq L\|x-x^*\|\|\bar{x}-x\|$

And the bound is done.
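For concreteness, here is a small numerical sanity check of that chain of inequalities on a quadratic $\Psi(x)=\frac{1}{2}x^\top A x+b^\top x$ (just an illustration; here $L$ is the largest eigenvalue of $A$ and $x^*=-A^{-1}b$):

    # Numerical sanity check of the smooth-case bound (illustration only):
    #   <grad Psi(x), xbar - x> <= ||grad Psi(x)|| ||xbar - x|| <= L ||x - x*|| ||xbar - x||
    # for Psi(x) = 0.5 x^T A x + b^T x with A symmetric positive definite,
    # so grad Psi is L-Lipschitz with L = lambda_max(A) and x* = -A^{-1} b.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 5
    M = rng.standard_normal((n, n))
    A = M @ M.T + np.eye(n)              # symmetric positive definite
    b = rng.standard_normal(n)

    L = np.linalg.eigvalsh(A)[-1]        # Lipschitz constant of grad Psi
    x_star = np.linalg.solve(A, -b)      # minimizer: grad Psi(x*) = 0

    def grad_psi(x):
        return A @ x + b

    for _ in range(1000):
        x, xbar = rng.standard_normal(n), rng.standard_normal(n)
        lhs = grad_psi(x) @ (xbar - x)
        mid = np.linalg.norm(grad_psi(x)) * np.linalg.norm(xbar - x)
        rhs = L * np.linalg.norm(x - x_star) * np.linalg.norm(xbar - x)
        assert lhs <= mid + 1e-9 and mid <= rhs + 1e-9
    print("smooth-case bound holds on all random samples")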
So my question is: is there an analogue of this property in the non-differentiable case? I know that I can pick a point $x^*$ in the optimal set such that $0 \in \partial \Psi(x^*)$, but can I then say that a $v \in \partial \Psi(x)$ satisfies

$\|v\| = \|v-0\| \leq L\|x-x^*\|,$

or something along those lines?

Any help is appreciated

Best Answer

The answer is no. On the real line consider $\Phi(x)=|x|$ (and add a smooth convex function with its minimum at zero, e.g. $\frac{\mu}{2}x^2$, if you want strong convexity). The minimizer is $x^*=0$, but the subgradient at any positive point, however close to zero, has absolute value at least $1$, so no bound of the form $\|v\|\leq L\|x-x^*\|$ can hold.
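For concreteness, with $\Phi(x)=|x|+\frac{\mu}{2}x^2$ (which is $\mu$-strongly convex, matching the question's setup with $f=0$ and $g=\Phi$):

$\partial \Phi(x) = \{1+\mu x\}$ for $x>0$, $\quad \partial\Phi(0)=[-1,1]$, $\quad \partial\Phi(x)=\{-1+\mu x\}$ for $x<0$, $\quad x^*=0.$

For $x>0$ the unique subgradient $v=1+\mu x$ satisfies $|v|\geq 1$, while $L|x-x^*|=Lx\to 0$ as $x\to 0^+$, so no constant $L$ can give $|v|\leq L|x-x^*|$ near the minimizer.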
