Convex Optimization – Proving Convexity of a Function with Positive Semidefinite Hessian

continuity, convex optimization, derivatives, optimization

Let $C$ be a convex set in $\mathbb{R}^n$ and let $f:{\mathbb{R}}^n \rightarrow \mathbb{R}$ be twice continuously differentiable over $C$.

The Hessian of $f$ is positive semidefinite over $C$, and I want to show that $f$ is therefore a convex function.

I am currently trying to apply Taylor's Theorem to replace $f(x)$ with an expression that includes its Hessian.

Best Answer

I'm assuming what you would like to show is that if $f$ has a positive semidefinite Hessian, then for all $\mathbf{x}, \mathbf{y}$ in the domain and all $t \in [0,1]$ we have
$$ f(t \mathbf{x} + (1-t) \mathbf{y}) \le t f(\mathbf{x}) + (1-t) f(\mathbf{y}). $$

To reduce this to the one-dimensional case, fix $\mathbf{x}$ and $\mathbf{y}$ and restrict $f$ to the line segment connecting those points. That is, define the one-dimensional function
$$g(t) = f(t \mathbf{x} + (1-t) \mathbf{y}).$$

By the chain rule, the derivatives of $g$ are
$$g'(t) = ( \mathbf{x} - \mathbf{y})^T \nabla f(t \mathbf{x} + (1-t) \mathbf{y}),$$
$$g''(t) = ( \mathbf{x} - \mathbf{y})^T \nabla^2 f(t \mathbf{x} + (1-t) \mathbf{y})\, ( \mathbf{x} - \mathbf{y}).$$

Since the Hessian is positive semidefinite, $g''(t) \ge 0$ for all $t$. Taylor's theorem, expanded around $t$ with the (nonnegative) second-order remainder dropped, then gives
$$ \begin{aligned} g(0) &\ge g(t) + g'(t)(-t),\\ g(1) &\ge g(t) + g'(t)(1-t). \end{aligned} $$

For $t \in [0,1]$, multiply the first inequality by $1-t$ and the second by $t$ (both weights are nonnegative) and add them; the $g'(t)$ terms cancel, leaving
$$ g(t) \le t g(1) + (1-t) g(0). $$
Since $g(1) = f(\mathbf{x})$ and $g(0) = f(\mathbf{y})$, this is exactly the inequality we wanted to prove.
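As a quick numerical sanity check, here is a minimal sketch assuming NumPy; the quadratic $f(\mathbf{x}) = \tfrac12 \mathbf{x}^T A \mathbf{x} + \mathbf{b}^T \mathbf{x}$ with $A$ positive semidefinite, and the particular $A$, $\mathbf{b}$, $\mathbf{x}$, $\mathbf{y}$ below, are arbitrary illustrative choices. It samples $t \in [0,1]$ and checks the inequality $g(t) \le t\,g(1) + (1-t)\,g(0)$ from the argument above.

```python
# Illustrative sanity check (not a proof): verify the convexity inequality
# g(t) <= t*g(1) + (1-t)*g(0) for a quadratic with a PSD Hessian.
import numpy as np

rng = np.random.default_rng(0)

n = 4
M = rng.standard_normal((n, n))
A = M.T @ M                      # A = M^T M is positive semidefinite
b = rng.standard_normal(n)

def f(z):
    # f(z) = 0.5 * z^T A z + b^T z, whose Hessian is the PSD matrix A
    return 0.5 * z @ A @ z + b @ z

x = rng.standard_normal(n)
y = rng.standard_normal(n)

def g(t):
    # restriction of f to the segment between x and y
    return f(t * x + (1 - t) * y)

for t in np.linspace(0.0, 1.0, 11):
    lhs = g(t)
    rhs = t * g(1.0) + (1 - t) * g(0.0)
    assert lhs <= rhs + 1e-9, (t, lhs, rhs)

print("Convexity inequality holds at all sampled t.")
```

Replacing $A$ with a matrix that is not positive semidefinite makes the assertion fail for some $t$, which is consistent with the role the Hessian condition plays in the proof.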
