[Math] Prove every local minimum is a global minimum

convex-analysis, linear-algebra, nonlinear-optimization, optimization, real-analysis

Let $Q\in\mathbb{R}^{d\times d}$ and $A\in\mathbb{R}^{d'\times d}$ be two matrices, and let $b\in\mathbb{R}^d$, $c\in\mathbb{R}^{d'}$. Suppose $d'<d$. For $x\in\mathbb{R}^d$,

Minimize
$$f(x)= \frac{1}{2}x^TQx-b^Tx\qquad \text{subject to }Ax=c.$$
Prove that every local minimum is also a global minimum.

Edit: OK, here's where I've gotten so far:

Suppose $x_0$ is a local minimum and there exists a point $x_1$ such that $f(x_1)<f(x_0)$.

Consider the function $b(s)=\frac{1}{2}x(s)^TQx(s)-b^Tx(s)$, where $x(s)=(1-s)x_0+sx_1$.

I want to show that the derivative $b'(0)$ is zero, but I'm having trouble simplifying the derivative of $b(s)=\frac{1}{2}\sum_{j=1}^d x_j(s)\sum_{i=1}^d Q_{ij}x_i(s)$.
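
In case it helps, here is a minimal sketch of that simplification, assuming only the definitions above (so $x'(s)=x_1-x_0$) and using the product rule together with the fact that a scalar equals its own transpose:
$$
\begin{aligned}
b'(s) &= \tfrac{1}{2}\,x'(s)^TQ\,x(s) + \tfrac{1}{2}\,x(s)^TQ\,x'(s) - b^Tx'(s) \\
      &= (x_1-x_0)^T\,\tfrac{Q+Q^T}{2}\,x(s) - b^T(x_1-x_0),
\end{aligned}
$$
so that
$$
b'(0) = (x_1-x_0)^T\!\left(\tfrac{Q+Q^T}{2}\,x_0 - b\right).
$$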

Best Answer

I don't understand why you want to consider $b(s)$. To show that every local minimum is a global minimum, you may proceed as follows. First, if $Ax=c$ has a solution $x_0$, then the set of solutions to $Ax=c$ is given by $x=x_0+Ky$, where the columns of $K$ form a basis of $\ker A$. For such $x$, $$ f(x)=\frac12(x_0+Ky)^TQ(x_0+Ky)-b^T(x_0+Ky)=\frac12y^TK^THKy-u^Ty+r=g(y) $$ for some constant vector $u$ and some constant scalar $r$, where $H$ is the symmetric part of $Q$. So, the constrained minimization of $f$ is equivalent to the unconstrained minimization of $g$, which is a usual least squares problem. Since all local minima are global minima in a usual least squares problem, the same also holds in the constrained case. (Note, however, that this doesn't mean a local/global minimum exists. In the nontrivial case, a local minimum of $g$ exists if and only if $Ax=c$ is solvable and $K^THK$ is positive semidefinite.)
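
For completeness, a sketch of where $u$ and $r$ come from, assuming $H=\tfrac12(Q+Q^T)$ as above and again using the fact that a scalar equals its own transpose:
$$
\begin{aligned}
f(x_0+Ky) &= \tfrac{1}{2}(x_0+Ky)^TQ(x_0+Ky) - b^T(x_0+Ky) \\
          &= \tfrac{1}{2}\,y^TK^THKy \;-\; \underbrace{\big(K^T(b-Hx_0)\big)^T}_{u^T}\,y \;+\; \underbrace{\tfrac{1}{2}x_0^TQx_0 - b^Tx_0}_{r}.
\end{aligned}
$$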
