[Math] Every local minimum of $f(x) = \frac{1}{2}x^tAx + b^tx +c$ is also a global minimum

convex-optimization, derivatives, maxima-minima, multivariable-calculus, optimization

Consider the quadratic function $f(x) = \frac{1}{2}x^tAx + b^tx +c$,
where $A\in\mathbb{R}^{n\times n}$ is symmetric, $b\in\mathbb{R}^n$
and $c\in\mathbb{R}$. Let $x$ be a local minimum of $f$.
Prove that $x$ is a global minimum of $f$.

From here https://math.stackexchange.com/a/659982/166180 I know that if the Hessian is positive definite then the function is strictly convex. If we had this property, the function would have a unique minimum, and therefore it would be the global minimum because the function is convex. But we don't have any information on the Hessian.
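(For the record, a convex function with a local minimum $x$ is automatically globally minimized there: for any $y$ and small enough $t\in(0,1)$, local minimality and convexity give

$$f(x) \le f\bigl((1-t)x + ty\bigr) \le (1-t)f(x) + tf(y),$$

which rearranges to $f(x)\le f(y)$.)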

The only thing I can say is that

$$\nabla f(x) = Ax + b, \qquad \nabla^2f(x) = A.$$

Since $x$ is a local minimum, $Ax+b = 0$ and $A$ is positive semidefinite. If $A$ were positive definite, I think the problem would be solved, but we are not given that it is. So how should I proceed?
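One more computation that might be relevant: because $f$ is quadratic, its expansion around $x$ is exact, i.e. for every $h\in\mathbb{R}^n$

$$f(x+h) = f(x) + (Ax+b)^th + \frac{1}{2}h^tAh.$$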

Also, where is the symmetry of the matrix $A$ used in all of this?

Best Answer

Once you know that $A$ is symmetric and positive semidefinite, there is an orthonormal basis transformation that makes it a diagonal matrix $D$ with non-negative real values on the diagonal; this is the spectral theorem, and it is exactly where the symmetry of $A$ is used. Another simplification is that you can forget about $c$: it shifts every value of $f$ by the same constant, so it changes neither local nor global minima.
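Spelled out, with $Q$ orthogonal such that $A = QDQ^t$, $D = \operatorname{diag}(d_1,\dots,d_n)$, and $y = Q^tx$:

$$f(x) = \frac{1}{2}x^tQDQ^tx + b^tx + c = \frac{1}{2}y^tDy + (Q^tb)^ty + c.$$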

By rewriting everything in the new basis, we have a very simple form of the function $f(x)= x^tDx+b^tx$ (with a new $b$, a new variable $x$, and the factor $\frac{1}{2}$ absorbed into $D$), which is $f(x)= (d_1x_1^2 + b_1x_1)+ \ldots + (d_nx_n^2 + b_nx_n)$.

The condition that $x_0$ be a local minimum translates to all sections of the function: since $f$ is a sum of functions of the individual coordinates, for each $i$ the $i$-th coordinate of $x_0$ must be a local minimum of the one-variable section $x_i \mapsto d_ix_i^2 + b_ix_i$ (freeze the other coordinates and vary $x_i$ alone).

If $d_i>0$, then we have a proper quadratic function with exactly one local minimum, and it is also a global minimum.
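Completing the square makes this explicit:

$$d_ix_i^2 + b_ix_i = d_i\left(x_i + \frac{b_i}{2d_i}\right)^2 - \frac{b_i^2}{4d_i},$$

which attains its unique minimum, local and global, at $x_i = -\frac{b_i}{2d_i}$.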

If $d_i=0$, then the only way $b_ix_i$ has a local minimum is $b_i=0$: a nonconstant linear function is unbounded below and has no local minima at all. But then again, local minima and global minima coincide, as every real number is both a local and a global minimum of the constant section.
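Putting the sections together: each summand attains its global minimum at the corresponding coordinate $x_{0,i}$ of $x_0$, so for every $x$

$$f(x) = \sum_{i=1}^n \left(d_ix_i^2 + b_ix_i\right) \;\ge\; \sum_{i=1}^n \left(d_ix_{0,i}^2 + b_ix_{0,i}\right) = f(x_0),$$

and $x_0$ is a global minimum.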