[Math] The local minimum of the SQP (sequential quadratic programming) algorithm

optimization

Consider the constrained optimization problem

\begin{eqnarray}
&&\min_x\; f(x)\\
\text{s.t.}~~&&g_1(x)\leq0\\
&&g_2(x)\leq0\\
&&\;\;\vdots\\
&&g_n(x)\leq0
\end{eqnarray}
where $x$ is a vector variable.

An efficient method for solving this problem is the SQP (sequential quadratic programming) algorithm. However, in some cases, this method converges only to a local minimum.

My question is: are there any modified SQP algorithms, or other methods, that always find the global minimum instead of a local minimum? Could you give me some references? Thanks.

Best Answer

Without additional assumptions on the structure of the problem, there are no methods that guarantee finding the global optimum. However, if your problem is convex, then any local minimum is also a global minimum. To quote Wikipedia, "many optimization problems can be reformulated as convex minimization problems". A great reference for convex optimization, and optimization in general, is the book by Boyd and Vandenberghe, which is freely available for download on the internet.
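For nonconvex problems, a common practical heuristic (not a guarantee) is multistart: run the local SQP solver from many starting points and keep the best result. A minimal sketch, reusing the same illustrative problem as above:

```python
import numpy as np
from scipy.optimize import minimize

# Same illustrative nonconvex objective and constraint as before.
def f(x):
    return x[0]**4 - 3*x[0]**2 + x[0]

cons = [{"type": "ineq", "fun": lambda x: 9 - x[0]**2}]

# Multistart: launch SLSQP from random points in the feasible box
# [-3, 3] and keep the best successful run.
rng = np.random.default_rng(0)
best = None
for _ in range(20):
    x0 = rng.uniform(-3.0, 3.0, size=1)
    res = minimize(f, x0, method="SLSQP", constraints=cons)
    if res.success and (best is None or res.fun < best.fun):
        best = res
print(best.x, best.fun)  # with enough starts, the deeper minimum near x ≈ -1.30
```

This offers only a probabilistic improvement; deterministic global guarantees require exploiting problem structure (e.g. convexity, or branch-and-bound on a known bounding scheme).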
