The fastest algorithm to solve an equality-constrained convex quadratic program

convex optimization, numerical optimization, optimization, quadratic programming

I am trying to solve the following convex problem

$$\begin{array}{ll} \text{minimize} & \tfrac{1}{2}x^T Q x + p^T x\\ \text{subject to} & Ax + b = 0\end{array}$$

which arises in the active-set method for inequality/equality-constrained quadratic optimization.

I have studied the conjugate gradient method, Newton's method with equality constraints, and solving the KKT system directly. Which one do you think is fastest? I need the algorithm to converge to high accuracy and to be as fast as possible. Note that I am going to use it to polish the solution obtained by another solver, so it should converge within a few iterations.

Best Answer

Assuming a dense system, solving the KKT system $$ \begin{bmatrix} Q & A^*\\A & 0 \end{bmatrix}\begin{bmatrix} \mathbf{x}^\star\\\boldsymbol{\nu}^\star \end{bmatrix} = -\begin{bmatrix} \mathbf{p}\\\mathbf{b} \end{bmatrix} $$ by an $LDL^*$ factor-solve method gives the solution accurately in roughly $\tfrac{1}{3}(n+m)^3$ flops for the factorization plus $2(n+m)^2$ flops for the triangular solves, where $n$ is the number of variables and $m$ the number of constraints. Because the objective is quadratic, Newton's method with equality constraints converges in exactly one step, and that step is precisely this KKT solve, so there is nothing to compare there. Conjugate gradient is less numerically robust, so it would likely perform worse unless your system is very large or sparse.
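For reference, here is a minimal sketch of the factor-solve approach in Python/SciPy. The function name `solve_eq_qp` and the test data are my own illustration, not from the question; `scipy.linalg.solve` with `assume_a='sym'` dispatches to LAPACK's symmetric-indefinite (Bunch-Kaufman $LDL^*$) routine, so a single call does the factor and the solve.

```python
import numpy as np
from scipy.linalg import solve

def solve_eq_qp(Q, p, A, b):
    """Minimize (1/2) x^T Q x + p^T x  subject to  A x + b = 0,
    via one LDL^T factor-solve of the dense KKT system."""
    n, m = Q.shape[0], A.shape[0]
    # Assemble the symmetric indefinite KKT matrix [[Q, A^T], [A, 0]].
    K = np.block([[Q, A.T], [A, np.zeros((m, m))]])
    rhs = -np.concatenate([p, b])
    # assume_a='sym' makes SciPy use the symmetric-indefinite
    # (Bunch-Kaufman LDL^T) factorization instead of a general LU.
    z = solve(K, rhs, assume_a='sym')
    return z[:n], z[n:]  # primal x*, multipliers nu*

# Hypothetical example: a random strictly convex QP with 2 constraints.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
Q = M @ M.T + 5 * np.eye(5)       # symmetric positive definite
p = rng.standard_normal(5)
A = rng.standard_normal((2, 5))
b = rng.standard_normal(2)
x, nu = solve_eq_qp(Q, p, A, b)
print(np.linalg.norm(A @ x + b))            # constraint residual ~ 1e-16
print(np.linalg.norm(Q @ x + A.T @ nu + p)) # stationarity residual ~ 1e-15
```

In an active-set loop you would typically factor the KKT matrix once (e.g. with `scipy.linalg.ldl`) and reuse or update the factors as the working set changes; the one-shot `solve` above refactors on every call.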