Large and ill-conditioned quadratic convex problem

numerical linear algebra

I need to solve a convex quadratic problem numerically:

$\min_x f(x) = \frac{1}{2} x^\top A x - b^\top x$,

where $A$ is a very large, ill-conditioned positive semi-definite matrix. The standard conjugate gradient method doesn't work well, and SGD is too slow.

I'm looking for a good numerical method with relatively low computational cost.

Any experience or suggestions? Thank you!

Best Answer

I assume you are looking to minimize $f(x)$, and that $A$ is symmetric and positive semi-definite.

  • If $b \notin \operatorname{Null}(A)^{\perp}$, then there is a vector $v \in \operatorname{Null}(A)$ such that $b^Tv \neq 0$. Without loss of generality assume $b^Tv > 0$ (else replace $v$ by $-v$). Since $Av = 0$, for every real number $\theta$ we get $$ f(\theta v) = \frac{1}{2}\theta^2 v^T A v - \theta\, b^T v = -\theta\, b^T v, $$ so $\lim_{\theta\rightarrow\infty} f(\theta v) = -\infty$ and no minimizer exists.

  • Else, $b \in \operatorname{Null}(A)^{\perp} = \operatorname{Col}(A^T) = \operatorname{Col}(A)$, using the symmetry of $A$. Find any vector $r$ such that $Ar=b$. Then, substituting $Ar = b$: \begin{align} 0 \le \frac{1}{2}(x-r)^TA(x-r) &= \frac{1}{2}x^TAx - x^TAr + \frac{1}{2}r^TAr \\ &= \frac{1}{2}x^TAx - b^Tx + \frac{1}{2}r^TAr \\ &= f(x) + \frac{1}{2}r^TAr, \end{align} where the left side is nonnegative because $A$ is positive semi-definite. It attains its minimum value $0$ at $x^*=r$, so $$ f(x^*) = f(r) = -\frac{1}{2}r^TAr. $$
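Both cases can be verified numerically. A minimal sketch in NumPy (the matrix and vectors here are illustrative toy data, not the asker's problem):

```python
import numpy as np

# A small rank-deficient PSD matrix: A = Q diag(2, 1, 0) Q^T,
# whose null space is spanned by the last column of Q.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
A = Q @ np.diag([2.0, 1.0, 0.0]) @ Q.T
v = Q[:, 2]                        # A @ v ~ 0

def f(x, A, b):
    return 0.5 * x @ A @ x - b @ x

# Case 1: b has a null-space component, so f(theta*v) = -theta*(b@v)
b_bad = v + Q[:, 0]                # b_bad @ v = 1 != 0
print(f(1e6 * v, A, b_bad))        # very negative: f is unbounded below

# Case 2: b in Col(A); any r with A r = b is a minimizer
b_good = A @ np.array([1.0, 2.0, 3.0])
r, *_ = np.linalg.lstsq(A, b_good, rcond=None)
print(np.allclose(f(r, A, b_good), -0.5 * r @ A @ r))  # expect True
```

Here `lstsq` returns a least-squares solution, which coincides with an exact solution of $Ar=b$ because $b$ was constructed to lie in $\operatorname{Col}(A)$.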


In other words, the problem reduces to Gaussian elimination: Find a vector $r$ such that $Ar=b$. If no such $r$ exists then no minimizer exists.