[Math] Conjugate Gradient for a “slightly” singular system.

linear-algebra, matrices, numerical-analysis

Suppose I have a symmetric $N \times N$ matrix $A$ with a one-dimensional nullspace $\mathcal{N}$, and $A$ is positive definite on $\mathcal{N}^\bot$. In my case $\mathcal{N}$ is the space of constant vectors, i.e. it is spanned by the all-ones vector.

I have to solve the problem $Ax = b$ with $b \in R(A)$, which therefore has infinitely many solutions, and I am looking for the minimum-norm solution. The matrix $A$ is very large and sparse, so direct methods aren't really an option. The rank-deficient least-squares algorithms I have seen also appear to be prohibitively expensive.
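(To fix notation: writing $e$ for the all-ones vector spanning the nullspace, the solution set and the minimum-norm solution can be described as follows.)

```latex
\[
  \{\, x : Ax = b \,\} \;=\; \{\, x^{*} + t e \;:\; t \in \mathbb{R} \,\},
  \qquad
  x^{*} = A^{+} b, \quad e^{\mathsf T} x^{*} = 0,
\]
% i.e. the minimum-norm solution x^{*} is the unique solution
% orthogonal to the nullspace, given by the pseudoinverse.
```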

I was solving a non-singular version of this problem with the Conjugate Gradient (CG) method. Is there any way I can modify the algorithm to solve this particular problem?

EDIT: Boiling the problem down to the bone, the question is: if $A$ is positive semi-definite with exactly one zero eigenvalue, does CG still work?

Thanks.

Best Answer

I'd suggest shifting away the singularity: solve $(A + ee^T)y = b$ instead, and then orthogonalize $y$ with respect to $e$ to get $x$. The matrix $A + ee^T$ is not sparse, but you can still compute matrix-vector products with it cheaply, and that's all CG needs.

EDIT: I forgot to define it: $e$ is the vector of all ones.
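A minimal sketch of this approach in Python with SciPy. The example matrix (a path-graph Laplacian, which is symmetric positive semi-definite with nullspace spanned by $e$) is my own stand-in for $A$, so treat the specifics as an illustration rather than your exact setup:

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import LinearOperator, cg

# Stand-in for A: the Laplacian of a path graph on n vertices.
# It is symmetric PSD and satisfies A e = 0 for the all-ones vector e.
n = 100
main = np.full(n, 2.0)
main[0] = main[-1] = 1.0
A = diags([main, -np.ones(n - 1), -np.ones(n - 1)], [0, 1, -1], format="csr")

e = np.ones(n)

# b must lie in R(A) = e^perp; project a random vector to guarantee that.
rng = np.random.default_rng(0)
b = rng.standard_normal(n)
b -= e * (e @ b) / n

# Matvec with A + e e^T without ever forming the dense rank-one update.
shifted = LinearOperator((n, n), matvec=lambda v: A @ v + e * (e @ v))

y, info = cg(shifted, b, maxiter=10_000)
assert info == 0  # 0 means CG converged

# Orthogonalize against e to recover the minimum-norm solution x.
x = y - e * (e @ y) / n

print(np.linalg.norm(A @ x - b))  # residual: small
print(abs(e @ x))                 # x is orthogonal to the nullspace
```

In fact, multiplying $(A + ee^T)y = b$ by $e^T$ shows $e^T y = 0$ in exact arithmetic (since $e^T A = 0$ and $e^T b = 0$), so the final orthogonalization mostly cleans up rounding error.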
