[Math] Special considerations when using the Woodbury matrix identity numerically.

linear-algebra, matrices, na.numerical-analysis

Are there any special considerations when using the Woodbury matrix identity numerically? What is the best metric for numerical stability in this case? Can anyone point me to a good reference?

The special case of the identity that I'm using is:
$(A + UBU^T)^{-1} = A^{-1} - A^{-1}U(B^{-1} + U^TA^{-1}U)^{-1}U^TA^{-1}$

I'm using the identity to speed up a matrix inverse. In my case $A$ is diagonal, and $U$ is rectangular, with an aspect ratio (rows to columns) that varies between 10 and 200. I'm using a sparse form for $U$.
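For concreteness, here is a minimal NumPy sketch of this use of the identity (hypothetical function and variable names; `a_diag` holds the diagonal of $A$, and a dense `U` stands in for the sparse one):

```python
import numpy as np

def woodbury_solve(a_diag, U, B, rhs):
    """Apply (A + U B U^T)^{-1} to rhs via the Woodbury identity,
    where A = diag(a_diag) is n x n and B is k x k with k << n."""
    Ainv_rhs = rhs / a_diag              # A^{-1} rhs, O(n) since A is diagonal
    Ainv_U = U / a_diag[:, None]         # A^{-1} U
    S = np.linalg.inv(B) + U.T @ Ainv_U  # small k x k "capacitance" matrix
    correction = Ainv_U @ np.linalg.solve(S, U.T @ Ainv_rhs)
    return Ainv_rhs - correction
```

In the well-conditioned regime this agrees with a direct solve of $(A + UBU^T)x = b$; the question is about the regime where it doesn't.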

This is failing catastrophically for me. Normal inversion seems stable for condition numbers of the LHS well beyond $10^{10}$, while the identity fails around a condition number of $10^7$ (everything in double precision). However, the condition number of the LHS doesn't seem to be the best metric for predicting failure, as I can arrange cases where the failure happens at much higher or lower condition numbers.

Also, if there is another way to compute this inverse fast, I'm open to suggestions.

Thanks in advance!

Update:

To answer a few questions: yes the dimensions of $B$ are much less than the dimensions of $A$ and both are symmetric and positive definite.

I've checked and the matrices $A$, $B$, and $(B^{-1} + U^TA^{-1}U)$ all have relatively low condition number.

I've also done the following: I've factored the identity to:
$(A + UBU^T)^{-1} = A^{-1}[I - U(B^{-1} + U^TA^{-1}U)^{-1}U^TA^{-1}]$

Then one would expect this to fail when the second term in the square brackets has eigenvalues smaller than or equal to $-1$, because adding the identity just translates the eigenvalues by $+1$. This is indeed the case. The translation of the eigenvalues seems to be working fine, which tells me that there is no problem with taking the difference of the two terms. The issue must be in computing the second term.
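A small diagnostic along these lines (a sketch with hypothetical names; $A = \mathrm{diag}$(`a_diag`), everything SPD as above): in exact arithmetic the bracketed factor equals $A(A + UBU^T)^{-1}$, which is similar to an SPD matrix bounded above by $I$, so its eigenvalues should lie in $(0, 1]$; computed eigenvalues drifting to $0$ or below signal the breakdown described above.

```python
import numpy as np

def bracket_eigs(a_diag, U, B):
    """Eigenvalues of the bracketed factor
    I - U (B^{-1} + U^T A^{-1} U)^{-1} U^T A^{-1},
    with A = diag(a_diag). In exact arithmetic these lie in (0, 1]."""
    Ainv_U = U / a_diag[:, None]
    S = np.linalg.inv(B) + U.T @ Ainv_U
    M = U @ np.linalg.solve(S, Ainv_U.T)   # the (negated) second term
    return np.linalg.eigvals(np.eye(len(a_diag)) - M)
```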

Further Update:

This reference appears to address the problem. I haven't gotten all the way through it yet, but it looks like you can precondition the problem by making sure that the columns of $U$ are all mutually orthogonal. Having now read it: it turns out this doesn't help, since it only improves the condition number of $(B^{-1} + U^TA^{-1}U)$, which I don't think is the problem.

Best Answer

Have you tried checking the condition number of the three matrices that you invert in the RHS, instead of the LHS? I am afraid that's the only sure way to tell.
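One quick way to run that check (a sketch with a hypothetical helper name; $A = \mathrm{diag}$(`a_diag`) with positive diagonal, as in the question):

```python
import numpy as np

def rhs_condition_numbers(a_diag, U, B):
    """Condition numbers of the three matrices actually inverted on the
    RHS of the Woodbury identity: A, B, and the capacitance matrix."""
    Ainv_U = U / a_diag[:, None]
    S = np.linalg.inv(B) + U.T @ Ainv_U
    return {
        "A": a_diag.max() / a_diag.min(),  # cond of a positive diagonal matrix
        "B": np.linalg.cond(B),
        "capacitance": np.linalg.cond(S),
    }
```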

I've heard in many conferences that many experts in matrix computations (led by the late Gene Golub) consider the SMW identity to be "numerically dangerous"; I am not sure that I can find a paper where this is stated black on white, though (sorry for that, I was always curious about that myself).

Alternative idea: switch to solving with the extended matrix $\begin{bmatrix}A & U\\ U^* & B\end{bmatrix}$ --- this could be stabler.
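A sketch of this approach (hypothetical names; note that to make the extended system exactly equivalent to $(A + UBU^T)x = b$, the $(2,2)$ block should be $-B^{-1}$, since eliminating $y = BU^Tx$ from the second block row then recovers the original equation):

```python
import numpy as np

def augmented_solve(a_diag, U, B, b):
    """Solve (A + U B U^T) x = b, A = diag(a_diag), via the extended system
        [[A,   U     ],   [[x],   [[b],
         [U^T, -B^{-1}]] @  [y]] =  [0]]
    A dense sketch; a real implementation would use a sparse
    symmetric-indefinite factorization of the extended matrix."""
    n, k = U.shape
    K = np.block([[np.diag(a_diag), U],
                  [U.T, -np.linalg.inv(B)]])
    sol = np.linalg.solve(K, np.concatenate([b, np.zeros(k)]))
    return sol[:n]
```

This avoids forming the capacitance matrix explicitly, at the cost of solving one larger (but still sparse) indefinite system.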
