[Math] Inverse of a diagonal matrix plus a constant

Tags: inverse, linear algebra, numerical linear algebra

I am looking for an efficient solution for inverting a matrix of the following form:

$$D+aP$$

where $D$ is a (full-rank) diagonal matrix, $a$ is a constant, and $P$ is an all-ones matrix.

The question "Inverse of constant matrix plus diagonal matrix" gives a solution to the special case where all diagonal entries of $D$ are equal.

The Sherman-Morrison formula can also solve this problem with a suitable choice of $u$ and $v$, but it loses efficiency at high dimension ($O(n^3)$ time to compute). I am hoping to get a result in the same form, so that the space and time complexity are both $O(n)$.

Best Answer

What is wrong with using Sherman-Morrison? If $P$ is a matrix of all ones, then setting $$ u=(1,\dots,1)^T, \ v = a\cdot u $$ will do the trick. The matrix-vector product $(D+a P)^{-1}x$ can be computed in $O(n)$; the matrix $(D+a P)^{-1}$ itself in $O(n^2)$.

Edit: Here is how to evaluate $(D+a P)^{-1}x$. Let $e=(1,\dots,1)^T$. By Sherman-Morrison $$ \begin{split} (D+a P)^{-1}x = (D+aee^T)^{-1}x&= D^{-1}x - a\frac{D^{-1}ee^TD^{-1}}{1+ae^TD^{-1}e}x\\ &=D^{-1}x - a\frac{(D^{-1}e)(e^T(D^{-1}x))}{1+ae^T(D^{-1}e)}. \end{split}$$ The multiplication $D^{-1}y$ is $O(n)$ and computing $e^Ty$ is $O(n)$, so the matrix-vector product above costs $O(n)$ operations.
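As a concrete illustration, here is a minimal NumPy sketch of the $O(n)$ matrix-vector product. The function name `solve_diag_plus_ones` and the argument layout are my own choices, not part of the answer; `d` is assumed to hold the diagonal of $D$.

```python
import numpy as np

def solve_diag_plus_ones(d, a, x):
    """Apply (D + a*P)^{-1} to x in O(n), where D = diag(d) and P is all-ones."""
    d = np.asarray(d, dtype=float)
    x = np.asarray(x, dtype=float)
    d_inv = 1.0 / d                      # D^{-1} e: reciprocals of the diagonal
    Dinv_x = d_inv * x                   # D^{-1} x
    denom = 1.0 + a * d_inv.sum()        # 1 + a e^T D^{-1} e
    # Sherman-Morrison: D^{-1} x minus the rank-one correction.
    return Dinv_x - a * d_inv * Dinv_x.sum() / denom
```

Everything here is elementwise multiplication and two sums over vectors of length $n$, so both time and memory stay $O(n)$.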

To compute the inverse you can do $$ (D+a P)^{-1} = D^{-1} - a\frac{(D^{-1}e)(e^TD^{-1})}{1+ae^T(D^{-1}e)}. $$ Here the costly operations are forming the rank-one matrix $(D^{-1}e)(e^TD^{-1})$ and adding the matrices. Filling an $n\times n$ matrix in $O(n^2)$ time is optimal: after all, there are $n^2$ entries that need to be written.
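If the full inverse is really needed, a sketch under the same assumptions (again, the function name and the small sanity check are mine):

```python
import numpy as np

def inverse_diag_plus_ones(d, a):
    """Return the full (D + a*P)^{-1} as an n x n array in O(n^2)."""
    d_inv = 1.0 / np.asarray(d, dtype=float)      # D^{-1} e
    denom = 1.0 + a * d_inv.sum()                 # 1 + a e^T D^{-1} e
    # Diagonal part minus the rank-one correction (D^{-1} e)(e^T D^{-1}) / denom.
    return np.diag(d_inv) - (a / denom) * np.outer(d_inv, d_inv)

# Sanity check against a dense inverse on a small example.
n = 5
d = np.arange(1.0, n + 1)
a = 0.7
A = np.diag(d) + a * np.ones((n, n))
assert np.allclose(inverse_diag_plus_ones(d, a), np.linalg.inv(A))
```

The outer product dominates the cost, which is exactly the $O(n^2)$ bound discussed above.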