Linear Algebra – Determinant of Rank-One Perturbation of a Diagonal Matrix

Tags: determinant, eigenvalues-eigenvectors, linear-algebra, matrices

Let $A$ be a rank-one perturbation of a diagonal matrix, i.e. $A = D + s^T s$, where $D = \operatorname{diag}\{\lambda_1,\ldots,\lambda_n\}$ and $s = [s_1,\ldots,s_n] \neq 0$ is a row vector. Is there a way to compute its determinant easily?

On the one hand, $s^T s$ has rank one, so it has only one non-zero eigenvalue, which equals its trace $|s|^2 = s_1^2+\cdots+s_n^2$. On the other hand, if $D$ were a scalar operator (i.e. all $\lambda_i$ equal to some $\lambda$), then the eigenvalues of $A$ would be the eigenvalues of $s^T s$ shifted by $\lambda$: one eigenvalue would equal $\lambda+|s|^2$ and the remaining $n-1$ would equal $\lambda$. Hence in this case we would obtain $\det A = \lambda^{n-1} (\lambda+|s|^2)$. But is it possible to generalize these considerations to the case of a diagonal, non-scalar $D$?
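As a quick numerical sanity check of the scalar case described above (not part of the original question), here is a minimal NumPy sketch comparing a direct determinant with the formula $\lambda^{n-1}(\lambda+|s|^2)$; the sizes and values are arbitrary.

```python
import numpy as np

# Scalar case: D = lambda * I, A = D + s^T s
# (s treated as a row vector, so np.outer(s, s) is the rank-one term).
rng = np.random.default_rng(0)
n = 5
lam = 2.5
s = rng.standard_normal(n)

A = lam * np.eye(n) + np.outer(s, s)

lhs = np.linalg.det(A)
rhs = lam ** (n - 1) * (lam + np.dot(s, s))   # lambda^{n-1} (lambda + |s|^2)

print(lhs, rhs)                 # the two values agree up to rounding
assert np.isclose(lhs, rhs)
```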

Best Answer

You do not need the diagonal entries of $D$ to be positive. Since $\det(I+AB)=\det(I+BA)$, we have $\det(I+xy^T)=1+y^Tx$ (which is not hard to prove directly). If $D$ is invertible, then
$$ \det(D+ss^T) = \det\bigl(D(I+D^{-1}ss^T)\bigr) = \det(D) \det(I+D^{-1}ss^T) = (1+s^TD^{-1}s) \det(D). $$
This gives
$$ \det(D+ss^T) = \det(D) \left(1+\sum_r \frac{s_r^2}{D_{r,r}}\right). $$
If two or more diagonal entries of $D$ are zero, then $\det(D+ss^T)=0$. If just one is zero, $D_{n,n}$ say, then
$$ \det(D+ss^T) = s_n^2\prod_{r=1}^{n-1} D_{r,r}. $$
If you define $\delta_r=\prod_{j\neq r} D_{j,j}$ (which equals $\det(D)/D_{r,r}$ whenever $D_{r,r}\neq 0$), then
$$ \det(D+ss^T) = \det(D) + \sum_r \delta_r s_r^2, $$
which is neater and holds in all cases.
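The following is a small illustrative sketch (not from the original answer) that checks both the invertible-case identity $\det(D+ss^T)=\det(D)(1+s^TD^{-1}s)$ and the all-cases formula $\det(D)+\sum_r \delta_r s_r^2$ against a direct determinant; the helper name `det_rank_one_update` and the test values are my own choices.

```python
import numpy as np

def det_rank_one_update(d, s):
    """det(D + s s^T) via det(D) + sum_r delta_r * s_r^2, where delta_r is
    the product of all diagonal entries except d[r].  Valid even when D is
    singular.  (Illustrative sketch only.)"""
    d = np.asarray(d, dtype=float)
    s = np.asarray(s, dtype=float)
    delta = np.array([np.prod(np.delete(d, r)) for r in range(len(d))])
    return np.prod(d) + np.sum(delta * s**2)

rng = np.random.default_rng(1)
n = 6
s = rng.standard_normal(n)

# Invertible D: compare the direct determinant with det(D) * (1 + s^T D^{-1} s).
d = rng.standard_normal(n)
direct = np.linalg.det(np.diag(d) + np.outer(s, s))
lemma = np.prod(d) * (1.0 + np.sum(s**2 / d))
assert np.isclose(direct, lemma)
assert np.isclose(direct, det_rank_one_update(d, s))

# Singular D with exactly one zero entry: the delta-formula still matches,
# and reduces to s_n^2 * prod of the remaining diagonal entries.
d[-1] = 0.0
direct = np.linalg.det(np.diag(d) + np.outer(s, s))
assert np.isclose(direct, det_rank_one_update(d, s))
assert np.isclose(direct, s[-1]**2 * np.prod(d[:-1]))
```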
