The Inverse of a Sum of Matrices

inverse, linear algebra, matrices, matrix decomposition, matrix-calculus

Assume we have a positive semi-definite matrix $A\in\mathbb{R}^{d\times d}$, defined as

$$A = \sum_{i=1}^n \alpha_i x_i x_i^\top = \sum_{i=1}^n A_i$$

where $\alpha_i \in [0,1]$ with $\sum_{i = 1}^n \alpha_i = 1$, $x_i \in \mathbb{R}^d$, and $n>d$.

Further, assume that $A$ is a singular matrix, which is equivalent to the condition $\lambda_{\min}(A) = 0$.

It is known that $\lambda_{\min}(A) \geq \sum_{i = 1}^n\lambda_{\min}(A_i)$; thus, even when all of the matrices $A_i$ are singular, $A$ can be invertible.
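
As a quick numerical sanity check (not part of the original question; the dimensions and random data below are arbitrary illustrative choices, assuming NumPy), one can verify the eigenvalue inequality and see that each rank-one term $A_i$ is singular while their sum is not:

```python
# Sanity check (illustrative, not from the original post): for random rank-one
# terms A_i = alpha_i * x_i x_i^T, verify lambda_min(A) >= sum_i lambda_min(A_i).
# Each A_i has rank one, hence is singular for d >= 2, while A itself is
# typically invertible when the x_i span R^d.
import numpy as np

rng = np.random.default_rng(0)
d, n = 3, 5                                  # n > d, as in the question

alphas = rng.random(n)
alphas /= alphas.sum()                       # alpha_i in [0, 1], summing to 1
xs = rng.standard_normal((n, d))

A_terms = [a * np.outer(x, x) for a, x in zip(alphas, xs)]
A = sum(A_terms)

lam_min_A = np.linalg.eigvalsh(A)[0]         # eigvalsh returns eigenvalues in ascending order
sum_lam_min = sum(np.linalg.eigvalsh(Ai)[0] for Ai in A_terms)

print("lambda_min(A)         =", lam_min_A)    # strictly positive here
print("sum_i lambda_min(A_i) =", sum_lam_min)  # ~0: every A_i is singular
assert lam_min_A >= sum_lam_min - 1e-12
```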

A trivial example for $n = 2$ would be
$$A_1 = \begin{pmatrix} 1 & 0\\ 0 & 0 \end{pmatrix}, \qquad A_2 = \begin{pmatrix} 0 & 0\\ 0 & 1 \end{pmatrix},$$
which are singular but whose sum is non-singular.

What are the necessary (and possibly sufficient) conditions on the matrices $A_i$ and on the vectors $x_i$ under which $A$ is singular?

Best Answer

Let $i_1,\dots,i_k$ denote the values of $i$ for which $\alpha_{i} \neq 0$. Let $D$ denote the diagonal matrix $$ D = \begin{pmatrix} \alpha_{i_1} \\ & \ddots \\ && \alpha_{i_k} \end{pmatrix}. $$ Your matrix can be written as $A = XDX^T$, where $X$ is the matrix with columns $x_{i_1},\dots,x_{i_k}$.
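
As a small sketch (not part of the original answer, assuming NumPy), the factorization $A = XDX^T$ can be checked numerically by selecting the columns with nonzero weights:

```python
# Sketch (illustrative, not from the original answer): build X and D from the
# indices with alpha_i != 0 and confirm that sum_i alpha_i x_i x_i^T = X D X^T.
import numpy as np

rng = np.random.default_rng(1)
d, n = 3, 5

alphas = rng.random(n)
alphas[2] = 0.0                    # one zero weight, to exercise the selection of i_1,...,i_k
alphas /= alphas.sum()
xs = rng.standard_normal((n, d))

A = sum(a * np.outer(x, x) for a, x in zip(alphas, xs))

keep = alphas != 0                 # indices i_1, ..., i_k
X = xs[keep].T                     # d x k matrix whose columns are the selected x_i
D = np.diag(alphas[keep])          # k x k diagonal matrix of the selected alpha_i

assert np.allclose(A, X @ D @ X.T)
```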

The rank of $A$ is necessarily equal to the rank of $X$: indeed, $X$ has the same rank as $M = XD^{1/2}$ (since $D$ has strictly positive diagonal entries), and $M$ has the same rank as $A = MM^T$. Thus, $A$ is singular if and only if the columns of $X$ fail to span $\Bbb R^d$, which occurs iff the rows of $X$ are linearly dependent.
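
A short numerical illustration of this criterion (again assuming NumPy; forcing the third coordinate to zero is just one hypothetical way to make the $x_i$ fail to span $\Bbb R^d$):

```python
# Sketch (illustrative): rank(A) = rank(X), so A is singular exactly when the
# x_i fail to span R^d.  Compare generic vectors with vectors confined to a
# proper subspace (third coordinate forced to zero).
import numpy as np

rng = np.random.default_rng(2)
d, n = 3, 5

def build(xs):
    alphas = rng.random(n)
    alphas /= alphas.sum()
    A = sum(a * np.outer(x, x) for a, x in zip(alphas, xs))
    return A, xs.T                 # X has the x_i as columns

A1, X1 = build(rng.standard_normal((n, d)))                   # generic: the x_i span R^3
print(np.linalg.matrix_rank(A1), np.linalg.matrix_rank(X1))   # 3 3  -> A1 invertible

xs_planar = rng.standard_normal((n, d))
xs_planar[:, 2] = 0.0                                         # x_i confined to a 2D subspace
A2, X2 = build(xs_planar)
print(np.linalg.matrix_rank(A2), np.linalg.matrix_rank(X2))   # 2 2  -> A2 singular
```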
