Geometric or matrix intuition on $A(A + B)^{-1}B = B (A + B)^{-1} A$

geometric-interpretation, intuition, matrix-equations

I am curious about a seemingly simple identity in matrix algebra. Though matrix multiplication is not commutative (the classic example of noncommutativity), it does allow a commutativity of sorts around a very specific third matrix:

$$
\color{blue}{A (A + B)^{-1} B} = \color{blue}{B (A + B)^{-1} A}
$$
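
A quick numerical sanity check (a numpy sketch; with random matrices, $A+B$ is invertible almost surely):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

M = np.linalg.inv(A + B)          # the special "middle" matrix (A + B)^{-1}
print(np.allclose(A @ M @ B, B @ M @ A))  # True
```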

There is a very simple algebraic proof:

$$
\begin{eqnarray}
\color{blue}{A (A + B)^{-1} B} + \color{red}{B (A + B)^{-1} B} &=& (A + B)(A + B)^{-1} B\\
&=& B\\
&=& B (A + B)^{-1} (A + B) \\
&=& \color{blue}{B (A + B)^{-1} A} + \color{red}{B (A + B)^{-1} B}\\
\end{eqnarray}
$$

because matrix addition is commutative. The identity follows by cancelling $B (A + B)^{-1} B$ from both sides. (If there is a simpler, linear proof, say one that avoids the cancellation, please say so. That extra weird matrix $B (A + B)^{-1} B$, while obviously looking right and obviously doing the job, just pops out of nowhere. Or does it?)
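Each link in that chain can also be checked numerically; a minimal numpy sketch:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
M = np.linalg.inv(A + B)                      # the extra matrix (A + B)^{-1}

# first link:  A M B + B M B = (A + B) M B = B
print(np.allclose(A @ M @ B + B @ M @ B, B))  # True
# last link:   B M A + B M B = B M (A + B) = B
print(np.allclose(B @ M @ A + B @ M @ B, B))  # True
# cancel B M B from both to get the identity
print(np.allclose(A @ M @ B, B @ M @ A))      # True
```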

Algebra is blind manipulation of symbols. The identity holds in any abstract ring, so long as $A + B$ is invertible there. But for a given model, the identity says something about that model. I've only seen the identity in the context of matrices, but I don't see what's so special about it there.

What does this identity do for matrices?

Is there something in matrix theory for which this is special? Does it ever really come up in proofs? Matrices of the form $(A+B)^{-1}$ can't be the only ones that allow such quasi-commutativity, can they? Is there a visualization, a geometric interpretation, or a meaningful interpretation of any kind?

Best Answer

In the case where $A$, $B$, and $A+B$ are all invertible, taking the inverse of both sides of the equality gives:

$$B^{-1}(A+B)A^{-1}=A^{-1}(A+B)B^{-1}$$

But a simple calculation shows the left side is $B^{-1}+A^{-1}$ and the right side is $A^{-1}+B^{-1}$, and these are equal because matrix addition is commutative.
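
Numerically (a numpy sketch; random $A$ and $B$ are invertible almost surely, as is $A+B$):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
Ai, Bi = np.linalg.inv(A), np.linalg.inv(B)

# both sides of the inverted identity collapse to B^{-1} + A^{-1}
print(np.allclose(Bi @ (A + B) @ Ai, Bi + Ai))  # True
print(np.allclose(Ai @ (A + B) @ Bi, Ai + Bi))  # True
# inverting back recovers the left side of the original identity
print(np.allclose(np.linalg.inv(Bi @ (A + B) @ Ai),
                  A @ np.linalg.inv(A + B) @ B))  # True
```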


The general case can be shown by noticing that the set of pairs $(A,B)$ such that $A$, $B$, and $A+B$ are all invertible is dense in the set of pairs such that $A+B$ is invertible. Since both sides of the identity are continuous functions of $(A,B)$, that finishes it.

We can see the density by taking an arbitrary such pair $(A,B)$ and replacing it with $(A+\lambda I,\; B-\lambda I)$, where $\lambda$ is a positive value with smaller magnitude than every non-zero eigenvalue of $A$ and of $B$. Then $A+\lambda I$ and $B-\lambda I$ are invertible, their sum is still $A+B$, and letting $\lambda \to 0$ recovers the original pair.
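
A small numpy illustration of the perturbation (the particular matrices are just an example: $A$ and $B$ are singular, but $A+B$ is not):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 0.0]])          # singular
B = np.array([[0.0, 0.0],
              [0.0, 1.0]])          # singular, but A + B = I is invertible

lam = 1e-6                          # smaller than every non-zero eigenvalue
Ap = A + lam * np.eye(2)            # now invertible
Bp = B - lam * np.eye(2)            # now invertible, and Ap + Bp = A + B
M = np.linalg.inv(A + B)

print(np.allclose(Ap @ M @ Bp, Bp @ M @ Ap))  # True (all three invertible)
print(np.allclose(A @ M @ B, B @ M @ A))      # True (the limit lam -> 0)
```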


That continuity argument, of course, doesn't extend to matrices over discrete fields such as finite fields, yet the equality is true in any ring. If $R$ is a ring (with identity $I$) and $a,c,d\in R$ are such that $dc=cd=I$ $(*)$, then:

$$ad(c-a)=(c-a)da,\tag{1}$$

because the left side is $adc-ada=a-ada$ and the right side is $cda-ada=a-ada,$ so they are equal.
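
One can check $(1)$ mechanically with sympy's noncommutative symbols (a sketch; it assumes sympy's `subs` replaces the contiguous factors $dc$ and $cd$):

```python
from sympy import symbols, expand

a, c, d = symbols('a c d', commutative=False)

lhs = expand(a * d * (c - a))   # a*d*c - a*d*a
rhs = expand((c - a) * d * a)   # c*d*a - a*d*a

# impose d*c = I and c*d = I
lhs = lhs.subs(d * c, 1)        # -> a - a*d*a
rhs = rhs.subs(c * d, 1)        # -> a - a*d*a
print(lhs - rhs == 0)           # True
```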

Now given $a,b\in R$ such that $a+b$ is invertible in $R,$ let $c=a+b$ and $d=(a+b)^{-1}$. Then $b=c-a,$ so $(1)$ becomes $$a(a+b)^{-1}b=b(a+b)^{-1}a.$$


$(*)$ There are rings where $cd=I$ does not imply $dc=I,$ but for square matrices, $DC=I$ implies $CD=I.$ So for a general ring, we need both conditions, $cd=I$ and $dc=I.$
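
A standard example of such a one-sided inverse is the pair of shift operators on one-sided infinite sequences; a toy Python illustration, with sequences modelled as functions on the non-negative integers:

```python
def S(x):
    """Left shift: (S x)(n) = x(n + 1)."""
    return lambda n: x(n + 1)

def T(x):
    """Right shift: (T x)(n) = x(n - 1), with the convention x(-1) := 0."""
    return lambda n: x(n - 1) if n >= 1 else 0

x = lambda n: n + 1                    # the sequence 1, 2, 3, ...
print([S(T(x))(k) for k in range(5)])  # [1, 2, 3, 4, 5]  -> S T = identity
print([T(S(x))(k) for k in range(5)])  # [0, 2, 3, 4, 5]  -> T S != identity
```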
