Linear Algebra – If Product of Two Non-Zero Square Matrices is Zero, Both Must Be Singular

linear algebra

In the textbook Contemporary Linear Algebra by Anton and Busby, section 3.2 (page 101) poses a small question concerning this: show that if $A$ and $B$ are two non-zero square matrices such that $AB=0$, then $A$ and $B$ must both be singular. Why is this so?

I can prove it: if $A$ is non-singular, then $B = I_nB = A^{-1}AB = A^{-1}0 = 0$, so $B$ must be the zero matrix, which is a contradiction. Similarly, if $B$ is non-singular, then $A$ must be the zero matrix. Hence both must be singular. But this doesn't really answer why; it just derives a contradiction in each case, and so establishes the negation of the supposition that at least one of them is non-singular.
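For a concrete illustration (my own example, not from the book), take

$$A = \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix}, \qquad B = \begin{pmatrix} 1 & -1 \\ -1 & 1 \end{pmatrix}.$$

Then $AB = 0$ with $A \neq 0$ and $B \neq 0$, and indeed both are singular, since $\det A = \det B = 0$.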

I would like to know the essence and the inherent property behind why they must both be singular (why can't it be the case that only one of them is singular?), and what the motivation for such a conclusion is.

Best Answer

As Thomas points out, your proof is fine, but if you want another way to look at it, consider the following:

Suppose $AB = 0$. What is the $j$-th column on either side of this equation? On the left, it is a linear combination of the columns $\mathbf a_1, \dots, \mathbf a_n$ of $A$, with coefficients taken from the $j$-th column of $B$; on the right, it is the zero vector:

$$b_{1j}\mathbf a_1 + b_{2j} \mathbf a_2 + \cdots + b_{nj}\mathbf a_n = \mathbf 0$$

This holds for every $j$, and since $B \neq 0$ there is at least one non-zero coefficient $b_{ij}$. For that $j$, the relation above is a non-trivial dependence, so the columns of $A$ are linearly dependent.
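For instance, with the $2\times 2$ matrices from the question, the first column of $B$ supplies the coefficients $b_{11} = 1$ and $b_{21} = -1$, and indeed

$$1 \cdot \begin{pmatrix} 1 \\ 1 \end{pmatrix} + (-1) \cdot \begin{pmatrix} 1 \\ 1 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix},$$

a non-trivial dependence between the columns of $A$.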

Similarly, we can ask what the rows are on each side of the equation. The $i$-th row on the left is a linear combination of the rows $\mathbf b_1, \dots, \mathbf b_n$ of $B$, with coefficients taken from the $i$-th row of $A$, and on the right it is the zero row:

$$a_{i1}\mathbf b_1 + a_{i2} \mathbf b_2 + \cdots + a_{in}\mathbf b_n = \mathbf 0$$

Since $A \neq 0$, at least one of these coefficients is non-zero, so the rows of $B$ must be linearly dependent.
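In the same example, the first row of $A$ gives the coefficients $a_{11} = a_{12} = 1$, and

$$1 \cdot \begin{pmatrix} 1 & -1 \end{pmatrix} + 1 \cdot \begin{pmatrix} -1 & 1 \end{pmatrix} = \begin{pmatrix} 0 & 0 \end{pmatrix},$$

a non-trivial dependence between the rows of $B$.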