AC = BC implies A = B when C has full row rank

Tags: linear-algebra, solution-verification

Let $A$, $B$, $C$ be matrices. Is it true that $AC = BC$ implies $A = B$ when $C$ has full row rank? Equivalently, taking transposes: does $CA = CB$ with $C$ of full column rank imply $A = B$?

I think I have come up with a proof of the above statement; I'll explain the main idea for simplicity. Let $C$ be a matrix with $m$ rows and $n$ columns. Since it has full row rank, $n \geq m$. We can therefore split the columns of $C$ into two submatrices: the first, $C_{0}$, consists of any $m$ columns of $C$ that are linearly independent, and the second contains the rest. From $AC = BC$ we get in particular $AC_{0} = BC_{0}$; since $C_{0}$ is an invertible $m \times m$ matrix, multiplying by $C_{0}^{-1}$ on the right gives $A = B$, which is what we want to prove.
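As a numerical sanity check (not a proof), here is a small NumPy sketch of the column-selection idea: for a random full-row-rank $C$, take $m$ linearly independent columns $C_0$ and recover $A$ from the product $AC$ by multiplying with $C_0^{-1}$. The dimensions and seed are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 3, 5

# A random Gaussian C has full row rank with probability 1.
C = rng.standard_normal((m, n))
assert np.linalg.matrix_rank(C) == m

A = rng.standard_normal((2, m))
P = A @ C  # the product AC; pretend only P and C are known

# Pick m linearly independent columns of C (here the first m,
# which are independent for this generic random C).
C0 = C[:, :m]
assert np.linalg.matrix_rank(C0) == m

# A C0 equals the corresponding columns of P, so A = P[:, :m] C0^{-1}.
A_recovered = P[:, :m] @ np.linalg.inv(C0)
print(np.allclose(A_recovered, A))  # True
```

The same computation applied to $B$ would yield the identical result, which is exactly why $AC = BC$ forces $A = B$.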

I was thinking about this while learning linear programming, and it's pretty odd that I did not find a post asking about this. Would you mind letting me know whether the above statement is correct?

Best Answer

Yes, it's true. Here's the argument I would offer. It's essentially equivalent to yours, although a bit more elementary: I don't need to invoke the fact that row rank equals column rank.

Since $C$ has full row rank, every vector in $\Bbb R^m$ can be written as $Cx$ for some $x\in\Bbb R^n$. In particular, each standard basis vector $e_j\in\Bbb R^m$ can be written as $e_j = Cx_j$ for some $x_j\in\Bbb R^n$. Now suppose that $(A-B)C=0$. It follows that $0=(A-B)Cx_j = (A-B)e_j$ for each $j=1,\dots,m$. But $(A-B)e_j$ is the $j$-th column of $A-B$, so every column of $A-B$ is zero; that is, $A=B$.
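This argument can also be checked numerically. A least-squares solve finds, for each standard basis vector $e_j$, some $x_j$ with $Cx_j = e_j$ (solvable exactly because $C$ has full row rank), and then $(A-B)Cx_j$ reproduces the $j$-th column of $A-B$. The matrix $D$ below is a hypothetical stand-in for $A-B$.

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 3, 5
C = rng.standard_normal((m, n))  # full row rank with probability 1

D = rng.standard_normal((2, m))  # plays the role of A - B

# Solve C x_j = e_j for all j at once: columns of X are the x_j.
# lstsq returns an exact solution since e_j lies in the column space of C.
X = np.linalg.lstsq(C, np.eye(m), rcond=None)[0]
assert np.allclose(C @ X, np.eye(m))  # C x_j = e_j

# D C X = D (C X) = D I = D, so the columns (D C) x_j recover D.
# Hence if D C = 0, every column of D is zero.
assert np.allclose(D @ C @ X, D)
```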