[Math] Proving that a right (or left) inverse of a square matrix is unique using only basic matrix operations

inverse, linear-algebra, matrices

Proving that a right (or left) inverse of a square matrix is unique using only basic matrix operations

— i.e. without any reference to higher-level notions like rank, vector spaces or whatever. :)

More precisely, armed only with the knowledge of:

  • the rules of matrix equality, addition, and multiplication, the distributive law, and friends
  • Gauss-Jordan elimination and how to read off the solution cases of a linear system from its reduced row-echelon form

Thanks in advance.

Best Answer

So, let us suppose that $A$ is a square matrix, and that $B$ is a matrix such that $BA=I$. You want to show that $B$ is the unique left inverse of $A$ (that is, that any matrix $C$ satisfying $CA=I$ must equal $B$).

Note that a system $A\mathbf{x}=\mathbf{b}$ has at most one solution, namely $B\mathbf{b}$: if $A\mathbf{x}=\mathbf{b}$, then $$\mathbf{x} = I\mathbf{x} = BA\mathbf{x} = B\mathbf{b}.$$
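For a concrete illustration (the particular matrices below are my own example, not part of the original question), take
$$A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}, \qquad B = \begin{pmatrix} -2 & 1 \\ \tfrac{3}{2} & -\tfrac{1}{2} \end{pmatrix}, \qquad BA = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} = I.$$
If $A\mathbf{x}=\mathbf{b}$ with, say, $\mathbf{b}=\begin{pmatrix}1\\1\end{pmatrix}$, then multiplying on the left by $B$ forces $\mathbf{x}=B\mathbf{b}=\begin{pmatrix}-1\\1\end{pmatrix}$, and a direct check confirms that $A\begin{pmatrix}-1\\1\end{pmatrix}=\begin{pmatrix}1\\1\end{pmatrix}$.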

If $CA=I$, then again a system $A\mathbf{x}=\mathbf{b}$ has at most one solution, namely $C\mathbf{b}$. Thus, $B\mathbf{b}=C\mathbf{b}$ for any $\mathbf{b}$ for which the system has a solution.

If we can show that $A\mathbf{x}=\mathbf{e}_i$ has a solution for each $i$, where $\mathbf{e}_i$ is the $i$th standard basis vector ($1$ in the $i$th entry, $0$s elsewhere), this will show that $B=C$, since their columns will then all agree.

The homogeneous system $A\mathbf{x}=\mathbf{0}$ always has the solution $\mathbf{x}=\mathbf{0}$, and by the observation above any solution must equal $B\mathbf{0}=\mathbf{0}$, so the trivial solution is the only one. That means there are no free variables, so every column of the reduced row-echelon form of $A$ contains a pivot; since $A$ is square, the reduced row-echelon form of $A$ is $I$. Because the reduced row-echelon form of $A$ is $I$, performing row reduction on the augmented coefficient matrix $[A|\mathbf{e}_i]$ yields the matrix $[I|\mathbf{y}]$ for some $\mathbf{y}$, with $\mathbf{y}$ being the (unique) solution to $A\mathbf{x}=\mathbf{e}_i$. Since this vector is equal to both $\mathbf{b}_i=B\mathbf{e}_i$ (the $i$th column of $B$) and to $\mathbf{c}_i=C\mathbf{e}_i$ (the $i$th column of $C$), as noted above, the $i$th columns of $B$ and $C$ are equal; thus, $B=C$, and $A$ has a unique left inverse.
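To see the row-reduction step in action with the same illustrative $A$ as before (again, just an example rather than part of the argument):
$$[A|\mathbf{e}_1] = \left[\begin{array}{cc|c} 1 & 2 & 1 \\ 3 & 4 & 0 \end{array}\right] \longrightarrow \left[\begin{array}{cc|c} 1 & 2 & 1 \\ 0 & -2 & -3 \end{array}\right] \longrightarrow \left[\begin{array}{cc|c} 1 & 0 & -2 \\ 0 & 1 & \tfrac{3}{2} \end{array}\right] = [I|\mathbf{y}],$$
and the resulting $\mathbf{y}=\begin{pmatrix}-2\\ \tfrac{3}{2}\end{pmatrix}$ is exactly the first column of $B$, as the argument predicts.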

Now, let us suppose that $A$ is a square matrix with a right inverse $B$, so that $AB=I$. We want to show that $B$ is the unique right inverse of $A$. Taking transposes, we get $I = I^T = (AB)^T = B^TA^T$. By what was proven above, $B^T$ is the unique left inverse of $A^T$. If $AC=I$ as well, then $C^TA^T=(AC)^T=I^T = I$, so $C^T=B^T$ by that uniqueness, hence $C=B$. Thus, $B$ is the unique right inverse of $A$.
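As a quick numerical sanity check of the transpose trick (same illustrative matrices as above, for which $AB=I$ as well):
$$B^TA^T = \begin{pmatrix} -2 & \tfrac{3}{2} \\ 1 & -\tfrac{1}{2} \end{pmatrix}\begin{pmatrix} 1 & 3 \\ 2 & 4 \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} = (AB)^T.$$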