Proof That Gauss-Jordan Elimination Works – Linear Algebra

Tags: gaussian-elimination, inverse, linear-algebra, matrices

Gauss-Jordan elimination is a technique that can be used to calculate the inverse of matrices (if they are invertible). It can also be used to solve simultaneous linear equations.
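For concreteness, here is a small worked example of the procedure (the $2 \times 2$ matrix is illustrative, chosen for this example): augment $A$ with the identity and row-reduce until the left block becomes $I$; the right block is then $A^{-1}$.

$$\left(\begin{array}{cc|cc} 2 & 1 & 1 & 0 \\ 5 & 3 & 0 & 1 \end{array}\right) \;\longrightarrow\; \left(\begin{array}{cc|cc} 1 & 0 & 3 & -1 \\ 0 & 1 & -5 & 2 \end{array}\right), \qquad A^{-1} = \begin{pmatrix} 3 & -1 \\ -5 & 2 \end{pmatrix}.$$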

However, after a few Google searches, I have failed to find a proof that this algorithm works for all $n \times n$ invertible matrices. How would you prove that Gauss-Jordan elimination can be used to compute the inverse of every invertible matrix of finite dimension (where we allow swapping of two rows)?

Induction on $n$ is a possible idea: the base case is very clear, but how would you prove the inductive step?

We are not trying to show that an answer generated by Gauss-Jordan is correct. We are trying to show that Gauss-Jordan can be carried out to completion on every invertible matrix.

Note: I realize that there is a similar question here, but this question is distinct in that it asks for a proof for invertible matrices.

Best Answer

This is one of the typical cases where the most obvious way to see that something is true is to check that the associated algorithm cannot possibly fail.

Roughly speaking, the only way Gauss-Jordan can get stuck is if, at some intermediate point, a pivot column contains too many zeroes: every entry at or below the pivot position is zero, so no row swap can produce a non-zero entry in the expected location. But if this happens in column $k$, the first $k - 1$ columns have already been reduced to the unit vectors $e_1, \dots, e_{k-1}$, and column $k$ has non-zero entries only in its first $k - 1$ rows. The first $k$ columns therefore all lie in a $(k-1)$-dimensional space and are linearly dependent, so the intermediate matrix is non-invertible. Since elementary row operations preserve invertibility, it must have been the original matrix that is to blame.
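To make the argument concrete, here is a minimal sketch of the algorithm in Python (an illustrative implementation, not from the original answer; the function name and tolerance `eps` are my own choices). The only place it can fail is the pivot search, and that failure occurs exactly when the matrix is singular:

```python
def gauss_jordan_inverse(A, eps=1e-12):
    """Invert an n x n matrix by Gauss-Jordan elimination on [A | I].

    Raises ValueError when no non-zero pivot can be found, which, by the
    argument above, happens exactly when A is not invertible.
    """
    n = len(A)
    # Build the augmented matrix [A | I].
    M = [list(map(float, row)) + [1.0 if i == j else 0.0 for j in range(n)]
         for i, row in enumerate(A)]

    for col in range(n):
        # Pivot search: find a row at or below `col` with a non-zero entry.
        pivot_row = next((r for r in range(col, n) if abs(M[r][col]) > eps), None)
        if pivot_row is None:
            # Every candidate pivot is zero: the matrix is singular.
            raise ValueError("matrix is not invertible")
        M[col], M[pivot_row] = M[pivot_row], M[col]  # swap rows

        # Scale the pivot row so the pivot entry becomes 1.
        p = M[col][col]
        M[col] = [x / p for x in M[col]]

        # Eliminate the pivot column from every other row.
        for r in range(n):
            if r != col:
                factor = M[r][col]
                M[r] = [x - factor * y for x, y in zip(M[r], M[col])]

    # The augmented matrix is now [I | A^{-1}]; return the right half.
    return [row[n:] for row in M]

print(gauss_jordan_inverse([[2, 1], [5, 3]]))  # [[3.0, -1.0], [-5.0, 2.0]]
```

Running the sketch on a singular matrix such as `[[1, 2], [2, 4]]` raises the `ValueError` at the second pivot: after clearing the first column, the second column is zero at and below the pivot position, which is precisely the failure mode the argument above rules out for invertible inputs.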
