[Math] Right Inverse of a matrix

linear-algebra, matrices, reference-request

I'm reading Linear Algebra by Bill Jacob and am having trouble with his development of the theory behind the right inverse of a matrix. I did an internet search but didn't find anything useful. Does anyone know of a reference that might explain the concept a little more clearly?

Best Answer

To find a left inverse of a square matrix, you perform row operations, keeping track of each operation by applying it to the identity matrix as well. Each row operation corresponds to multiplying on the left by an elementary matrix, and the composition of these operations (applied to the identity matrix) is a left inverse for your given matrix, provided its reduced row echelon form is the identity.
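The procedure above can be sketched in code (a minimal Gauss-Jordan illustration, not taken from Jacob's book): row-reduce the augmented matrix [A | I]; if A reduces to the identity, the right half is a left inverse L with LA = I.

```python
import numpy as np

def left_inverse_by_row_ops(A):
    """Row-reduce [A | I]; the row operations applied to I give a left inverse."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])      # augmented [A | I]
    for col in range(n):
        pivot = np.argmax(np.abs(M[col:, col])) + col  # partial pivoting
        if np.isclose(M[pivot, col], 0.0):
            raise ValueError("matrix is singular: no inverse exists")
        M[[col, pivot]] = M[[pivot, col]]            # swap rows
        M[col] /= M[col, col]                        # scale pivot row to 1
        for r in range(n):
            if r != col:
                M[r] -= M[r, col] * M[col]           # eliminate rest of column
    return M[:, n:]                                  # right half is the left inverse

A = np.array([[2.0, 1.0], [5.0, 3.0]])
L = left_inverse_by_row_ops(A)
print(np.allclose(L @ A, np.eye(2)))  # True
```

Every row operation is multiplication on the left by an elementary matrix, so the accumulated right half is exactly the product of those elementary matrices.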

By analogy, a right inverse corresponds to a sequence of column operations. You can work out what a column reduction is, even if it does not make algorithmic sense to compute one in practice. Later you will learn that, for a square matrix, the right inverse equals the left inverse. Try a few small 2×2 or 3×3 examples.
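A hedged sketch of the column-operation analogue (again not from the book): column operations act by multiplication on the right, so reducing A to the identity by column operations, while mirroring each one on the identity matrix, accumulates a matrix R with AR = I.

```python
import numpy as np

def right_inverse_by_col_ops(A):
    """Column-reduce the stacked matrix [A; I]; the bottom half becomes a right inverse."""
    n = A.shape[0]
    M = np.vstack([A.astype(float), np.eye(n)])      # stacked [A; I]
    for row in range(n):
        pivot = np.argmax(np.abs(M[row, row:])) + row  # pick a pivot column
        if np.isclose(M[row, pivot], 0.0):
            raise ValueError("matrix is singular: no inverse exists")
        M[:, [row, pivot]] = M[:, [pivot, row]]      # swap columns
        M[:, row] /= M[row, row]                     # scale pivot column
        for c in range(n):
            if c != row:
                M[:, c] -= M[row, c] * M[:, row]     # clear the rest of the row
    return M[n:, :]                                  # bottom half is the right inverse

A = np.array([[2.0, 1.0], [5.0, 3.0]])
R = right_inverse_by_col_ops(A)
print(np.allclose(A @ R, np.eye(2)))  # True
print(np.allclose(R @ A, np.eye(2)))  # True: right inverse equals left inverse
```

Running both sketches on the same matrix shows the two inverses agree, illustrating the claim that the right inverse is the left inverse.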
