The essence of Gaussian elimination

gaussian elimination, linear algebra

I'm confused about the essence of Gaussian elimination. Suppose there is a linear map $T: V\rightarrow W$ between vector spaces $V$ and $W$. There is then a matrix associated with the linear map $T$, and we can apply Gaussian elimination to that matrix.

The result is a new matrix, but what does this new matrix mean?

Also, does Gaussian elimination have anything to do with a change of basis?

(If it does, then when Gaussian elimination is used to solve linear equations, how should that process be interpreted in terms of changing bases? Gaussian elimination can also be used to determine whether a list of vectors is linearly dependent; how should that be interpreted in terms of changing bases?)

Best Answer

The matrix, say $A$, depends not only on $T$ but also on the choice of bases in $V$ and $W$. The $i$-th column of $A$ consists of the coefficients of the image of the $i$-th basis vector of $V$, written as a linear combination of the basis of $W$.
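As a small illustration (the map and the bases here are just an example I am choosing, not part of the question): let $V = W = \mathbb{R}^2$, with basis $e_1, e_2$ of $V$ and $f_1, f_2$ of $W$, and suppose $T(e_1) = 2f_1 + f_2$ and $T(e_2) = f_1 + 3f_2$. Then the matrix of $T$ with respect to these bases is
$$A = \begin{pmatrix} 2 & 1 \\ 1 & 3 \end{pmatrix},$$
whose first column holds the coefficients of $T(e_1)$ and whose second column holds those of $T(e_2)$.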

Row transformations of $A$ correspond to changes of the basis of $W$. For example, switching two rows corresponds to switching two elements of the basis. So Gaussian reduction produces a better basis of $W$. The new matrix represents the same $T$ with respect to the new pair of bases (the old basis of $V$ and the new basis of $W$).
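To check the row-swap case in the small example above (still using the illustrative $T$, $e_i$, $f_i$ chosen there): switching the two rows of $A$ gives
$$A' = \begin{pmatrix} 1 & 3 \\ 2 & 1 \end{pmatrix},$$
which is the matrix of the same $T$ with respect to the old basis $e_1, e_2$ of $V$ and the reordered basis $f_1' = f_2$, $f_2' = f_1$ of $W$: indeed $T(e_1) = 2f_1 + f_2 = 1\cdot f_1' + 2\cdot f_2'$ and $T(e_2) = f_1 + 3f_2 = 3\cdot f_1' + 1\cdot f_2'$. More generally, passing from $A$ to $EA$ for an elementary matrix $E$ amounts to replacing the basis of $W$ by the vectors $f_j' = \sum_i (E^{-1})_{ij} f_i$.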

Similarly, column transformations of $A$ correspond to changes of the basis of $V$.
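In the same illustrative example, a column operation $A \mapsto AE$ (for an elementary matrix $E$) produces the matrix of $T$ with respect to the old basis of $W$ and the new basis $e_j' = \sum_k E_{kj} e_k$ of $V$. For instance, switching the two columns of $A$ gives
$$AE = \begin{pmatrix} 1 & 2 \\ 3 & 1 \end{pmatrix},$$
the matrix of $T$ in the reordered basis $e_1' = e_2$, $e_2' = e_1$ of $V$: its columns are the coefficients of $T(e_2)$ and $T(e_1)$.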
