[Math] Does anybody know a proof of the Uniqueness of the Reduced Echelon Form Theorem?

linear-algebra, matrices

The book gives no proof that each matrix is row equivalent to one and only one reduced echelon matrix. Does anybody know how to prove this theorem?

"Theorem Uniqueness of the Reduced Echelon Form
Each matrix is row equivalent to one and only one reduced echelon matrix"
Source: Linear Algebra and Its Applications, David, C. Lay.

[EDIT: I think the following proves that each echelon matrix reduces to only one reduced echelon matrix, but how do I show that a matrix which is not in echelon form also reduces to only one reduced echelon matrix?]

In an $m \times n$ matrix in echelon form of a linear system, for some positive integers $m$ and $n$, let the leading entries ($■$) have any nonzero value and the starred entries ($☆$) have any value, including zero.

The leading entries $■$ of $R_1$ and $R_2$ in an echelon matrix become leading $1$s in a reduced echelon matrix by dividing each row by its leading entry, and the entry $☆$ of $R_1$ that lies above the leading entry of $R_2$ becomes $0$ by subtracting a suitable multiple of $R_2$ from $R_1$.

So $R_1$ and $R_2$ of a matrix in echelon form transform as follows:
$\begin{array}{rl}
R_1 & [\,■\ ☆\ \cdots\ ☆\ ☆\ ☆\ ☆\,]\\
R_2 & [\,0\ ■\ \cdots\ ☆\ ☆\ ☆\ ☆\,]
\end{array}
\qquad\longrightarrow\qquad
\begin{array}{rl}
R_1 & [\,1\ 0\ \cdots\ ☆\ ☆\ ☆\ ☆\,]\\
R_2 & [\,0\ 1\ \cdots\ ☆\ ☆\ ☆\ ☆\,]
\end{array}$
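For concreteness, here is a small worked instance of these two steps with made-up numbers (my own example, not from the book); the leading entries $2$ and $3$ end up exactly where the leading $1$s are:
$$\begin{bmatrix} 2 & 4 & 1 & 0\\ 0 & 3 & 6 & 9\end{bmatrix}
\;\xrightarrow{\,R_1/2,\ R_2/3\,}\;
\begin{bmatrix} 1 & 2 & \tfrac12 & 0\\ 0 & 1 & 2 & 3\end{bmatrix}
\;\xrightarrow{\,R_1-2R_2\,}\;
\begin{bmatrix} 1 & 0 & -\tfrac72 & -6\\ 0 & 1 & 2 & 3\end{bmatrix}$$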

For all integers $k$ with $2 \le k < m$, the rows $R_k$ and $R_{k+1}$ of the echelon matrix can be written as
$\begin{array}{rl}
R_k & [\,0\ \cdots\ 0\ ■\ ☆\ ☆\ \cdots\ ☆\,]\\
R_{k+1} & [\,0\ \cdots\ 0\ 0\ ■\ ☆\ \cdots\ ☆\,].
\end{array}$

Subtracting a suitable multiple of $R_{k+1}$ from $R_k$ makes the entry of $R_k$ above the leading $■$ of $R_{k+1}$ zero, and the leading $■$s of $R_k$ and $R_{k+1}$ become $1$s by dividing each row by its leading entry.

So these rows of the echelon matrix become the following in the reduced $m \times n$ echelon matrix:
$\begin{array}{rl}
R_k & [\,0\ \cdots\ 0\ ■\ ☆\ ☆\ \cdots\ ☆\,]\\
R_{k+1} & [\,0\ \cdots\ 0\ 0\ ■\ ☆\ \cdots\ ☆\,]
\end{array}
\qquad\longrightarrow\qquad
\begin{array}{rl}
R_k & [\,0\ \cdots\ 0\ 1\ 0\ ☆\ \cdots\ ☆\,]\\
R_{k+1} & [\,0\ \cdots\ 0\ 0\ 1\ ☆\ \cdots\ ☆\,]
\end{array}$

Hence the leading $1$s in the reduced echelon form of an $m \times n$ matrix of a linear system sit in the same positions as the leading nonzero entries of a matrix in echelon form of that system.
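This observation can be sanity-checked numerically; below is a small SymPy sketch with a made-up matrix of my own (Matrix.rref() returns the reduced echelon form together with the pivot column indices):

```python
from sympy import Matrix

# A made-up 3x5 matrix that is already in (non-reduced) echelon form;
# its leading entries sit in columns 0, 1 and 3.
E = Matrix([
    [2, 4, 1, 0, 5],
    [0, 3, 6, 9, 1],
    [0, 0, 0, 7, 2],
])

R, pivots = E.rref()   # reduced row echelon form and its pivot columns
print(R)
print(pivots)          # (0, 1, 3): the same columns as the leading entries of E
```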

Best Answer

When you perform elementary row operations, linear relations between the columns don't change. More precisely, if $A=[a_1\;a_2\;\dots\;a_n]$ is a matrix (given by its columns $a_i$), $B$ is obtained from $A$ by performing elementary row operations, and $1\le i_1,i_2,\dots,i_j,p\le n$ are column indices, then $$ b_p=\alpha_1b_{i_1}+\alpha_2b_{i_2}+\dots+\alpha_jb_{i_j} \quad\text{if and only if}\quad a_p=\alpha_1a_{i_1}+\alpha_2a_{i_2}+\dots+\alpha_ja_{i_j} $$ This follows from the fact that performing row operations is the same as multiplying $A$ on the left by invertible matrices: $B=FA$ for some invertible matrix $F$, and proving the above equivalence is then easy.
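For example, writing $b_i=Fa_i$ for each column, one direction is a one-line computation (the converse is the same computation applied to $A=F^{-1}B$):
$$a_p=\alpha_1a_{i_1}+\dots+\alpha_ja_{i_j}
\;\Longrightarrow\;
b_p=Fa_p=F\bigl(\alpha_1a_{i_1}+\dots+\alpha_ja_{i_j}\bigr)
=\alpha_1Fa_{i_1}+\dots+\alpha_jFa_{i_j}
=\alpha_1b_{i_1}+\dots+\alpha_jb_{i_j}.$$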

In particular, a set of columns of $B$ is linearly independent if and only if the corresponding set of columns of $A$ is linearly independent.

If $U=[u_1\;u_2\;\dots\;u_n]$ is a reduced row echelon form for $A$, let $u_{i_1},u_{i_2},\dots,u_{i_j}$ be the pivot columns, with $i_1<i_2<\dots<i_j$.

Note that if $p$ is not a pivot column and $i_k<p<i_{k+1}$ (with the convention $i_{j+1}=n+1$, so that columns to the right of the last pivot are covered as well), then $u_p$ is a linear combination of $u_{i_1},\dots,u_{i_k}$ and the coefficients of the linear combination are determined by $u_p$: if $$ u_p=\begin{bmatrix}\alpha_1\\\alpha_2\\\vdots\\\alpha_m\end{bmatrix} $$ then $\alpha_i=0$ for $i>k$ and $$ u_p=\alpha_1u_{i_1}+\alpha_2u_{i_2}+\dots+\alpha_ku_{i_k} $$ Therefore $$ a_p=\alpha_1a_{i_1}+\alpha_2a_{i_2}+\dots+\alpha_ka_{i_k} $$ and this linear combination is unique, because the set of columns $$ \{a_{i_1},a_{i_2},\dots,a_{i_k}\} $$ is linearly independent.
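As a concrete illustration with made-up numbers (my own example, not part of the original answer): if the reduced form is
$$U=\begin{bmatrix}1 & 3 & 0 & 2\\ 0 & 0 & 1 & 5\\ 0 & 0 & 0 & 0\end{bmatrix},$$
then the pivot columns are $u_1$ and $u_3$, and reading the entries of the non-pivot columns gives $u_2=3u_1$ and $u_4=2u_1+5u_3$, hence $a_2=3a_1$ and $a_4=2a_1+5a_3$ in the original matrix.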

Thus the positions of the pivot columns of $U$ are uniquely determined by the columns of $A$ (a column is a pivot column exactly when the corresponding column of $A$ is not a linear combination of the columns to its left), and the entries of the non-pivot columns are likewise determined by the linear relations between the columns of $A$. Hence $U$ is completely determined by $A$.
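If you want to see the theorem in action numerically, here is a small SymPy sketch with a made-up matrix (my own example): multiplying $A$ on the left by an invertible matrix $F$, i.e. applying some sequence of row operations, leaves the reduced echelon form unchanged.

```python
from sympy import Matrix

A = Matrix([
    [1, 3, 0, 2],
    [2, 6, 1, 9],
    [1, 3, 1, 7],
])

# F is invertible (det = 1), so B = F*A is row equivalent to A.
F = Matrix([
    [1, 0, 0],
    [3, 1, 0],
    [0, 2, 1],
])
B = F * A

# Both matrices have the same reduced row echelon form and pivot columns.
assert A.rref() == B.rref()
print(A.rref()[0])   # the common RREF (the same U as in the illustration above)
print(A.rref()[1])   # pivot columns: (0, 2)
```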