Hoffman and Kunze Chapter 2 Theorem 11 Clarification


I'm a bit confused by the proof of Theorem 11 in chapter 2 of the Hoffman and Kunze Linear Algebra book:

Theorem 11. Let $m$ and $n$ be positive integers and let $\mathbb{F}$ be a field.
Suppose $W$ is a subspace of $\mathbb{F}^n$ and $\dim(W)\leq m$. Then there is precisely one
$m\times n$ row-reduced echelon matrix over $\mathbb{F}$ which has $W$ as its row space.

Proof. There is at least one $m\times n$ row-reduced echelon matrix
with row space $W$… (I understand this part)

Now let $R$ be any row-reduced echelon matrix which has $W$ as its row
space. Let $\rho_1,\dots,\rho_r$ be the non-zero row vectors of $R$ and suppose that
the leading non-zero entry of $\rho_i$ occurs in column $k_i$, $i=1,\dots,r$. The
vectors $\rho_1,\dots,\rho_r$ form a basis for $W$ (Theorem 10). In the proof of Theorem 10, we
observed that if $\beta=(b_1, \dots , b_n)$ is in $W$, then the unique expression of $\beta$ as a linear combination of $\rho_1,\dots,\rho_r$ is
$$\beta=\sum_{i=1}^{r}b_{k_i}\rho_i.$$
Thus any vector $\beta$ is determined if one knows the coordinates $b_{k_i}$, $i=1,\dots,r$.
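(Aside: a toy example of my own, not from the book, to see this formula in action. Take $n=4$ and
$$\rho_1=(1,2,0,1),\qquad \rho_2=(0,0,1,4),$$
so $r=2$, $k_1=1$, $k_2=3$. For $\beta=(b_1,b_2,b_3,b_4)$ in $W=\operatorname{span}\{\rho_1,\rho_2\}$ the formula reads $\beta=b_1\rho_1+b_3\rho_2$, and indeed
$$b_1\rho_1+b_3\rho_2=(b_1,\,2b_1,\,b_3,\,b_1+4b_3),$$
so the non-pivot coordinates $b_2=2b_1$ and $b_4=b_1+4b_3$ are forced once $b_1$ and $b_3$ are known.)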

Suppose $\beta\in W$ and $\beta\ne0$. We claim the first non-zero coordinate
of $\beta$ occurs in one of the columns $k_s$. Since
$$\beta=\sum_{i=1}^{r}b_{k_i}\rho_i$$
and $\beta\ne0$, we can write
$$\beta=\sum_{i=s}^{r}b_{k_i}\rho_i\;,\;b_{k_s}\ne0,$$
where $s$ is the smallest index for which $b_{k_s}\ne0$.
From the fact that $R$ is a row-reduced echelon matrix one has $R_{ij}=0$ if $i>s$ and $j\leq k_s$: each row $\rho_i$ with $i>s$ has its leading entry in column $k_i>k_s$, and column $k_s$, being a pivot column, is zero in every row other than the $s$th. Thus
$$\beta=(0,\dots,0,b_{k_s},\dots,b_n)\;,\;b_{k_s}\ne0$$
and the first non-zero coordinate
of $\beta$ occurs in column $k_s$, one of the pivot columns.
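(Continuing my toy example: taking $b_1=0$ and $b_3=c\ne0$ gives $\beta=c\rho_2=(0,0,c,4c)$, whose first non-zero coordinate sits in column $k_2=3$; here $s=2$.)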

It is now clear that $R$ is uniquely determined by $W$. The description
of $R$ in terms of $W$ is as follows. We consider all vectors $\beta = (b_1, \dots , b_n)$
in $W$. If $\beta\ne0$, then the first non-zero coordinate of $\beta$ must occur in some
column $t$:
$$\beta=(0,\dots,0,b_t,\dots,b_n)\;,\;b_t\ne0.$$

Let $k_1, \dots , k_r$ be those positive integers $t$ such that there is some $\beta\ne0$
in $W$, the first non-zero coordinate of which occurs in column $t$. Arrange
$k_1, \dots , k_r$ in the order $k_1 < k_2 < \dots < k_r$. For each of the positive
integers $k_s$ there will be one and only one vector $\rho_s$ in $W$ such that the
$k_s$th coordinate of $\rho_s$ is $1$ and the $k_i$th coordinate of $\rho_s$ is $0$ for $i\ne s$. Then $R$ is the $m\times n$ matrix which has row vectors $\rho_1, \dots , \rho_r, 0, \dots , 0$.
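(Aside: my toy example again, run in reverse to watch this recipe produce $R$ from $W$ alone. Take $W=\operatorname{span}\{(1,2,1,5),(2,4,1,6)\}$ in $\mathbb{F}^4$ and $m=3$. Writing $\beta=a(1,2,1,5)+c(2,4,1,6)=(a+2c,\,2(a+2c),\,a+c,\,5a+6c)$, one checks that the first non-zero coordinate of a non-zero $\beta$ can only fall in column $1$ or column $3$, so $k_1=1$, $k_2=3$. Solving for the vector with a $1$ in column $1$ and a $0$ in column $3$ gives $\rho_1=(1,2,0,1)$, and the vector with a $0$ in column $1$ and a $1$ in column $3$ is $\rho_2=(0,0,1,4)$. Padding with one zero row yields
$$R=\begin{pmatrix}1&2&0&1\\0&0&1&4\\0&0&0&0\end{pmatrix},$$
the unique $3\times4$ row-reduced echelon matrix with row space $W$.)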

I understand that there will be at most one such vector $\rho_s$ since "any vector $\beta$ is determined if one knows the coordinates $b_{k_i}$, $i=1,\dots,r$", but how do we know there's one in $W$ at all? Is it because "there's at least one $m\times n$ row-reduced echelon matrix
with row space $W$"? Also, why is it "now clear that $R$ is uniquely determined by $W$"? Thanks.

Best Answer

how do we know there's one in $W$ at all? Is it because "there's at least one $m\times n$ row-reduced echelon matrix with row space $W$"?

Yes, you're right. The existence half of the proof produces at least one $m\times n$ row-reduced echelon matrix, say $R_0$, with row space $W$, and its non-zero rows all lie in $W$. The row-reduced echelon conditions say exactly that the row of $R_0$ whose leading entry falls in column $k_s$ has a $1$ there and a $0$ in every other pivot column $k_i$, $i\ne s$, so for each $k_s$ such a vector $\rho_s$ really does exist in $W$.

Also, why is it "now clear that $R$ is uniquely determined by $W$"?

This part follows from the uniqueness of each $\rho_s$. As I said before, the non-zero rows of any row-reduced echelon matrix $R$ with row space $W$ are exactly vectors of this form: the pivot columns $k_1,\dots,k_r$ are pinned down by $W$ (they are the only columns in which the first non-zero coordinate of a vector of $W$ can appear), and each $\rho_s$ is the unique vector in $W$ with a $1$ in column $k_s$ and a $0$ in the other pivot columns. Since each $\rho_s$ is uniquely determined by $W$ and the remaining rows are zero, $R$ itself is uniquely determined by $W$.
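As a sanity check outside the proof itself, here is a minimal sketch using sympy (the matrices are my own toy example; `Matrix.rref()` returns the reduced matrix together with the tuple of pivot column indices). Feeding two different spanning sets of the same row space through row reduction lands on the same matrix, just as the theorem predicts.

```python
import sympy as sp

# Two different spanning sets for the same row space W (my own toy example).
A = sp.Matrix([[1, 2, 1, 5],
               [2, 4, 1, 6]])
B = sp.Matrix([[1, 2, 0, 1],
               [0, 0, 1, 4]])

# rref() returns (reduced matrix, tuple of pivot column indices).
# Theorem 11 says the reduced matrix depends only on the row space,
# so both calls should agree.
print(A.rref())  # (Matrix([[1, 2, 0, 1], [0, 0, 1, 4]]), (0, 2))
print(B.rref())  # (Matrix([[1, 2, 0, 1], [0, 0, 1, 4]]), (0, 2))
```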