Linear Algebra – Hint to Efficiently Prove a Statement About a Linear Map

Tags: linear-algebra, linear-transformations, logic, vector-spaces

Exercise.

Suppose $T \in \mathcal{L}(V)$, and $v_1, \ldots, v_n$ and $u_1, \ldots, u_n$ are bases of $V$. Prove that the following are equivalent.

$\text{(a) }$ $T$ is injective.

$\text{(b) }$ The columns of $\mathcal{M}(T)$ are linearly independent in $\mathbf{F}^{n, \ 1}$.

$\text{(c) }$ The columns of $\mathcal{M}(T)$ span $\mathbf{F}^{n, \ 1}$.

$\text{(d) }$ The rows of $\mathcal{M}(T)$ span $\mathbf{F}^{1, \ n}$.

$\text{(e) }$ The rows of $\mathcal{M}(T)$ are linearly independent in $\mathbf{F}^{1, \ n}$.

Here $\mathcal{M}(T)$ means $\mathcal{M}(T, (v_1 ,\ldots, v_n), (u_1, \ldots, u_n))$.


Source.

Linear Algebra Done Right, Sheldon Axler, 4th Edition, Section 3C, Exercise 17.


Notation.

  • $\mathcal{L}(V)$ is the set of all linear maps from a vector space $V$ to itself.
  • $\mathcal{M}(T)$ denotes the matrix of a linear map $T$.
  • $\mathbf{F}$ denotes either the set of real numbers or the set of complex numbers.
  • $\mathbf{F}^{m, \ n}$ is the set of all $m$-by-$n$ matrices where the entries in each matrix are in $\mathbf{F}$.

Question.

Is there an efficient way to prove this? One approach would be to show that (a) implies each of the others, then that (b) implies each of the others, and so on. But that would mean proving $5 \times 4 = 20$ different implications!

Can I get away with showing that each of the statements above implies $\text{null }T = \{0\}$? I have a feeling the answer is no.

Best Answer

Plan.

We will prove this by showing

$$ (a) \implies (b) \implies (c) \implies (d) \implies (e) \implies (a) $$

This is sufficient to show the five statements are equivalent, by transitivity of implication: any implication between two of them follows by chaining around the cycle. For instance, $(c) \implies (b)$ follows from $(c) \implies (d) \implies (e) \implies (a) \implies (b)$.

In the proofs below, we will make use of the following definitions and results from the text up to this exercise:

[Screenshots of the relevant definitions and results from the text are omitted here; the proofs below cite them by number: definitions $\color{gray}{3.14}$ (injectivity), $\color{gray}{3.31}$ (matrix of a linear map), and $\color{gray}{3.52}$ (rank), and results $\color{blue}{2.38}$, $\color{blue}{2.39}$, $\color{blue}{2.42}$, $\color{blue}{3.40}$, and $\color{blue}{3.57}$.]

Any other statements that aren't justified in the proofs below should already be well known by the time the reader comes to this exercise in the text.

Also, the proofs below assume $\mathcal{M}(T)$ is denoted as

$$ \begin{pmatrix} A_{1,1} & \cdots & A_{1,n} \\ \vdots & & \vdots \\ A_{n,1} & \cdots & A_{n,n} \end{pmatrix} $$
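As a concrete illustration of this notation (a toy example of my own, not from the text), take $V = \mathbf{F}^2$ with both $v_1, v_2$ and $u_1, u_2$ equal to the standard basis, and let $T(x_1, x_2) = (x_1 + x_2, 2x_2)$:

```latex
% Toy example: V = F^2, standard bases, T(x1, x2) = (x1 + x2, 2*x2).
% Then Tv_1 = (1, 0) = 1*u_1 + 0*u_2 and Tv_2 = (1, 2) = 1*u_1 + 2*u_2, so
\[
\mathcal{M}(T) =
\begin{pmatrix}
1 & 1 \\
0 & 2
\end{pmatrix}.
\]
% Here T is injective, the columns are linearly independent and span
% F^{2,1}, and the rows are linearly independent and span F^{1,2} --
% all five conditions (a)-(e) hold simultaneously, as the exercise predicts.
```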


Proofs.

$(a) \implies (b)$

Suppose $T$ is injective. This implies $\text{null }T = \{0\}$; that is, $v=0$ is the only vector satisfying $Tv=0$. To show this implies $(b)$, we must show that

$$ \lambda_1 \begin{pmatrix} A_{1,1} \\ \vdots \\ A_{n,1} \end{pmatrix} + \cdots + \lambda_n \begin{pmatrix} A_{1,n} \\ \vdots \\ A_{n,n} \end{pmatrix} = 0 \tag{1} $$

is true only when $\lambda_1 = \cdots = \lambda_n = 0$.

By definition $\color{gray}{3.31}$, we have

$$ Tv_k = A_{1,k}u_1 + \cdots + A_{n,k}u_n \tag{2} $$

for $k=1,\ldots,n$. Multiplying equation $(2)$ by $\lambda_k$, summing over $k$, and using the linearity of $T$ gives

$$ T(\lambda_1 v_1 + \cdots + \lambda_n v_n) = \sum_{j=1}^{n}\left(\sum_{k=1}^{n} \lambda_k A_{j,k}\right)u_j \tag{3} $$

The inner sums $\sum_{k=1}^{n} \lambda_k A_{j,k}$ are precisely the entries of the left-hand side of equation $(1)$. So if equation $(1)$ holds, every coefficient of $u_j$ in equation $(3)$ is zero, and hence $T(\lambda_1 v_1 + \cdots + \lambda_n v_n) = 0$. Since $\text{null }T = \{0\}$, this forces $\lambda_1 v_1 + \cdots + \lambda_n v_n = 0$, and since $v_1, \ldots, v_n$ is linearly independent, we conclude $\lambda_1 = \cdots = \lambda_n = 0$. Thus the columns of $\mathcal{M}(T)$ are linearly independent. We have thus shown $(a) \implies (b)$.
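To see the contrapositive of $(a) \implies (b)$ in a small case (again a toy example of my own, not from the text): take $T(x_1, x_2) = (x_1 + x_2, x_1 + x_2)$ on $\mathbf{F}^2$ with standard bases. Then $T(1,-1) = (0,0)$, so $T$ is not injective, and the columns are visibly dependent:

```latex
% Non-injective example: T(x1, x2) = (x1 + x2, x1 + x2) on F^2.
% Here T(1, -1) = (0, 0), so null T != {0} and T is not injective.
\[
\mathcal{M}(T) =
\begin{pmatrix}
1 & 1 \\
1 & 1
\end{pmatrix},
\qquad
1 \cdot \begin{pmatrix} 1 \\ 1 \end{pmatrix}
+ (-1) \cdot \begin{pmatrix} 1 \\ 1 \end{pmatrix}
= \begin{pmatrix} 0 \\ 0 \end{pmatrix}.
\]
% The coefficients (1, -1) witnessing the column dependence are exactly
% the coordinates of a nonzero vector in null T.
```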


$(b) \implies (c)$

Suppose the columns of $\mathcal{M}(T)$ are linearly independent in $\mathbf{F}^{n,1}$. There are $n$ of these columns, and $\text{dim }\mathbf{F}^{n,1}=n$ by $\color{blue}{3.40}$. Thus, by $\color{blue}{2.38}$, the columns of $\mathcal{M}(T)$ form a basis of $\mathbf{F}^{n,1}$, and hence span $\mathbf{F}^{n,1}$. We have thus shown $(b) \implies (c)$.


$(c) \implies (d)$

Suppose the columns of $\mathcal{M}(T)$ span $\mathbf{F}^{n,1}$. There are $n$ of these columns, and $\text{dim }\mathbf{F}^{n,1}=n$ by $\color{blue}{3.40}$. Thus, by $\color{blue}{2.42}$, the columns of $\mathcal{M}(T)$ form a basis of $\mathbf{F}^{n,1}$, so the column rank of $\mathcal{M}(T)$ equals $n$ by definition $\color{gray}{3.52}$. By $\color{blue}{3.57}$, the row rank of $\mathcal{M}(T)$ also equals $n$; that is, the span of the rows of $\mathcal{M}(T)$ is a subspace of $\mathbf{F}^{1,n}$ of dimension $n$. Since $\text{dim }\mathbf{F}^{1,n}=n$, it follows from $\color{blue}{2.39}$ that the rows of $\mathcal{M}(T)$ span $\mathbf{F}^{1,n}$. We have thus shown $(c) \implies (d)$.


$(d) \implies (e)$

Suppose the rows of $\mathcal{M}(T)$ span $\mathbf{F}^{1,n}$. There are $n$ of these rows, and $\text{dim }\mathbf{F}^{1,n}=n$, so by $\color{blue}{2.42}$ the rows of $\mathcal{M}(T)$ form a basis of $\mathbf{F}^{1,n}$, and hence are linearly independent. We have thus shown $(d) \implies (e)$.


$(e) \implies (a)$

The approach here will be to investigate the columns of $\mathcal{M}(T)$ to show $T$ is injective rather than the rows of $\mathcal{M}(T)$ because definition $\color{gray}{3.31}$ is with respect to the columns.

Suppose the rows of $\mathcal{M}(T)$ are linearly independent in $\mathbf{F}^{1,n}$. There are $n$ of these rows, and $\text{dim }\mathbf{F}^{1,n}=n$, so by $\color{blue}{2.38}$ the rows of $\mathcal{M}(T)$ form a basis of $\mathbf{F}^{1,n}$. Thus, the row rank of $\mathcal{M}(T)$ equals $n$ by definition $\color{gray}{3.52}$. This implies the column rank of $\mathcal{M}(T)$ equals $n$ by $\color{blue}{3.57}$, so the $n$ columns of $\mathcal{M}(T)$ span $\mathbf{F}^{n,1}$; by $\color{blue}{2.42}$ they form a basis of $\mathbf{F}^{n,1}$ and hence are linearly independent.

Now suppose $v \in \text{null }T$. Since $v_1, \ldots, v_n$ is a basis of $V$, we can write $v = c_1v_1 + \cdots + c_nv_n$ for some $c_1, \ldots, c_n \in \mathbf{F}$. By definition $\color{gray}{3.31}$ and the linearity of $T$,

$$ 0 = Tv = c_1Tv_1 + \cdots + c_nTv_n = \sum_{j=1}^{n}\left(\sum_{k=1}^{n} c_kA_{j,k}\right)u_j \tag{4} $$

Since $u_1, \ldots, u_n$ is linearly independent, every coefficient of $u_j$ in equation $(4)$ must be zero. Equivalently,

$$ c_1 \begin{pmatrix} A_{1,1} \\ \vdots \\ A_{n,1} \end{pmatrix} + \cdots + c_n \begin{pmatrix} A_{1,n} \\ \vdots \\ A_{n,n} \end{pmatrix} = 0 $$

Because the columns of $\mathcal{M}(T)$ are linearly independent, this forces $c_1 = \cdots = c_n = 0$, so $v = 0$. Hence $\text{null }T = \{0\}$, and $T$ is injective, since a linear map is injective if and only if its null space equals $\{0\}$. We have thus shown $(e) \implies (a)$.
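For a quick sanity check of $(e) \iff (a)$ in the negative direction (one more toy example of my own): take $T(x_1, x_2) = (x_1, 0)$ on $\mathbf{F}^2$ with standard bases.

```latex
% Non-injective example: T(x1, x2) = (x1, 0) on F^2, standard bases.
\[
\mathcal{M}(T) =
\begin{pmatrix}
1 & 0 \\
0 & 0
\end{pmatrix}.
\]
% The second row is zero, so the rows are linearly dependent and (e)
% fails; correspondingly, null T = span{(0, 1)} != {0}, so (a) fails
% as well, consistent with the equivalence.
```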
