In the notation $T\colon\mathbb{R}^{n} \to \mathbb{R}^m$, does $\mathbb{R}^m$ refer to the codomain or the image of $T$?

notation, terminology, transformation, vector-spaces, vectors

Question:

It is my understanding that the term codomain refers to a set, of which the image is a subset (although, not necessarily a proper subset).

However, when speaking of vector transformations using notation like "$T\colon\mathbb{R}^{n} \to \mathbb{R}^m$",

  1. am I correct in understanding that $\mathbb{R}^m$ refers to the codomain, not the image?
  2. in the case of vector transformations, is it appropriate to speak of the image as a subspace of the codomain (vs. merely a subset)?

Apologies in advance for asking two questions, but given that the first is basically yes/no, I figured it was okay.


My Work (Thoughts)

I am fairly certain that the image of a linear transformation is a subspace of its codomain because, in that case, the transformation has a matrix representation

$$\boldsymbol{M} \, \in \, \mathbb{M}_{{\color{red}m}\times n}(\mathbb{R}),$$

and $\textrm{Im}\{\boldsymbol{M}\}$ is, in general, a subspace of $\mathbb{R}^{\color{red}m}$; however, I'm always wary of assuming the language can cross over (i.e. from the matrix to the transformation, in this case).

Moreover, based on the little I know about non-linear transformations, I would not expect their images to form subspaces. As such, I would assume that the term subset should always be used when describing the general relationship between the image and the codomain.
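To make this expectation concrete, here is a small sketch (the map $T(x, y) = (x^2, y)$ is my own illustrative choice, not from the question): its image is the right half-plane $\{(a, b) : a \ge 0\}$, which is a subset of $\mathbb{R}^2$ but not a subspace, since it is not closed under scaling by $-1$.

```python
# Illustrative sketch: the image of a NONLINEAR map need not be a subspace.
# T(x, y) = (x**2, y) has image {(a, b): a >= 0}, the right half-plane.

def T(x, y):
    return (x**2, y)

w = T(1.0, 0.0)           # (1.0, 0.0) lies in the image of T
neg_w = (-w[0], -w[1])    # (-1.0, 0.0)

# No input maps to neg_w, because x**2 >= 0 for every real x.
# So the image contains w but not -1 * w: it fails closure under
# scalar multiplication, hence it is not a subspace of R^2.
print(w, neg_w)
```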


Background / Level

Aside from real-valued functions and the usual fare seen in vector-calculus, I have no real experience with non-linear vector transformations. I have also never properly understood the importance of the distinction between image and codomain.

Best Answer

You are correct that $\mathbb{R}^m$ is the codomain of the map $T$. However, let's explore the roots of the confusion you note at the end of your question.

The image of $T$ is defined as $T(\mathbb{R}^n) := \{T(x): x \in \mathbb{R}^n \}$, read as "the set of all those $T(x)$ such that $x$ lies in $\mathbb{R}^n$". As you can see, this is a subset (and, if $T$ is linear, also a subspace${}^\dagger$) of $\mathbb{R}^m$. This set can equal the whole codomain: that happens exactly when $T$ is a surjective map, i.e. $T(\mathbb{R}^n) = \mathbb{R}^m$. If $T$ is not surjective, then there is a vector in $\mathbb{R}^m$ that is not the image of any vector in $\mathbb{R}^n$.
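A hedged numerical sketch of this point (the matrix values are my own, chosen only for illustration): for a linear map $T(x) = Mx$ with $M$ an $m \times n$ matrix, the image is the column space of $M$, so $T$ is surjective onto $\mathbb{R}^m$ exactly when $\operatorname{rank}(M) = m$.

```python
import numpy as np

# T: R^3 -> R^2 given by T(x) = M @ x; the image of T is the
# column space of M, a subspace of the codomain R^2.
M = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 3.0]])

m = M.shape[0]                       # dimension of the codomain
rank = np.linalg.matrix_rank(M)      # dimension of the image

# rank == m means image = codomain, i.e. T is surjective.
print(rank == m)  # True for this particular M
```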

Note that we can adapt this notion of image to subsets of the domain: e.g., if $A \subset \mathbb{R}^n$, then the image of $A$ is $T(A) := \{T(x): x \in A\}$.
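As a tiny illustration of this definition (the map and the finite set $A$ below are my own examples), the image of a subset is obtained simply by applying $T$ to each of its elements:

```python
# Illustrative linear map T(x, y) = (x + y, x - y) on R^2.
def T(v):
    x, y = v
    return (x + y, x - y)

# A finite subset A of the domain, and its image T(A) = {T(v): v in A}.
A = [(1, 0), (0, 1), (1, 1)]
T_of_A = [T(v) for v in A]
print(T_of_A)  # [(1, 1), (1, -1), (2, 0)]
```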

Note: other notations for the image of a map are also common, such as $\operatorname{im} T$; more rarely, the name of the map appears as a subscript to aid in specifying the image of a particular set with respect to $T$: $\operatorname{im}_{T}(A)$.


${}^\dagger$ Proof that if $A$ is a subspace of $\mathbb{R}^n$ and $T$ is a linear transformation, then $T(A)$ is a subspace of $\mathbb{R}^m$.

Suppose that $w_1, w_2 \in T(A)$. Then, by definition of $T(A)$, there exist $u_1, u_2 \in A$ such that $w_{1} = T(u_{1})$ and $w_{2} = T(u_{2})$. We want to show that $T(A)$ is closed under linear combinations, so consider the linear combination $\alpha w_{1} + \beta w_{2}$, where $\alpha, \beta$ are scalars. We compute: $$\begin{align*} \alpha w_{1} + \beta w_{2} &= \alpha T(u_1) + \beta T(u_2) \\ &= T(\alpha u_1) + T(\beta u_2) && \text{by the linearity of } T \\ &= T(\alpha u_1 + \beta u_2) && \text{again by the linearity of } T. \end{align*}$$ Since $A$ is a subspace, $u_{1}, u_2 \in A \implies \alpha u_1 + \beta u_2 \in A$. Thus $\alpha w_{1} + \beta w_{2} \in T(A)$, since it is the image of the vector $\alpha u_1 + \beta u_2 \in A$. Moreover, $T(A)$ is nonempty: $0 \in A$ because $A$ is a subspace, and $T(0) = 0$ by linearity, so $0 \in T(A)$. Hence $T(A)$ is closed under linear combinations and is therefore a subspace of $\mathbb{R}^m$. $\blacksquare$
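The key identity in the proof, $\alpha T(u_1) + \beta T(u_2) = T(\alpha u_1 + \beta u_2)$, can be sanity-checked numerically. This is only an illustration with values of my own choosing, not a substitute for the proof:

```python
import numpy as np

# A linear map T: R^2 -> R^3 represented by the matrix M.
M = np.array([[2.0, 1.0],
              [0.0, 3.0],
              [1.0, 1.0]])

u1 = np.array([1.0, 2.0])
u2 = np.array([-1.0, 4.0])
alpha, beta = 2.5, -0.5

# Linearity says these two vectors must coincide:
lhs = alpha * (M @ u1) + beta * (M @ u2)   # alpha*T(u1) + beta*T(u2)
rhs = M @ (alpha * u1 + beta * u2)         # T(alpha*u1 + beta*u2)
print(np.allclose(lhs, rhs))  # True
```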
