[Math] Proving a projection is a linear map

linear-algebra, vector-spaces

OK, I think I am on the right track but I am trying to figure out if I am going wrong somewhere.

Let $V$ be a vector space over a field $K$. Let $B = \{x_1, x_2, x_3, \ldots , x_n\}$ be a basis of $V$ over $K$. Then for all $x \in V$ there exist unique scalars $\lambda_1, \lambda_2,\ldots, \lambda_n$ in $K$ such that $\sum_{i=1}^n \lambda_i x_i = x$.

We call $\lambda_i$ the $i$-th component of $x$ with respect to the basis $B$, and we let $p_i(x)$ denote this $i$-th component, so that $p_i$ is a projection (coordinate) map.

Prove the following:

  1. Show that $p_i: V \to K$ is a linear map, and therefore $p_i \in \operatorname{Hom}_K(V, K)$.
  2. Show that $\{p_1, p_2,\ldots, p_n\}$ is a basis of $\operatorname{Hom}_K(V,K)$.
  3. Show that there is an isomorphism $\Psi : V \to \operatorname{Hom}_K(V,K)$ such that $\Psi(x_i) = p_i$ for all $1 \le i \le n$.

I got through part of this. My reasoning was that since $p_i$ is a projection onto $V$, by definition any vector $p_i(x) \in V$, which means that like any other vector in the space it can be expressed as a linear combination with some scalars (call them $\mu_i$) such that $p_i(x) = \mu_1 x_1 + \mu_2 x_2 + \cdots + \mu_n x_n = \sum^n_{i=1}\mu_i x_i$.

Since it can be expressed that way, any other vector in $V$, say $y$, can also be expressed as a linear combination of the basis elements (say, $\sum_i \lambda_i x_i$). So if we add $x$ and $y$ and plug them into the projection map we get

$p_i(x+y) = \sum^n_{i=1}(\mu_i x_i + \lambda_i x_i)$, which with a little algebra can be separated into $\sum_{i=1}^n \mu_i x_i + \sum_{i=1}^n \lambda_i x_i$. A similar argument shows that it preserves scalar multiplication.

Is this reasoning correct? Because if it is, my next step was to say that, given that $p_i$ is a linear map, and that it is from $V$ to $V$, the number of dimensions it has is going to be the same as that of the dual space. But I am not convinced that's a good reason.

Any input would be welcome.

Best Answer

First let's prove that $p_i : V \to K$ is a linear map. Let $v, w \in V$, where $v = a_1x_1 + \dots + a_nx_n$ and $w = b_1x_1 + \dots + b_nx_n$. Since $v + w = (a_1 + b_1)x_1 + \dots + (a_n + b_n)x_n$, we have $p_i(v + w) = a_i + b_i = p_i(v) + p_i(w)$. Now suppose $c \in K$. Clearly, $cv = (ca_1)x_1 + \dots + (ca_n)x_n$, and thus $p_i(cv) = ca_i = cp_i(v)$.
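As a numerical sanity check (my own addition, not part of the original answer), the linearity of a coordinate functional can be verified with NumPy. The basis matrix `B` below is an illustrative assumption: its columns play the role of $x_1, x_2, x_3$ in $\mathbb{R}^3$, and $p_i(x)$ is computed by solving $B\lambda = x$ for the coordinate vector $\lambda$.

```python
import numpy as np

# Columns of B are the basis vectors x_1, x_2, x_3 of R^3 (an example basis).
B = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])

def p(i, x):
    """i-th component of x with respect to the basis given by B's columns."""
    return np.linalg.solve(B, x)[i]

v = np.array([2.0, -1.0, 3.0])
w = np.array([0.5, 4.0, 1.0])
c = 7.0

# Additivity and homogeneity, checked up to floating-point error:
assert abs(p(0, v + w) - (p(0, v) + p(0, w))) < 1e-9
assert abs(p(0, c * v) - c * p(0, v)) < 1e-9
```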

For the second question, if you can use the fact that $\operatorname{Hom}_K(V, K)$ has dimension $n$, it suffices to show that $\{p_1, \dots, p_n\}$ is linearly independent. Suppose $c_1, \dots, c_n \in K$ are such that

$$c_1p_1 + \dots + c_np_n = 0 \in \operatorname{Hom}_K(V, K)$$

If $c_i \ne 0$ for some $i$, we have

$$(c_1p_1 + \dots + c_np_n)(x_i) = c_1p_1(x_i) + \dots + c_np_n(x_i) = c_ip_i(x_i) = c_i \in K$$

in which case $c_1p_1 + \dots + c_np_n$ is not zero. Alternatively, it's not too hard to show that $\{p_1, \dots, p_n\}$ is a spanning set for $\operatorname{Hom}_K(V, K)$. Any $T \in \operatorname{Hom}_K(V, K)$ is completely determined by its value at the basis elements, so say we have $T(x_i) = \lambda_i$. Then

$$(\lambda_1p_1 + \dots + \lambda_np_n)(x_i) = \lambda_1p_1(x_i) + \dots + \lambda_np_n(x_i) = \lambda_ip_i(x_i) = \lambda_i $$

Since both maps agree on basis elements, we conclude that $\lambda_1p_1 + \dots + \lambda_np_n = T$.
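The spanning argument can also be checked numerically (again my own illustrative sketch, reusing an example basis in $\mathbb{R}^3$): represent a functional $T$ by a vector $t$ with $T(x) = t \cdot x$, set $\lambda_i = T(x_i)$, and confirm that $\sum_i \lambda_i p_i$ agrees with $T$ on an arbitrary vector.

```python
import numpy as np

# Columns of B are the basis vectors x_1, x_2, x_3 (an example basis of R^3).
B = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])

def p(i, x):
    """i-th coordinate functional with respect to B's columns."""
    return np.linalg.solve(B, x)[i]

t = np.array([3.0, -2.0, 5.0])          # T(x) = t . x, an arbitrary functional
lam = [t @ B[:, i] for i in range(3)]   # lambda_i = T(x_i)

x = np.array([1.0, 2.0, -1.0])
# T and sum_i lambda_i p_i agree on x (up to floating-point error):
assert abs(t @ x - sum(lam[i] * p(i, x) for i in range(3))) < 1e-9
```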

For 3, I don't think there is much to show. Define $\Psi: V \to \operatorname{Hom}_K(V, K)$ on the basis elements by $\Psi(x_i) = p_i$; this extends $K$-linearly to a well-defined linear map $V \to \operatorname{Hom}_K(V, K)$. Since $\Psi$ sends the basis $\{x_1, \dots, x_n\}$ to the basis $\{p_1, \dots, p_n\}$ from part 2, it is an isomorphism.
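For intuition (my own sketch with an example basis, not part of the answer): in coordinates, each $p_i$ is represented by the $i$-th row of $B^{-1}$, so $p_i(x_j) = \delta_{ij}$, and the images $\Psi(x_i) = p_i$ are linearly independent because $B^{-1}$ is invertible.

```python
import numpy as np

# Columns of B are the basis vectors x_1, x_2, x_3 (an example basis of R^3).
B = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])

Binv = np.linalg.inv(B)   # row i of Binv is the matrix of the functional p_i

# Duality relation p_i(x_j) = delta_ij: Binv @ B is the identity matrix.
assert np.allclose(Binv @ B, np.eye(3))

# The rows of Binv (the images Psi(x_i) = p_i) are linearly independent,
# since Binv is invertible:
assert abs(np.linalg.det(Binv)) > 1e-12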

Related Question