Duality Functor – Understanding in Linear Algebra

category-theory, examples-counterexamples, intuition, linear-algebra

I'm trying to gain an intuitive understanding of the following construction:

For any vector space $M$ over a field $R$, one can define the algebraic dual of $M$ as $M^* := \mathsf{Hom}(M, R)$, and given another vector space $N$ one can define the algebraic dual of a linear map $A \in \mathsf{Hom}(M,N)$ by $A^*(\omega) = \omega \circ A$. This yields a linear map from $N^*$ to $M^*$, and since $(A \circ B)^* = B^* \circ A^*$ for composable linear maps $A$ and $B$, and $(1_M)^* = 1_{M^*}$, the "duality" operation $*$ sets up a contravariant endofunctor on the category of vector spaces.
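In coordinates (for finite-dimensional spaces) the dual map is just the transpose, so the functorial identities above can be checked numerically. Here is a minimal numpy sketch; the particular matrices are arbitrary choices for illustration:

```python
import numpy as np

# A represents a linear map M -> N (dim M = 2, dim N = 3);
# B represents a linear map L -> M (dim L = 4).
A = np.array([[1, 2],
              [3, 4],
              [5, 6]])
B = np.array([[1, 0, 2, 1],
              [0, 1, 1, 3]])

# A functional w on N, written as a coordinate vector. The dual map
# A* : N* -> M* sends w to w ∘ A, whose matrix realization is A^T w.
w = np.array([1, -1, 2])
assert np.array_equal(A.T @ w, w @ A)

# Contravariance: (A ∘ B)* = B* ∘ A*, i.e. (A B)^T = B^T A^T.
assert np.array_equal((A @ B).T, B.T @ A.T)

# The identity is sent to the identity: (1_M)* = 1_{M*}.
assert np.array_equal(np.eye(2).T, np.eye(2))
```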

I understand the basics of this construction, but what I would like to see are some good concrete examples that would illustrate the abstractions in a meaningful way. It's easy to come up with specific examples of linear maps and dual spaces where one applies this to specific linear forms, but are there some examples that would shed light on the motivation for these definitions and relations?

Ironically, I think I have a better understanding of this in purely categorical terms since the duality construct shows up repeatedly in different contexts and when I think of "duality" I think of precisely this construct…

Best Answer

Let $V$ be a finite-dimensional vector space with basis $e_1, \dots, e_n$. Then we may write any vector $v$ in the form

$$v = \sum c_i e_i$$

for some coefficients $c_i$. Sending a vector $v$ to the coefficient $c_i$ for fixed $i$ defines a linear functional $e_i^{\ast} : V \to k$. Together these linear functionals constitute the dual basis of $V^{\ast}$ associated to $e_1, \dots, e_n$. What confused me for a long time is that linear functionals do not transform in the same way as vectors under a change of coordinates: we say that vectors transform covariantly, while linear functionals transform contravariantly. Before I understood this I was constantly confusing transforming a vector with transforming its components.
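The dual basis and its opposite transformation behavior can be made concrete in coordinates: if the basis vectors are the columns of an invertible matrix $E$, the dual-basis functionals are the rows of $E^{-1}$. A small numpy sketch (the basis and change-of-basis matrices are arbitrary illustrative choices):

```python
import numpy as np

# A basis of R^3, stored as the columns of E (an arbitrary example).
E = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])

# The dual-basis functionals e_i^* are the rows of E^{-1}: row i applied
# to v extracts the coefficient c_i of v in the basis e_1, ..., e_n.
E_inv = np.linalg.inv(E)
v = np.array([2.0, 3.0, 5.0])
c = E_inv @ v                   # coordinates of v in the basis
assert np.allclose(E @ c, v)    # v = sum_i c_i e_i

# Change of basis by an invertible P: the new basis vectors are E @ P,
# but the dual-basis rows become P^{-1} @ E_inv, picking up the inverse
# of P -- the opposite transformation law described above.
P = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 0.0],
              [1.0, 0.0, 1.0]])
E2_inv = np.linalg.inv(E @ P)
assert np.allclose(E2_inv, np.linalg.inv(P) @ E_inv)
```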

For an infinite-dimensional example, consider the vector space $k[x]$ of polynomials in one variable over a field $k$. It has a distinguished set of dual vectors given by the functionals $[x^n]$ which return the coefficient of $x^n$ in a polynomial. Suggestively, one can write these functionals as $\left. \frac{1}{n!} \frac{d^n}{dx^n} \right|_{x = 0}$. It turns out that the dual space $k[x]^{\ast}$ is precisely the direct product of the one-dimensional spaces spanned by these dual vectors; for example, the dual space contains vectors that ought to be called $$\left. e^{t \frac{d}{dx}} \right|_{x=0} = \sum_{n \ge 0} \left. \frac{t^n}{n!} \frac{d^n}{dx^n} \right|_{x=0}$$

which, given a polynomial $f(x)$, return the numerical value $f(t)$.

Thinking of $\left. \frac{d^0}{dx^0} \right|_{x=0}$ as a toy model for the Dirac delta function, you can think of this construction as a toy model for (Schwartz) distributions.
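The coefficient functionals and the "evaluation at $t$" functional are easy to play with directly. Below is a small Python sketch, representing a polynomial by its coefficient list; the names `coeff` and `eval_at` are hypothetical helpers for illustration:

```python
# Represent a polynomial by its coefficient list: f = sum_n a_n x^n.
f = [1.0, -2.0, 0.0, 3.0]          # f(x) = 1 - 2x + 3x^3

def coeff(n, poly):
    """The dual vector [x^n]: return the coefficient of x^n."""
    return poly[n] if n < len(poly) else 0.0

# The evaluation functional is sum_n t^n [x^n], matching the expansion of
# e^{t d/dx} |_{x=0} term by term (the n! factors cancel).
def eval_at(t, poly):
    return sum(t**n * coeff(n, poly) for n in range(len(poly)))

assert coeff(1, f) == -2.0
assert eval_at(2.0, f) == 21.0     # f(2) = 1 - 4 + 24 = 21
```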


In differential geometry, the dual of a tangent space $T_p(M)$ at a point $p$ on a manifold $M$ is the cotangent space $T_p^{\ast}(M)$ at $p$. Just as the tangent space captures the infinitesimal behavior of smooth functions $\mathbb{R} \to M$ near $p$ (curves), the cotangent space captures the infinitesimal behavior of smooth functions $M \to \mathbb{R}$ near $p$ (coordinates). Just as a nice family of tangent vectors gives a vector field, a nice family of cotangent vectors gives a 1-form. In classical mechanics, the cotangent bundle is the phase space of a classical particle traveling on $M$; cotangent vectors give momenta.


For me duality really shines when you combine it with tensor products and start using the language of tensors. Then you can describe any kind of linear-ish thing using a combination of tensor products and duals, at least for finite-dimensional vector spaces:

  • What's a linear function $V \to W$? It's an element of $V^{\ast} \otimes W$.
  • What's a bilinear form $V \times V \to k$? It's an element of $V^{\ast} \otimes V^{\ast}$.
  • What's a multiplication $V \times V \to V$? It's an element of $V^{\ast} \otimes V^{\ast} \otimes V$.

When you have a bunch of linear-ish things around, writing them all as tensors helps you keep track of exactly how you can combine them (using tensor contraction). For example, an endomorphism $V \to V$ is an element of $V^{\ast} \otimes V$, and there is a distinguished dual pairing $$V^{\ast} \otimes V \to k.$$

What does this do to endomorphisms? It's just the trace!
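This contraction is easy to see in index notation, where an endomorphism has one dual ("down") index and one "up" index and the pairing sets them equal and sums. A minimal numpy sketch with an arbitrary matrix:

```python
import numpy as np

# An endomorphism of V (dim 2) as an element of V* ⊗ V: in coordinates,
# a matrix T with one dual index and one vector index.
T = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# The dual pairing V* ⊗ V -> k contracts the two indices (set them equal
# and sum). Applied to an endomorphism, this is exactly the trace.
pairing = np.einsum('ii->', T)
assert np.isclose(pairing, np.trace(T))   # 1 + 4 = 5
```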