[Math] If a tensor space is a vector space, then why isn’t a tensor a vector


On page 78 of Tensor Analysis on Manifolds, Bishop and Goldberg state:

Let $V$ be a vector space. The scalar-valued multilinear functions with variables all in either $V$ or $V^*$ are called tensors over $V$ and the vector spaces they form are called the tensor spaces over $V$.

I thought a vector was a tensor and not the other way around. I know a tensor is a multilinear map from a product of vector spaces and dual spaces to $\Bbb{R}$.

Is a vector a tensor or a tensor a vector? Can someone clarify this for me?

Best Answer

A tensor is a vector of the tensor space $T^{(m, n)}(V)$, which itself forms a vector space.

A tensor space $T^{(m, n)}(V)$ is defined over a vector space $V$ as the set of multilinear functions $f$ with the following domain and codomain:

$$ T^{(m, n)}(V) = \{ f : V^m \times (V^*)^n \to \mathbb{R} \mid f \text{ is multilinear} \} $$
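
For instance, the standard dot product on $\mathbb{R}^n$ is a $(2, 0)$ tensor: it takes two vectors, outputs a number, and is linear in each argument:

$$ g(u, w) = \sum_{i=1}^{n} u_i w_i, \qquad g \in T^{(2, 0)}(\mathbb{R}^n) $$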

Members of $T^{(m, n)}(V)$ are therefore different from members of $V$; one obvious reason is that the two spaces have different dimensions in general: $\dim(V)$ versus $\dim(V)^{m + n}$.
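
For example, with $\dim(V) = 3$ and $(m, n) = (1, 1)$, the tensor space has dimension $3^{1 + 1} = 9$, while $V$ itself has dimension $3$.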

Although all vector spaces of the same dimension are isomorphic, a tensor space has a bit more structure than just the underlying vector space: we need to specify $m$ and $n$ for it as well. So given an arbitrary vector space $W$, it doesn't make much sense to say that members of that space are tensors unless we specify more information about which tensor space we are talking about by giving $m$ and $n$.

One identification that is natural to make is to identify a vector $v$ of $V$ with a $(0, 1)$ tensor over $V$. By definition, a $(0, 1)$ tensor is a function $T(f)$ that:

  • takes a dual vector $f$ as input, where $f$, by the definition of the dual space, is a function that maps a vector $v$ of $V$ to a number
  • and outputs a number

Therefore, given $v$, we can create a $(0, 1)$ tensor as:

$$ T(f) = f(v) $$
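
To see this identification numerically, here is a minimal sketch in Python, assuming $V = \mathbb{R}^3$ and representing vectors and dual vectors by component arrays (the names `v`, `T_v`, `f`, and `g` are illustrative):

```python
import numpy as np

# A vector v in V = R^3, represented by its components.
v = np.array([1.0, 2.0, 3.0])

# The (0, 1) tensor induced by v: it takes a dual vector f
# (represented by components, acting via the dot product)
# and returns the number f(v).
def T_v(f):
    return f @ v  # f(v) = sum_i f_i * v_i

# Two dual vectors and two scalars, to check linearity in f.
f = np.array([1.0, 0.0, -1.0])
g = np.array([0.0, 2.0, 1.0])
a, b = 2.0, -3.0

# T_v is linear: T_v(a*f + b*g) == a*T_v(f) + b*T_v(g).
assert np.isclose(T_v(a * f + b * g), a * T_v(f) + b * T_v(g))
print(T_v(f))  # f(v) = 1*1 + 0*2 + (-1)*3 = -2.0
```

Since the map $v \mapsto T$ is linear and injective, it identifies $V$ with $T^{(0, 1)}(V)$, which is why vectors can be regarded as $(0, 1)$ tensors.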
