[Math] Interpretation of $(r,s)$ tensor

differential-geometry, linear-algebra, tensor-rank, tensors, vector-spaces

A tensor of type $(r,s)$ on a vector space $V$ is a $C$-valued function $T$ on $V \times V \times \dots \times V \times W \times W \times \dots \times W$ (with $r$ copies of $V$ and $s$ copies of $W$, where $W$ is the dual space of $V$) which is linear in each argument. By convention, we take $(0,0)$-tensors to be scalars. The interpretation of $(r,0)$-tensors is immediate, since they are by definition multilinear functionals (in the special case $r = 1$, a $(1,0)$-tensor is a covector, i.e. an element of the dual space).

We can interpret a $(1,1)$-tensor as follows: $A(v, f) \equiv f(Av)$. Say we have a linear operator $R$; then we can turn $R$ into a second-rank tensor $T$ by $T(v, w) \equiv v \cdot Rw$, where $\cdot$ denotes the usual dot product of vectors. If we compute the components of $T$, we find that they are the same as the components of the linear operator $R$.

So far so good. But I can't understand the interpretations of the other $(r,s)$-tensors. For example, I found on Wikipedia that a $(0,1)$-tensor is interpreted as a vector, a $(0,2)$-tensor as a bivector, and in general a $(0,s)$-tensor as an $s$-vector; a $(2,1)$-tensor is interpreted as the cross product, and so on. How are tensors interpreted in general? Is it possible to show these interpretations as I did for the $(1,1)$-tensor?
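The component computation described for the $(1,1)$ case can be checked numerically. This is a minimal sketch: the operator $R$ on $\mathbb{R}^3$ below is an arbitrary illustrative choice, not taken from the question.

```python
import numpy as np

# An arbitrary linear operator R on R^3 (illustrative choice)
R = np.array([[1.0, 2.0, 0.0],
              [0.0, 3.0, 1.0],
              [4.0, 0.0, 5.0]])

# The induced second-rank tensor T(v, w) = v . (R w)
def T(v, w):
    return v @ (R @ w)

# Components of T in the standard basis: T_ij = T(e_i, e_j)
e = np.eye(3)
T_components = np.array([[T(e[i], e[j]) for j in range(3)]
                         for i in range(3)])

# The components of the tensor T agree with those of the operator R
assert np.allclose(T_components, R)
```

Since $T(e_i, e_j) = e_i \cdot R e_j = R_{ij}$, the assertion holds for any choice of $R$.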

Best Answer

Any alternating $(r,s)$-tensor has a corresponding map $\Lambda^r V \to \Lambda^s V$. Suppose $R \in \Lambda^r V$ and $\Sigma \in \Lambda^s V^*$. Then define $\underline T:\Lambda^r V \to \Lambda^s V$ by

$$T(R, \Sigma) = \Sigma[ \underline T(R)]$$

The uniqueness of $\underline T$ can be proved by taking a "gradient" with respect to the vector space of $\Lambda^s V^*$.
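In the simplest case $r = s = 1$ (where no antisymmetrization is involved), the correspondence and its uniqueness can be checked numerically: the components of $\underline T$ are recovered by evaluating $T$ on basis vectors and dual-basis covectors. A sketch, where the matrix `M` and the helper names are illustrative assumptions:

```python
import numpy as np

# Matrix of the underlying linear map (the "underline T"); arbitrary choice
rng = np.random.default_rng(0)
M = rng.standard_normal((3, 3))

def T(v, f):
    # The (1,1)-tensor determined by M: T(v, f) = f(M v)
    return f @ (M @ v)

# Recover the components of the underlying map by evaluating T on the
# basis vectors e_j and dual-basis covectors e^i: component (i, j) is T(e_j, e^i)
e = np.eye(3)
recovered = np.array([[T(e[j], e[i]) for j in range(3)]
                      for i in range(3)])

# Uniqueness: the recovered components match M exactly
assert np.allclose(recovered, M)
```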

Geometrically, $\underline T$ maps an $r$-vector (which corresponds to an $r$-dimensional subspace) to an $s$-vector, and the $s$-covector $\Sigma$ allows us to extract the components of $\underline T(R)$.
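The cross product mentioned in the question fits this picture with $r = 2$, $s = 1$: the tensor $T(u, v, f) = f(u \times v)$ is alternating in $(u, v)$, so it induces a map sending the $2$-vector $u \wedge v$ to the vector $u \times v$, and the covector slot extracts its components. A numerical sketch (the variable names are illustrative):

```python
import numpy as np

# The cross product as a (2,1)-tensor: T(u, v, f) = f(u x v)
def T(u, v, f):
    return f @ np.cross(u, v)

e = np.eye(3)

# Components T_ij^k = T(e_i, e_j, e^k) form the Levi-Civita symbol
eps = np.array([[[T(e[i], e[j], e[k]) for k in range(3)]
                 for j in range(3)] for i in range(3)])

# Alternating in the two vector slots
assert np.allclose(eps, -eps.transpose(1, 0, 2))

# The induced map sends the 2-vector u ^ v to the vector u x v;
# the covector slot reads off its components
u, v = np.array([1.0, 2.0, 0.0]), np.array([0.0, 1.0, 3.0])
assert np.allclose([T(u, v, e[k]) for k in range(3)], np.cross(u, v))
```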
