[Math] Tensors = matrices + covariance/contravariance

matrices, multilinear-algebra, tensors

I have read several topics on tensors, but it is still not clear to me. Tensors are different from matrices because they contain additional information about how they transform.

To fully specify an $n\times n$ matrix, one needs to specify its $n^2$ components.

My question is: what does one need to specify to fully define a tensor?
Is it true to say that, to fully define a rank-2 tensor, one needs to provide its $n^2$ components plus whether the tensor is covariant or contravariant along each dimension? (Or is there something else?)

Best Answer

Usually, a matrix is thought of as a representation of a linear operator: a map that takes a vector and spits out another vector. Say $A$ is some linear operator and $v$ is some vector; then $A(v)$ is the output vector.

An equivalent way of looking at it, however, is to say that there is a map $B$ that takes two vectors $v, w$ and spits out a scalar, given by $B(v,w) = A(v) \cdot w$, say. Such a map is what is usually described in the literature when talking about tensors.
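To make this concrete, here is a small worked example (using the standard dot product on $\mathbb{R}^2$, purely for illustration): in components, $B(v,w) = A(v)\cdot w = \sum_{i,j} A_{ij}\, v_j\, w_i$, so with
$$A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}, \qquad v = (1,0), \qquad w = (0,1),$$
we get $A(v) = (1,3)$ and hence $B(v,w) = (1,3)\cdot(0,1) = 3$.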

Where do contravariance and covariance come in? Well, the above idea of a tensor is actually a bit of a cheat: it assumed we had an inner product (the dot product) on hand. There might not be one, and without it we cannot freely convert between vectors and covectors. So instead of saying that $B$ takes two vectors as arguments, let $B$ be a map taking one vector $v$ and a covector $\alpha$ instead, so that $B(v, \alpha) = \alpha(A(v))$.
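In components (upper indices for vectors, lower indices for covectors, as is conventional), this reads
$$B(v,\alpha) = \alpha(A(v)) = \alpha_j\, {A^j}_i\, v^i,$$
so the same array of numbers ${A^j}_i$ is still being used; what has changed is which kinds of arguments it is understood to accept.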

(You'll note that, if there is a way to convert from vectors to covectors, then any tensor acting on $p$ vectors and $q$ covectors could be converted to one that acts on $p+q$ vectors, for instance.)
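As a sketch of that conversion: if an inner product (metric) $g$ is available, a covector argument can be traded for a vector argument by feeding in $g(w,\cdot)$ instead of $\alpha$, which in components amounts to "lowering an index":
$$\tilde B(v,w) = B\bigl(v,\, g(w,\cdot)\bigr), \qquad \tilde B_{ij} = {B_i}^k\, g_{kj}.$$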

A general tensor could take any number of vector or covector arguments, in any combination.
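In one common convention, a tensor of type $(p,q)$ takes $p$ covector arguments (contravariant indices) and $q$ vector arguments (covariant indices): a vector is type $(1,0)$, a covector is type $(0,1)$, an inner product is type $(0,2)$, and the operator-style tensor $B$ above is type $(1,1)$.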

In physics, it's common to look at the components of a tensor with respect to some basis: rather than supplying whatever vectors or covectors might be relevant to a problem, we supply a set of basis vectors and covectors instead, so we need only remember the coefficients. If $e_i$ is the $i$th basis vector and $e^j$ is the $j$th basis covector, then $B(e_i, e^j) = {B_i}^j$ takes us from the more math-inclined definition of a tensor to the form more familiar to a physicist.
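For completeness, this is where the transformation behaviour mentioned in the question becomes explicit (a standard computation, sketched here). Under a change of basis $\tilde e_i = {\Lambda^k}_i\, e_k$, the dual basis transforms with the inverse, $\tilde e^j = {(\Lambda^{-1})^j}_l\, e^l$, and so
$${\tilde B_i}{}^{j} = B(\tilde e_i, \tilde e^j) = {\Lambda^k}_i\, {(\Lambda^{-1})^j}_l\, {B_k}^{l},$$
i.e. one factor of $\Lambda$ for each covariant (lower) index and one factor of $\Lambda^{-1}$ for each contravariant (upper) index. That declaration of which slots are covariant and which are contravariant is exactly the extra information, beyond the $n^2$ components, that the question asks about.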
