I heard that tensors are a generalization of scalars, vectors, and matrices. But tensors don’t look like matrices at all.

Tags: matrices, multilinear-algebra, tensors

I am reading "Analysis on Manifolds" by James R. Munkres.

Definition:
Let $V$ be a vector space. Let $V^k = V \times \cdots \times V$ denote the set of all $k$-tuples $(v_1, \cdots, v_k)$ of vectors of $V$. A function $f : V^k \to \mathbb{R}$ is said to be linear in the $i$th variable if, given fixed vectors $v_j$ for $j \ne i$, the function $T : V \to \mathbb{R}$ defined by $$T(v) = f(v_1, \cdots, v_{i-1}, v, v_{i+1}, \cdots, v_k)$$ is linear. The function $f$ is said to be multilinear if it is linear in the $i$th variable for each $i$. Such a function $f$ is also called a $k$-tensor, or a tensor of order $k$, on $V$.

This is the definition of tensors.

I heard that tensors are a generalization of scalars, vectors, and matrices. But tensors don't look like scalars, vectors, and matrices at all.

For example, please show me a tensor which corresponds to a matrix.

Best Answer

One way to think about what a tensor does is to ask how many arguments it must accept before it returns a scalar. The definition you quote describes a function which is linear in each of its $k$ arguments separately; when you feed it $k$ vectors, you get back a number. Often it is convenient to distinguish between row vectors and column vectors, for example when no natural inner product is present. In that case, instead of simply speaking of $k$-tensors, we speak of $(i,j)$-tensors, where the tensor accepts $i$ row vectors and $j$ column vectors as arguments.
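As a concrete sketch of "feed it $k$ vectors, get back a number" (this example is my own, not from Munkres): the determinant, viewed as a function of the columns of a $3 \times 3$ matrix, is a $3$-tensor on $\mathbb{R}^3$, and we can check linearity in one slot numerically.

```python
import numpy as np

# A 3-tensor on R^3: f(v1, v2, v3) = det[v1 | v2 | v3].
# It accepts three vectors and returns a scalar, and it is
# linear in each argument separately (with the others held fixed).
def f(v1, v2, v3):
    return np.linalg.det(np.column_stack([v1, v2, v3]))

rng = np.random.default_rng(0)
v1, v2, v3, u = rng.standard_normal((4, 3))
a, b = 2.0, -3.0

# Linearity in the first slot:
# f(a*v1 + b*u, v2, v3) == a*f(v1, v2, v3) + b*f(u, v2, v3)
lhs = f(a * v1 + b * u, v2, v3)
rhs = a * f(v1, v2, v3) + b * f(u, v2, v3)
assert np.isclose(lhs, rhs)
```

The same check with the second or third slot varied would pass as well, which is exactly what "multilinear" demands.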

A matrix can be regarded as a $2$-tensor, or more specifically a $(1,1)$ tensor. If $M$ is the matrix and $v, w$ are vectors, then $M$ accepts two arguments: you write $v^T M w$, and this evaluates to a number.
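Here is a minimal numerical sketch of a matrix acting as a $(1,1)$ tensor (the specific values of $M$, $v$, $w$ are just an illustration):

```python
import numpy as np

# A matrix M viewed as a (1,1) tensor: it accepts one row vector
# and one column vector and returns the scalar v^T M w.
M = np.array([[1.0, 2.0],
              [3.0, 4.0]])
v = np.array([1.0, -1.0])   # plays the role of the row vector v^T
w = np.array([2.0, 5.0])    # the column vector

value = v @ M @ w           # v^T M w, a scalar
print(value)                # -14.0
```

Checking that $v^T M w$ is linear in $v$ (with $w$ fixed) and in $w$ (with $v$ fixed) confirms that this really is the bilinear function the definition asks for.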

Similarly, a vector $v$ is a $(1,0)$ tensor, since multiplying it on the left by a row vector yields a number.

If you do have an inner product, the process of applying it to turn vectors into covectors, or conversely, is called lowering or raising of indices.
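A small sketch of lowering and raising an index, with a diagonal matrix $g$ standing in for the inner product (the choice of $g$, $v$, $w$ here is hypothetical):

```python
import numpy as np

# A symmetric positive-definite matrix g plays the role of the
# inner product (metric). Lowering the index of a vector v gives
# the covector v_flat = g v, so that v_flat^T w = v^T g w = <v, w>.
g = np.array([[2.0, 0.0],
              [0.0, 3.0]])   # an example metric on R^2
v = np.array([1.0, 2.0])
w = np.array([4.0, -1.0])

v_flat = g @ v               # lowered index: the covector of v
assert np.isclose(v_flat @ w, v @ g @ w)

# Raising the index (applying the inverse metric) recovers v.
v_back = np.linalg.solve(g, v_flat)
assert np.allclose(v_back, v)
```

So the inner product is exactly the dictionary that converts between the "row" and "column" arguments of the $(i,j)$ bookkeeping above.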