What does the dot product of a tensor and a vector represent?

inner-products, linear-algebra, tensors, vectors

The dot product (or inner product) of a tensor $\mathbf{T}$ and a vector $\mathbf{a}$ produces a vector $\mathbf{b} = \mathbf{T} \cdot \mathbf{a}$:

$$
b_i = T_{ij}a_j =
\begin{pmatrix}
T_{11} a_1 + T_{12} a_2 + T_{13} a_3\\
T_{21} a_1 + T_{22} a_2 + T_{23} a_3\\
T_{31} a_1 + T_{32} a_2 + T_{33} a_3
\end{pmatrix}
$$
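In components this is just a matrix-vector product. A minimal sketch in plain Python (the $3\times 3$ entries here are arbitrary illustrative values):

```python
# Sketch: b_i = T_ij a_j is the matrix-vector product of T and a.
T = [[1, 2, 3],
     [4, 5, 6],
     [7, 8, 10]]
a = [1, 0, 2]

# Each b_i sums T_i1*a_1 + T_i2*a_2 + T_i3*a_3 (summation over j).
b = [sum(T[i][j] * a[j] for j in range(3)) for i in range(3)]
print(b)  # [7, 16, 27]
```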

The operation is non-commutative when $\mathbf{T}$ is non-symmetric: dotting the vector in from the left generally produces a different vector $\mathbf{c}$,
$$\mathbf{c} = \mathbf{a} \cdot \mathbf{T} = \mathbf{T}^\mathrm{T} \cdot \mathbf{a}.$$
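The identity $\mathbf{a} \cdot \mathbf{T} = \mathbf{T}^\mathrm{T} \cdot \mathbf{a}$ can be checked numerically. A small sketch with an arbitrarily chosen non-symmetric $\mathbf{T}$:

```python
# Sketch: for non-symmetric T, a . T differs from T . a,
# and a . T equals T^T . a (indices summed explicitly).
T = [[0, 1, 0],
     [2, 0, 0],
     [0, 0, 3]]
a = [1, 1, 1]
n = 3

Ta   = [sum(T[i][j] * a[j] for j in range(n)) for i in range(n)]  # (T . a)_i = T_ij a_j
aT   = [sum(a[i] * T[i][j] for i in range(n)) for j in range(n)]  # (a . T)_j = a_i T_ij
TtA  = [sum(T[j][i] * a[j] for j in range(n)) for i in range(n)]  # (T^T . a)_i = T_ji a_j

print(Ta)   # [1, 2, 3]
print(aT)   # [2, 1, 3] -- differs from T . a since T is non-symmetric
print(TtA)  # [2, 1, 3] -- equals a . T
```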

I have seen the use of this operation in calculations of viscous forces on a body within the computational fluid dynamics context.

The meaning of the dot product of two vectors has been well explained elsewhere.

But, what is the meaning of the dot product of a tensor and a vector, if there is any?

Best Answer

If you compare your formula for $T_{ij}a_j$ to that of matrix multiplication of a matrix $A$ with $ij$ entry $A_{ij}$ and a vector $v=(v_1,...,v_n)$, you will see that $T\cdot a$ is just matrix multiplication of the matrix $T$ with entries $T_{ij}$ by the vector $a=(a_1,...,a_n)$. So the short answer to your question is 'it represents everything matrix multiplication represents'. To give a longer answer we can try to explain why we get this formula for $T a$.

Given a vector space $V$, an "$n$-tensor" refers to an element of the vector space $\otimes_{i=1}^n V$, where $\otimes$ is the tensor product. Then a vector $v \in V$ is a 1-tensor, and what you've called $T$ is a 2-tensor.

The dot product allows us to identify an element of $V$ with a linear map from $V\to \mathbb{R}$, because from $v\in V$ we can define the linear map $L:V\to\mathbb{R}$ by $L(w) = v\cdot w$ for any $w\in V$. This gives us an operation we can apply on pairs of vectors.
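This identification $v \mapsto L_v$ with $L_v(w) = v \cdot w$ can be sketched in plain Python; the vectors below are hypothetical values chosen only to check linearity:

```python
# Sketch: a vector v determines the linear map L(w) = v . w.
def dot(u, w):
    return sum(x * y for x, y in zip(u, w))

v = [1, 2, 3]
L = lambda w: dot(v, w)

# Linearity check: L(2*w1 + w2) == 2*L(w1) + L(w2).
w1, w2 = [1, 0, 1], [0, 1, 0]
lhs = L([2 * x + y for x, y in zip(w1, w2)])
rhs = 2 * L(w1) + L(w2)
print(lhs == rhs)  # True
```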

Using this fact we can identify the space of 2-tensors, $V\otimes V$, with the space of linear maps $V\to V$ by sending a pure 2-tensor $a\otimes b$ to the linear map $L_{ab}$ taking $v\mapsto(v\cdot b)\, a$, and extending linearly. We get matrices from linear maps by writing the map in a basis, so if we pick a basis $\{e_i\}_{i=1}^{\dim V}$ for $V$, the pairs $\{e_i\otimes e_j\}$ form a basis for $V\otimes V$. Then, as we learned in our linear algebra classes, the $ij$th entry of the matrix representing $T:V\to V$ in the $e_i$ basis is $T_{ij} = e_i \cdot (Te_j).$

So in the end we see that the dot product identifies the tensor $T$ with a linear map $V\to V$, and in a basis we recover the formula for matrix multiplication.
