[Math] Dot product of the vector r and a second-order tensor

linear-algebra, tensors

The position vector $\vec r$ is given as $\vec r=x_i\hat e_i$ and the second-order tensor $\overline{\overline{T}}$ is given as $$\overline{\overline{T}}=\frac{\delta_{ij}\hat e_i\hat e_j}{r}+\frac{x_ix_j \hat e_i\hat e_j}{r^3}.$$

How can I determine the dot product $\vec r\cdot\overline{\overline{T}}$ using index-notation rules?
I would appreciate any suggestions, as I don't know whether I am allowed to dot the vector into the two parts of the tensor separately, i.e. whether $\vec r\cdot\overline{\overline{T}}=\vec r\cdot\overline{\overline{T_1}}+\vec r\cdot\overline{\overline{T_2}}$ holds.

Best Answer

If a rank 2 tensor $T$ is thought of as a matrix $M$ (with respect to some standard basis), then $T\cdot v$, for a column vector $v$, is simply $Mv$. However, $v\cdot T$ is $v^tM$, with $t$ denoting the transpose.
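For instance, with a concrete non-symmetric choice of $M$ and $v$ (an illustrative example, not taken from the question): $$M=\begin{pmatrix}1&2\\3&4\end{pmatrix},\quad v=\begin{pmatrix}1\\0\end{pmatrix},\quad Mv=\begin{pmatrix}1\\3\end{pmatrix},\quad v^tM=\begin{pmatrix}1&2\end{pmatrix},$$ so $T\cdot v$ picks out the first column of $M$ while $v\cdot T$ picks out the first row; the two differ whenever $M$ is not symmetric.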

In your index notation, if $T=T_{ij}e_ie_j$ and $r=x_ke_k$, then $$ \begin{aligned} r\cdot T &= x_kT_{ij} (e_k\cdot e_i)e_j = x_kT_{ij}\delta_{ik}e_j = x_iT_{ij}e_j\\ T\cdot r &= x_kT_{ij} e_i(e_j\cdot e_k) = x_kT_{ij}\delta_{jk}e_i = T_{ij}x_je_i \end{aligned} $$ In general, it matters whether you dot from the left or from the right. However, if your rank 2 tensor is symmetric, i.e. $T_{ij}=T_{ji}$, then left and right give the same result. Also, since the dot product is linear, you may indeed dot $\vec r$ into the two parts of the tensor separately and add the results. Your tensor $$T=\frac{r^2\delta_{ij}+x_ix_j}{r^3}e_ie_j$$ is symmetric.
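Carrying this out explicitly for the tensor in the question (using $x_ix_i=r^2$): $$\vec r\cdot\overline{\overline{T}} = x_k\delta_{ki}\,\frac{r^2\delta_{ij}+x_ix_j}{r^3}\,e_j = \frac{r^2x_j+(x_ix_i)x_j}{r^3}\,e_j = \frac{2x_j}{r}\,e_j = \frac{2}{r}\,\vec r,$$ and by symmetry $\overline{\overline{T}}\cdot\vec r$ gives the same vector.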

For a rank $n$ tensor $T$, the situation is even more complicated, because the notion of $T\cdot v$ then needs extra clarification. It is a good idea to write $T\cdot_m v$, meaning the dot product is taken over the $m$th slot. Better yet, avoid dot products in this form altogether: either stick to indices from start to finish, or use the relevant abstract notations for tensors.
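As a sketch of this $\cdot_m$ notation, for a rank 3 tensor $T=T_{ijk}e_ie_je_k$ and $v=v_le_l$ the three possible contractions are $$T\cdot_1 v = v_iT_{ijk}\,e_je_k,\qquad T\cdot_2 v = T_{ijk}v_j\,e_ie_k,\qquad T\cdot_3 v = T_{ijk}v_k\,e_ie_j,$$ which are in general three different rank 2 tensors.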
