Divergence of tensor times vector equals divergence of vector times tensor

calculus, tensors, vector analysis

Does the following equation hold? Here $\vec T := \vec T(\vec x)$ is a tensor field and $\vec a := \vec a(\vec x)$ a vector field:

$(\text{div}\, \vec T) \cdot \vec a = \text{div}(\vec a \cdot \vec T)$

I think it does not hold, but that would mean I made a mistake in the following derivation (notation: $T_{ij,j} := \frac{\partial T_{ij}}{\partial x_j}$):

$(\text{div}\, \vec T) \cdot \vec a = T_{ij,j} \vec e_i \cdot a_k \vec e_k = T_{ij,j}a_k\delta_{ik} = T_{ij,j}a_i$

$\text{div}(\vec a \cdot \vec T) = \text{div}(a_jT_{ji}\vec e_i) = T_{ji,i}a_j$

Is there an error or is the above equation true?

Best Answer

You need to use the Leibniz (product) rule in the second case:

$\text{div}(a_jT_{ji}\vec e_i) = (a_jT_{ji})_{,i} = a_{j,i}T_{ji} + a_jT_{ji,i}.$
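
The extra term $a_{j,i}T_{ji}$ vanishes when $\vec a$ is constant, so the proposed identity holds in that case (and more generally whenever $a_{j,i}T_{ji} = 0$), but not for an arbitrary vector field $\vec a$.

Below is a minimal symbolic sketch of this in SymPy, assuming generic 2D component functions $T_{ij}(x_1, x_2)$ and $a_i(x_1, x_2)$ (the 2D setting and the variable names are illustrative choices, not from the original post). It checks that $\text{div}(\vec a \cdot \vec T) - (\text{div}\,\vec T)\cdot\vec a$ is exactly the extra Leibniz term $a_{j,i}T_{ji}$:

```python
import sympy as sp

x = sp.symbols('x1 x2')

# Generic smooth components T_ij(x1, x2) and a_i(x1, x2)
T = sp.Matrix(2, 2, lambda i, j: sp.Function(f'T{i}{j}')(*x))
a = sp.Matrix(2, 1, lambda i, _: sp.Function(f'a{i}')(*x))

# (div T)_i = T_ij,j  (contraction over the second index, as in the question)
divT = sp.Matrix(2, 1, lambda i, _: sum(sp.diff(T[i, j], x[j]) for j in range(2)))

# Left-hand side: (div T) . a = T_ij,j a_i
lhs = sum(divT[i] * a[i] for i in range(2))

# Right-hand side: div(a . T), with (a . T)_i = a_j T_ji
aT = sp.Matrix(2, 1, lambda i, _: sum(a[j] * T[j, i] for j in range(2)))
rhs = sum(sp.diff(aT[i], x[i]) for i in range(2))

# Extra Leibniz term a_j,i T_ji
extra = sum(sp.diff(a[j], x[i]) * T[j, i] for j in range(2) for i in range(2))

print(sp.simplify(rhs - lhs - extra))  # prints 0
```

Substituting constant functions for the $a_i$ makes `extra` vanish, in which case the two sides agree.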