The multiplication of a tensor by a matrix (or by a vector) is called the $n$-mode product.
Let $\mathcal{T} \in \mathbb{R}^{I_1 \times I_2 \times \cdots \times I_N}$ be an $N$th-order tensor and $\mathbf{M} \in \mathbb{R}^{J \times I_n}$ a matrix. The $n$-mode product is defined as
$$ (\mathcal{T} \times_{n} \mathbf{M})_{i_{1}\cdots i_{n-1}ji_{n+1}\cdots i_N} = \sum \limits_{i_n = 1}^{I_n} \mathcal{T}_{i_{1}i_{2}\cdots i_{n}\cdots i_{N}}\mathbf{M}_{ji_{n}}.$$
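This definition translates directly into an `einsum` contraction. A minimal NumPy sketch for a third-order tensor and $n = 2$ (the shapes below are arbitrary illustrative choices):

```python
import numpy as np

# Sketch of the definition for a third-order tensor and n = 2
# (0-based axis 1): (T x_2 M)_{i1, j, i3} = sum_{i2} T_{i1, i2, i3} M_{j, i2}.
rng = np.random.default_rng(0)
T = rng.standard_normal((2, 3, 4))   # I1 = 2, I2 = 3, I3 = 4
M = rng.standard_normal((5, 3))      # J = 5 rows, I2 = 3 columns

Y = np.einsum('abc,jb->ajc', T, M)   # contract over the mode-2 index
print(Y.shape)                        # (2, 5, 4): mode 2 replaced by J
```

The mode-2 dimension $I_2 = 3$ has been replaced by $J = 5$, exactly as the index formula prescribes.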
Note that this is not a standard product like matrix multiplication. However, you can matricize the tensor along its $n$th mode (dimension $n$) and then carry out a standard matrix multiplication.
The $n$-mode matricization of $\mathcal{T}$, denoted $\mathbf{T}_{(n)}$, is an $I_{n} \times I_{1}\cdots I_{n-1}I_{n+1}\cdots I_{N}$ matrix representation of $\mathcal{T}$. In other words, it is simply a matrix layout of all the entries of $\mathcal{T}.$
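In NumPy, matricization is a `moveaxis` followed by a `reshape`. A small sketch (column-ordering conventions differ across the literature; here the remaining modes simply keep their original order):

```python
import numpy as np

def unfold(T, n):
    """Mode-n matricization: an I_n x (product of the other I_m) matrix.

    Row i collects all entries of T whose mode-n index equals i.
    (Column-ordering conventions vary; here the remaining modes
    keep their original order.)
    """
    return np.moveaxis(T, n, 0).reshape(T.shape[n], -1)

T = np.arange(24).reshape(2, 3, 4)
print(unfold(T, 1).shape)   # (3, 8): I_2 = 3 rows, I_1 * I_3 = 8 columns
```

Row $i_2$ of `unfold(T, 1)` is the slice `T[:, i2, :]` flattened, so no entries are lost or duplicated.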
Hence, the multiplications below are equivalent:
$$\mathcal{Y} = \mathcal{T} \times_{n} \mathbf{M} \iff \mathbf{Y}_{(n)} = \mathbf{M} \mathbf{T}_{(n)}, $$ where $\mathbf{Y}_{(n)}$ is the $n$-mode matricization of the tensor $\mathcal{Y} \in \mathbb{R}^{I_1 \times \cdots \times I_{n-1} \times J \times I_{n+1} \times \cdots \times I_N}$.
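This equivalence is easy to verify numerically, using a simple matricization helper (the helper keeps the remaining modes in their original order; any consistent column ordering gives the same identity):

```python
import numpy as np

def unfold(T, n):
    # mode-n matricization (remaining modes keep their original order)
    return np.moveaxis(T, n, 0).reshape(T.shape[n], -1)

rng = np.random.default_rng(1)
T = rng.standard_normal((2, 3, 4))
M = rng.standard_normal((5, 3))

Y = np.einsum('abc,jb->ajc', T, M)                    # Y = T x_2 M, from the definition
assert np.allclose(unfold(Y, 1), M @ unfold(T, 1))    # Y_(2) = M T_(2)
```

The $n$-mode product thus reduces to an ordinary matrix product on the unfolded tensor, which is how it is typically implemented.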
For more details, see *Tensor Decompositions and Applications* by Kolda and Bader.
Index notation illustrates the problem quite well.
Denoting $\frac{\partial}{\partial x_n}$ by $\partial_n$, we have
$$\eqalign{
C_{ik} &= A_{ij}B_{jk} \cr
\partial_nC_{ik} &= \big(\partial_n A_{ij}\big)B_{jk} + A_{ij}\big({\partial_n B_{jk}}\big) \cr
}$$
The first term is fine: the contraction over the $j$-index can still be written as a matrix product. But in the second term
the contracted $j$-index is sandwiched between the $n$ and $k$ indices. Since that term is a third-order tensor, there is no transpose-like operation that can repair it. This is unlike the following case.
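A numeric sketch of this situation, taking $A(x)$ and $B(x)$ linear in $x$ purely for illustration (the coefficient tensors `dA`, `dB` below are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
dA = rng.standard_normal((2, 3, 5))   # dA[i, j, n] = partial_n A_ij (constant here)
dB = rng.standard_normal((3, 4, 5))   # dB[j, k, n] = partial_n B_jk (constant here)
x = rng.standard_normal(5)

A = np.einsum('ijn,n->ij', dA, x)
B = np.einsum('jkn,n->jk', dB, x)

# Product rule, written exactly as in the index expression above:
grad_C = np.einsum('ijn,jk->ikn', dA, B) + np.einsum('ij,jkn->ikn', A, dB)

# Finite-difference check of the slice partial_0 C_ik
h = 1e-6
e0 = np.zeros(5); e0[0] = h
Ap = np.einsum('ijn,n->ij', dA, x + e0)
Bp = np.einsum('jkn,n->jk', dB, x + e0)
fd = ((Ap @ Bp) - (A @ B)) / h
assert np.allclose(grad_C[:, :, 0], fd, atol=1e-4)
print(grad_C.shape)   # (2, 4, 5): a third-order object, not a matrix
```

The gradient is a genuinely three-index array, which is why no rearrangement into ordinary matrix products exists.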
In the case where $(A,B)$ are vectors, simply omit the $(i,k)$ indices to obtain
$$\eqalign{
C &= A_{j}B_{j} \cr
\partial_nC &= \big(\partial_n A_{j}\big)B_{j} + A_{j}\big({\partial_n B_{j}}\big) \cr
}$$
This expression can be fixed by transposing the second term, since the gradients in parentheses are now just Jacobian matrices.
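In matrix notation the fixed expression reads $\nabla C = J_A^{T}B + J_B^{T}A$, with $J_A$, $J_B$ the Jacobians of $A$ and $B$. A quick check, again taking $A(x)$ and $B(x)$ linear for simplicity (the matrices below are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(3)
JA = rng.standard_normal((3, 5))   # (J_A)_{jn} = partial_n A_j
JB = rng.standard_normal((3, 5))   # (J_B)_{jn} = partial_n B_j
x = rng.standard_normal(5)

A, B = JA @ x, JB @ x
grad_C = JA.T @ B + JB.T @ A       # the transposed-second-term fix

# Finite-difference check, one coordinate direction at a time
h = 1e-6
fd = np.array([((JA @ (x + h*e)) @ (JB @ (x + h*e)) - A @ B) / h
               for e in np.eye(5)])
assert np.allclose(grad_C, fd, atol=1e-4)
```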
Generalizing in the other direction, if $X$ is a matrix, write $\frac{\partial}{\partial X_{nm}}=\partial_{nm}$; appending the $m$-index gives
$$\eqalign{
\partial_{nm}C_{ik} &= \big(\partial_{nm} A_{ij}\big)B_{jk} + A_{ij}\big({\partial_{nm} B_{jk}}\big) \cr
}$$
Now the gradients in parentheses are fourth-order tensors, and once again there is no simple operation that will re-order the indices.
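A concrete fourth-order instance (the choice $C(X)=XX$ is just for illustration): the index calculation gives $\partial_{nm} C_{ik} = \delta_{in} X_{mk} + X_{in}\delta_{km}$, which can be checked numerically.

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.standard_normal((3, 3))
I = np.eye(3)

# grad[i, k, n, m] = delta_{in} X_{mk} + X_{in} delta_{km}  for C(X) = X X
grad = np.einsum('in,mk->iknm', I, X) + np.einsum('in,km->iknm', X, I)
print(grad.shape)   # (3, 3, 3, 3): a fourth-order tensor

# Finite-difference check of the (n, m) = (1, 2) slice
h = 1e-6
E = np.zeros((3, 3)); E[1, 2] = h
fd = ((X + E) @ (X + E) - X @ X) / h
assert np.allclose(grad[:, :, 1, 2], fd, atol=1e-4)
```

Even in this simple case the result has four free indices, so there is no matrix rearrangement of the product-rule terms.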
The simplest way to avoid the problem is to use differentials instead of gradients.
$$\eqalign{
C &= A\star B \cr
dC &= dA\star B + A\star dB \cr
}$$
where $(A,B,C)$ can be scalars, vectors, or tensors, and $(\star)$ can be any kind of product (Kronecker, Hadamard, Frobenius, tensor, matrix) which is compatible with their given dimensions.
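For the matrix product as the choice of $\star$, the differential identity is a first-order statement that is easy to verify: $C(A+dA,\,B+dB)-C(A,B) = dA\,B + A\,dB + dA\,dB$, and the last term is second order in the perturbation. A sketch with arbitrary small perturbations:

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((4, 2))
eps = 1e-6
dA = eps * rng.standard_normal((3, 4))   # small perturbations
dB = eps * rng.standard_normal((4, 2))

delta = (A + dA) @ (B + dB) - A @ B
# The differential dA B + A dB matches delta up to the O(eps^2) term dA dB
assert np.allclose(delta, dA @ B + A @ dB, atol=1e-9)
```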
Best Answer
Use Leibniz's rule $$(M_{ij}N_{jk})_{,r}=M_{ij,r}N_{jk}+M_{ij}N_{jk,r},$$ where the subscript ${}_{,r}$ denotes partial differentiation with respect to the $r$th variable. Writing matrix multiplication in index form as $M_{ij}N_{jk}$ lets the product rule be applied component by component.