[Math] Gradient of matrix-vector product

vector analysis

Is there an identity for the gradient of the product of a matrix and a vector, analogous to the divergence identity, that would go something like this:

$\nabla(\textbf{M}.\textbf{c})= \nabla(\textbf{M}).\textbf{c}\ +\ … (\text{not necessarily like this}) $,

where $\textbf{M}$ is an $n\times n$ matrix and $\textbf{c}$ is an $n\times 1$ matrix (a column vector)?

Best Answer

Let us write the matrix-vector product ${\bf M}\cdot {\bf c}$ in index notation (Einstein summation convention). By the product rule, the gradient of $({\bf M}\cdot {\bf c})_{i} = M_{ij} c_j$ satisfies
$$ \left(\nabla({\bf M}\cdot {\bf c})\right)_{ik} = M_{ij,k} c_j + M_{ij} c_{j,k} = \left({\bf c}\cdot\nabla({\bf M}^\top) + {\bf M}\cdot \nabla {\bf c}\right)_{ik} \, . $$

Similarly, one shows that the vector-matrix product $({\bf c}\cdot {\bf M})_{j} = c_i M_{ij}$ satisfies
$$ \left(\nabla({\bf c}\cdot{\bf M})\right)_{jk} = c_{i,k} M_{ij} + c_i M_{ij,k} = \left({\bf M}^\top\!\cdot\nabla {\bf c} + {\bf c}\cdot \nabla{\bf M}\right)_{jk} \, . $$

Therefore,
$$\begin{aligned} \nabla({\bf M}\cdot {\bf c}) &= {\bf c}\cdot\nabla({\bf M}^\top) + {\bf M}\cdot \nabla {\bf c} \, , \\ \nabla({\bf c}\cdot {\bf M}) &= {\bf M}^\top\!\cdot\nabla {\bf c} + {\bf c}\cdot \nabla{\bf M} \, . \end{aligned}$$
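
As a quick numerical sanity check of both identities, here is a minimal NumPy sketch. The particular fields ${\bf M}(x)$ and ${\bf c}(x)$ (linear in $x$, built from random coefficients `A` and `B`) and the helper `grad` are illustrative choices, not part of the answer above; the point is only that the finite-difference gradient of ${\bf M}\cdot{\bf c}$ matches $M_{ij,k} c_j + M_{ij} c_{j,k}$ componentwise, and likewise for ${\bf c}\cdot{\bf M}$.

```python
import numpy as np

n = 3
rng = np.random.default_rng(0)

# Hypothetical smooth fields (linear in x, for a reproducible example):
#   M_ij(x) = A_ijk x_k   and   c_j(x) = B_jk x_k
A = rng.standard_normal((n, n, n))
B = rng.standard_normal((n, n))

def M(x):
    return np.einsum('ijk,k->ij', A, x)

def c(x):
    return np.einsum('jk,k->j', B, x)

def grad(f, x, h=1e-6):
    """Central-difference gradient: appends a trailing axis k meaning d/dx_k."""
    cols = []
    for k in range(len(x)):
        e = np.zeros_like(x)
        e[k] = h
        cols.append((f(x + e) - f(x - e)) / (2 * h))
    return np.stack(cols, axis=-1)

x = rng.standard_normal(n)
dM = grad(M, x)   # shape (i, j, k): M_{ij,k}
dc = grad(c, x)   # shape (j, k):    c_{j,k}

# Identity 1: (grad(M c))_{ik} = M_{ij,k} c_j + M_{ij} c_{j,k}
lhs1 = grad(lambda y: M(y) @ c(y), x)
rhs1 = np.einsum('ijk,j->ik', dM, c(x)) + np.einsum('ij,jk->ik', M(x), dc)
print(np.allclose(lhs1, rhs1, atol=1e-5))   # True

# Identity 2: (grad(c M))_{jk} = c_{i,k} M_{ij} + c_i M_{ij,k}
lhs2 = grad(lambda y: c(y) @ M(y), x)
rhs2 = np.einsum('ik,ij->jk', dc, M(x)) + np.einsum('i,ijk->jk', c(x), dM)
print(np.allclose(lhs2, rhs2, atol=1e-5))   # True
```

The `einsum` subscripts mirror the index expressions above, so each term in the code corresponds one-to-one with a term in the derivation.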