If $f(\mathbf{r})=\vert\mathbf{r}\vert^4$
How would you calculate $\operatorname{grad} f$ in index notation?
I get that $\vert\mathbf{r}\vert^2=x_ix_i$, but how do I represent $f$ in index notation?
Tags: index-notation, vector-analysis
I’m very surprised that this simple question has remained unanswered for nine months.
Also, I don’t see why you need to prove it using “index notation”, thereby limiting yourself to orthonormal (“Cartesian”) bases only or having to deal with differentiation of basis vectors. Really, you don’t need to expand the vectors at all; it’s enough to expand just the nabla, ${\boldsymbol{\nabla} = \boldsymbol{r}^i \partial_i}$.
So first, here is how I prove it, simply and clearly enough, I think:
$$ \boldsymbol{\nabla} \cdot \bigl( \boldsymbol{a} \boldsymbol{b} \bigr) = \boldsymbol{r}^i \partial_i \cdot \bigl( \boldsymbol{a} \boldsymbol{b} \bigr) = \boldsymbol{r}^i \cdot \partial_i \bigl( \boldsymbol{a} \boldsymbol{b} \bigr) = \boldsymbol{r}^i \cdot \bigl( \partial_i \boldsymbol{a} \bigr) \boldsymbol{b} + \boldsymbol{r}^i \cdot \boldsymbol{a} \bigl( \partial_i \boldsymbol{b} \bigr) = $$ $$ = \bigl( \boldsymbol{r}^i \cdot \partial_i \boldsymbol{a} \bigr) \boldsymbol{b} + \boldsymbol{a} \cdot \boldsymbol{r}^i \bigl( \partial_i \boldsymbol{b} \bigr) = \bigl( \boldsymbol{r}^i \partial_i \cdot \boldsymbol{a} \bigr) \boldsymbol{b} + \boldsymbol{a} \cdot \bigl( \boldsymbol{r}^i \partial_i \boldsymbol{b} \bigr) = \left( \boldsymbol{\nabla} \cdot \boldsymbol{a} \right) \boldsymbol{b} + \boldsymbol{a} \cdot \bigl( \boldsymbol{\nabla} \boldsymbol{b} \bigr) $$
All I use here, besides the expansion of nabla, is (i) the product rule, (ii) the commutativity of the dot product of any two vectors, and (iii) the fact that the dot product doesn’t affect scalars (and the coordinate derivative $\partial_i \equiv \frac{\partial}{\partial q^i}$ is a scalar operator); the dot product acts only on vectors and on tensors of higher rank.
But if you really want the full expansion, here it is. For an orthonormal (“Cartesian”) basis, the basis vectors $\boldsymbol{e}_i$ are mutually perpendicular and of unit length (in some chosen linear unit of measurement), that is, $\boldsymbol{e}_i \cdot \boldsymbol{e}_j = \delta_{ij}$. In addition, they are constant (they don’t vary from point to point). Hence the coordinate derivative $\partial_i \equiv \frac{\partial}{\partial x_i}$ of a vector $\boldsymbol{a} = a_i \boldsymbol{e}_i$ is
$$ \partial_i \boldsymbol{a} = \partial_i \bigl( a_j \boldsymbol{e}_j \bigr) = \bigl( \partial_i a_j \bigr) \boldsymbol{e}_j = \partial_i a_j \boldsymbol{e}_j $$
For non-orthonormal bases there are two complementary sets of basis vectors: $\boldsymbol{r}_i \equiv \partial_i \boldsymbol{r}$ (where $\boldsymbol{r}(q^i[, t])$ is the position vector of a point) and $\boldsymbol{r}^i$, for which ${\boldsymbol{r}^i \cdot \boldsymbol{r}_j = \delta^i_j}$ (${\boldsymbol{r}_i \cdot \boldsymbol{r}^j = \delta_i^j}$). They are not constant from point to point, and the coordinate derivative of a vector $\boldsymbol{a}$ decomposed in such a basis as ${\boldsymbol{a} = a_i \boldsymbol{r}^i}$ or as ${\boldsymbol{a} = a^i \boldsymbol{r}_i}$ is
$$ \partial_i \boldsymbol{a} = \partial_i \bigl( a_j \boldsymbol{r}^j \bigr) = \bigl( \partial_i a_j \bigr) \boldsymbol{r}^j + a_j \bigl( \partial_i \boldsymbol{r}^j \bigr) $$ or $$ \partial_i \boldsymbol{a} = \partial_i \bigl( a^j \boldsymbol{r}_j \bigr) = \bigl( \partial_i a^j \bigr) \boldsymbol{r}_j + a^j \bigl( \partial_i \boldsymbol{r}_j \bigr) $$
I’m not going to give the full component expansion for such bases here, and I hope you can work it out yourself (given enough time). Here is the expansion of the above proof when vectors are decomposed in an orthonormal basis:
$$ \boldsymbol{\nabla} \cdot \bigl( \boldsymbol{a} \boldsymbol{b} \bigr) = \boldsymbol{e}_k \partial_k \cdot \bigl( \boldsymbol{a} \boldsymbol{b} \bigr) = \boldsymbol{e}_k \cdot \partial_k \bigl( a_i \boldsymbol{e}_i \, b_j \boldsymbol{e}_j \bigr) = \boldsymbol{e}_k \cdot \bigl( \partial_k a_i \boldsymbol{e}_i \bigr) b_j \boldsymbol{e}_j + \boldsymbol{e}_k \cdot a_i \boldsymbol{e}_i \bigl( \partial_k b_j \boldsymbol{e}_j \bigr) = $$ $$ = \bigl( \boldsymbol{e}_k \cdot \partial_k a_i \boldsymbol{e}_i \bigr) b_j \boldsymbol{e}_j + \boldsymbol{e}_i \cdot \boldsymbol{e}_k a_i \bigl( \partial_k b_j \boldsymbol{e}_j \bigr) = \bigl( \delta_{ki} \partial_k a_i \bigr) \boldsymbol{b} + \delta_{ik} a_i \bigl( \partial_k \boldsymbol{b} \bigr) = \left( \boldsymbol{\nabla} \cdot \boldsymbol{a} \right) \boldsymbol{b} + \boldsymbol{a} \cdot \bigl( \boldsymbol{\nabla} \boldsymbol{b} \bigr) $$
As you can see, there isn’t much that’s new, just many more letters.
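If it helps, the identity can also be checked numerically. The sketch below (plain Python; the example fields $\boldsymbol{a}$, $\boldsymbol{b}$ and the sample point are arbitrary choices of mine, not from the post) compares the two sides of $\boldsymbol{\nabla} \cdot \bigl( \boldsymbol{a} \boldsymbol{b} \bigr) = \left( \boldsymbol{\nabla} \cdot \boldsymbol{a} \right) \boldsymbol{b} + \boldsymbol{a} \cdot \bigl( \boldsymbol{\nabla} \boldsymbol{b} \bigr)$ componentwise, using central finite differences in Cartesian coordinates.

```python
# Numerical check of  div(ab) = (div a) b + a . (grad b)  in Cartesian
# coordinates.  The fields a, b and the point p are arbitrary examples.

def a(p):
    x, y, z = p
    return [x * y, y * z, z * x]

def b(p):
    x, y, z = p
    return [x * x, y * y, x * z]

def dpart(f, p, k, h=1e-5):
    """Central-difference partial derivative of scalar f with respect to x_k."""
    pp, pm = list(p), list(p)
    pp[k] += h
    pm[k] -= h
    return (f(pp) - f(pm)) / (2 * h)

p = [0.3, -0.7, 1.1]

# Left side: the j-th component of div(ab) is sum_k d_k(a_k b_j)
lhs = [sum(dpart(lambda q: a(q)[k] * b(q)[j], p, k) for k in range(3))
       for j in range(3)]

# Right side: (div a) b_j + a_k d_k b_j
div_a = sum(dpart(lambda q: a(q)[k], p, k) for k in range(3))
rhs = [div_a * b(p)[j] + sum(a(p)[k] * dpart(lambda q: b(q)[j], p, k)
                             for k in range(3))
       for j in range(3)]

print(max(abs(l - r) for l, r in zip(lhs, rhs)) < 1e-6)  # → True
```

The two sides agree to within finite-difference truncation error, as the identity predicts.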
Post scriptum: you’re also welcome to take a look at another answer of mine, Gradient of cross product of two vectors (where first is constant).
Since $f=\tfrac{1}{2}\vert A\mathbf{x}-\mathbf{b}\vert^2=\frac{1}{2}(A_{ij}x_j-b_i)(A_{ik}x_k-b_i)$, we have $\partial_l f=A_{il}(A_{ik}x_k-b_i)$. Contracting out the repeated index $i$ by the definition of matrix multiplication requires writing $A_{il}$ as $A^T_{li}$, so $\partial_l f = A^T_{li}(Ax-b)_i=(A^TAx-A^T b)_l$.
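A quick finite-difference sanity check of this formula in plain Python; the particular $A$, $b$, and $x$ below are arbitrary values chosen for illustration.

```python
# Numerical check of  grad(1/2 |Ax - b|^2) = A^T A x - A^T b.
# A, b, x are small arbitrary examples (not from the original post).

A = [[2.0, 1.0], [0.5, 3.0], [1.0, -1.0]]   # a 3x2 matrix
b = [1.0, -2.0, 0.5]
x = [0.4, -0.9]

def f(v):
    """f(v) = 1/2 |Av - b|^2."""
    r = [sum(A[i][j] * v[j] for j in range(2)) - b[i] for i in range(3)]
    return 0.5 * sum(ri * ri for ri in r)

# Central-difference gradient of f at x
h = 1e-6
grad_fd = []
for l in range(2):
    xp, xm = list(x), list(x)
    xp[l] += h
    xm[l] -= h
    grad_fd.append((f(xp) - f(xm)) / (2 * h))

# Closed form: A^T (Ax - b)
r = [sum(A[i][j] * x[j] for j in range(2)) - b[i] for i in range(3)]
grad_cf = [sum(A[i][l] * r[i] for i in range(3)) for l in range(2)]

print(all(abs(g - c) < 1e-5 for g, c in zip(grad_fd, grad_cf)))  # → True
```

Since $f$ is quadratic, the central difference is essentially exact here, and both gradients coincide up to rounding.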
Best Answer
The gradient of a scalar field is a contravariant vector; its covariant counterpart $\nabla_n f$ is a one-form (the differential $df$). In index notation $$[\operatorname{grad}(f)]^r=\nabla^rf=g^{rn} \nabla_{n}f=g^{rn}\partial_{n}f \tag{1}$$ where $\nabla_n$ is the covariant derivative and $g^{ij}$ is the contravariant metric tensor, and we have used the fact that $\nabla_nf=\partial_n f$ for a scalar field.
Your function $f(x^j)=\vert\mathbf{r}\vert^4$ can be written as $$f=(g_{ij}x^{i}x^j)^2$$ Now, using $\partial_n x^i=\delta^i_n$, $$\partial_nf=\partial_n(g_{ij}x^ix^j)^2=2(g_{ij}x^ix^j)\bigl((\partial_ng_{ij})x^ix^j+g_{ij}(\delta^i_nx^j+x^i\delta^j_n)\bigr)$$ $$=2(g_{ij}x^ix^j)\bigl((\partial_ng_{ij})x^ix^j+2g_{nj}x^j\bigr)$$
$$g^{rn}\partial_nf=2(g_{ij}x^ix^j)(g^{rn}(\partial_ng_{ij})x^ix^j+2x^r )$$ So the answer is dependent on your metric.
Assuming we’re working with the Cartesian metric, $g_{ij}=\delta_{ij}=\text{constant}$, the $\partial_ng_{ij}$ term vanishes, and $$[\operatorname{grad}(f)]^r=4g_{ij}x^ix^jx^r=4x_jx^jx^r$$
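As a final sanity check of the Cartesian result $\operatorname{grad}\bigl(\vert\mathbf{r}\vert^4\bigr)=4\,\vert\mathbf{r}\vert^2\,\mathbf{r}$, here is a small finite-difference sketch in plain Python; the sample point is an arbitrary choice of mine.

```python
# Numerical check of  grad(|r|^4) = 4 |r|^2 r  in Cartesian coordinates,
# at an arbitrary sample point p.

p = [0.3, -1.2, 0.7]

def f(q):
    """f(q) = |r|^4 = (x_i x_i)^2."""
    return sum(c * c for c in q) ** 2

# Central-difference gradient of f at p
h = 1e-6
grad_fd = []
for k in range(3):
    qp, qm = list(p), list(p)
    qp[k] += h
    qm[k] -= h
    grad_fd.append((f(qp) - f(qm)) / (2 * h))

# Closed form: 4 |r|^2 x^r
r2 = sum(c * c for c in p)
grad_cf = [4 * r2 * c for c in p]

print(all(abs(g - c) < 1e-4 for g, c in zip(grad_fd, grad_cf)))  # → True
```

The finite-difference gradient matches $4\,\vert\mathbf{r}\vert^2\,\mathbf{r}$ componentwise to within truncation error.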