[Math] Finding the Gradient of a Vector Function by its Components

multivariable-calculus, real-analysis, tensors, vector-analysis, vectors

In multivariable calculus, we can easily find the gradient of a scalar function $f : \mathbb{R}^n \to \mathbb{R}$ (a scalar field), and the gradient produces a vector field.

$$\operatorname{grad}(f) = \vec{\nabla} f = \left< \frac{\partial f}{\partial x_1}, \frac{\partial f}{\partial x_2}, \dots, \frac{\partial f}{\partial x_n} \right> = \begin{bmatrix}
\frac{\partial f}{\partial x_{1}} \\
\frac{\partial f}{\partial x_{2}} \\
\vdots \\
\frac{\partial f}{\partial x_{n}}
\end{bmatrix}$$
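As a minimal numerical sketch of this definition, the gradient can be approximated componentwise with central finite differences (the helper `grad` and the example function here are illustrative assumptions, not from the post):

```python
def grad(f, x, h=1e-6):
    """Approximate the gradient of f: R^n -> R at the point x,
    one partial derivative per coordinate, via central differences."""
    g = []
    for i in range(len(x)):
        xp = list(x); xp[i] += h
        xm = list(x); xm[i] -= h
        g.append((f(xp) - f(xm)) / (2 * h))
    return g

# Example: f(x, y) = x^2 + 3y, so grad f = <2x, 3>
f = lambda v: v[0] ** 2 + 3 * v[1]
print(grad(f, [1.0, 2.0]))  # approximately [2.0, 3.0]
```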

Evaluating Vector Functions By Components

In multivariable calculus we learn that we can differentiate any vector function by taking the derivatives of its scalar component functions; likewise, we can integrate any vector function by integrating each of its scalar components.

e.g.

Given a function $\vec{g} : \mathbb{R} \to \mathbb{R}^m$, composed of scalar functions $f_{i} : \mathbb{R} \to \mathbb{R}$,

$$\vec{g}\,'(t) = \left< f_{1}'(t), f_{2}'(t), \dots, f_{m}'(t)\right> = \begin{bmatrix}
f_{1}'(t) \\
f_{2}'(t) \\
\vdots \\
f_{m}'(t)
\end{bmatrix}$$

$$\int\vec{g}(t)\,dt = \left< \int f_{1}(t)\,dt,\ \int f_{2}(t)\,dt,\ \dots,\ \int f_{m}(t)\,dt\right> = \begin{bmatrix}
\int f_{1}(t)\,dt \\
\int f_{2}(t)\,dt \\
\vdots \\
\int f_{m}(t)\,dt
\end{bmatrix}$$
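The componentwise differentiation rule above can be sketched numerically as well (the helper `vec_deriv` and the example $\vec{g}(t) = \left<\sin t, \cos t\right>$ are illustrative assumptions):

```python
import math

def vec_deriv(fs, t, h=1e-6):
    """Differentiate a vector function g(t) = <f_1(t), ..., f_m(t)>
    componentwise, using a central difference for each f_i."""
    return [(f(t + h) - f(t - h)) / (2 * h) for f in fs]

# g(t) = <sin t, cos t>, so g'(t) = <cos t, -sin t>
g = [math.sin, math.cos]
print(vec_deriv(g, 0.0))  # approximately [1.0, 0.0]
```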

Can we do the same for the Del Operator?

Since we can differentiate and integrate any vector function by taking the derivatives or integrals of its scalar components, can we evaluate the gradient of a vector function by applying the del operator to each of its scalar components, computing the gradient of each scalar field? I realize that this would produce a tensor field as a result.

Again, given a vector function $\vec{g} : \mathbb{R}^n \to \mathbb{R}^m$, composed of scalar functions $f_{i} : \mathbb{R}^n \to \mathbb{R}$, can we say the following:

$$T = \operatorname{grad}(\vec{g}) = \vec{\nabla}(\vec{g}) = \left< \vec{\nabla}(f_1), \vec{\nabla}(f_2), \dots, \vec{\nabla}(f_m) \right> = \begin{bmatrix}
\vec{\nabla}(f_1) \\
\vec{\nabla}(f_2) \\
\vdots \\
\vec{\nabla}(f_m)
\end{bmatrix}$$

with $T$ denoting the tensor field output by taking the gradient of the vector field produced by $\vec{g}$.
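The proposed construction, stacking the gradient of each scalar component, can be sketched numerically like this (the helper `jacobian` and the example $\vec{g}(x,y) = \left<xy,\ x+y\right>$ are hypothetical illustrations):

```python
def jacobian(fs, x, h=1e-6):
    """Build the m x n array T = grad(g) by computing the gradient of
    each scalar component f_i (central differences) and stacking them
    as rows."""
    J = []
    for f in fs:
        row = []
        for j in range(len(x)):
            xp = list(x); xp[j] += h
            xm = list(x); xm[j] -= h
            row.append((f(xp) - f(xm)) / (2 * h))
        J.append(row)
    return J

# g(x, y) = <x*y, x + y>, whose gradients are <y, x> and <1, 1>
fs = [lambda v: v[0] * v[1], lambda v: v[0] + v[1]]
print(jacobian(fs, [2.0, 3.0]))  # approximately [[3, 2], [1, 1]]
```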


Just to close off: I realize that a vector function can take either vectors or scalars as inputs, and my question here only covers the case of scalar inputs to a vector function. However, extending this to vector inputs would be fairly trivial, since we could break each vector input into its scalar components and work from there, which is covered within the scope of this question.

Best Answer

If $\vec{g}=\left[\begin{array}{c}f_1\\\vdots\\f_m\end{array}\right]$ then the derivative of $\vec{g}$ is the matrix
$$J\vec{g}=\left[\begin{array}{c}\nabla f_1\\\vdots\\\nabla f_m\end{array}\right],$$ which is an $m\times n$ rectangular array (each gradient $\nabla f_i$ written as a row vector).

In components, you would see it as $$J\vec{g}=\left[\dfrac{\partial f_i}{\partial x_j}\right],$$ where $i$ is for rows and $j$ is for columns, and where $x_1,...,x_n$ are the standard coordinate functions of $\Bbb R^n$.
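As a concrete instance of the component formula, take the (hypothetical) example $\vec{g}(x, y) = \left<x^2 y,\ \sin x\right>$, so $m = n = 2$:

$$J\vec{g}=\left[\dfrac{\partial f_i}{\partial x_j}\right]=\begin{bmatrix}2xy & x^{2}\\ \cos x & 0\end{bmatrix},$$

with row $i$ equal to $\nabla f_i$.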