The meaning of column-wise gradient and symmetric gradient

jacobian, matrices, matrix-calculus, symmetric-matrices, vectors

I was studying literature where, in one equation, the term $\nabla p_i$ appears, and it is stated that $\nabla$ is a column-wise gradient and that $p_i = p_i(x,t)$. What exactly is the meaning of "column-wise gradient"?

Further, the literature treats $p_i$ as a scalar quantity, and I found a definition in the matrix-calculus article on Wikipedia (scalar by vector), where the derivative of a scalar $p_i$ with respect to an independent vector is a row vector:
$$\left[\frac{\partial p_i}{\partial x} \ \ \ \frac{\partial p_i}{\partial t}\right]$$
Furthermore, since $\nabla p_i$ is in general the Jacobian of the scalar $p_i$, the definition of the Jacobian matrix and determinant gives
$$\nabla p_i = \frac{\partial p_i}{\partial x_j} \ \ \ \text{for} \ \ j = 1,2 \ \ \ \text{and} \ \ \ x_1 = x, \ \ x_2 = t$$
If $i = 1,2,\dots,n$, then stacking these row vectors over all components gives an $n \times 2$ matrix:
$$\nabla \mathbf p = \begin{bmatrix}
\frac{\partial p_1}{\partial x} & \frac{\partial p_1}{\partial t} \\
\frac{\partial p_2}{\partial x} & \frac{\partial p_2}{\partial t} \\
\vdots & \vdots \\
\frac{\partial p_n}{\partial x} & \frac{\partial p_n}{\partial t}
\end{bmatrix}$$
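
For a concrete illustration (my own example, not taken from the literature I am reading), with $n = 2$, $p_1 = xt$ and $p_2 = x^2 t$, the matrix above becomes
$$\nabla\mathbf p=\begin{bmatrix} t & x \\ 2xt & x^2 \end{bmatrix}.$$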

Can we call this a column-wise gradient? If not, then what is a column-wise gradient, and how can we express it in general form?

My next question is about the symmetric gradient. While googling, I found an expression for the symmetric gradient (in a post showing curl curl of…), where the symmetric gradient $\epsilon$ of a two-component vector $u(x,t) = (u_1(x,t),u_2(x,t))$ is given (in slightly modified form) as
$$\epsilon(u) = \begin{bmatrix}
\frac{\partial u_1}{\partial x} & \frac{1}{2}\left(\frac{\partial u_1}{\partial t} + \frac{\partial u_2}{\partial x}\right) \\
\frac{1}{2}\left(\frac{\partial u_1}{\partial t} + \frac{\partial u_2}{\partial x}\right) & \frac{\partial u_2}{\partial t} \\
\end{bmatrix}$$

But how can we write the symmetric gradient in a more general form, e.g. for $u(x,t) = (u_1(x,t),u_2(x,t),\dots,u_n(x,t))$ with $n > 2$?

Will the symmetric gradient always be a square matrix?

Best Answer

The gradient of a vector has two "degrees of freedom": one index iterates over the components of the vector, the other over the coordinates we differentiate by. There is no universal rule for which one labels the rows and which the columns (although the column-wise gradient is more popular and, arguably, more logical).
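
To make the two possible layouts explicit (a sketch in my own notation, with hypothetical labels $A$ and $B$, for $\mathbf p=(p_1,\dots,p_n)$ and coordinates $x_1,\dots,x_m$): one convention puts the components along the rows and the coordinates along the columns, and the other is its transpose,
$$A_{ij}=\frac{\partial p_i}{\partial x_j}\quad (n\times m)\qquad\text{or}\qquad B_{ij}=\frac{\partial p_j}{\partial x_i}\quad (m\times n),\qquad B=A^\intercal.$$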

Your formula for the column-wise gradient is correct.

If the dimensionality of the domain matches the dimensionality of the vector, the gradient is a square matrix and can be symmetrized, resulting in the symmetric gradient: $$ \frac{\pmb\nabla\mathbf{p}+(\pmb\nabla\mathbf{p})^\intercal}2,\qquad \text{or in index form}\qquad \frac{\nabla_jp_i+\nabla_ip_j}2=\nabla_{(j}p_{i)} $$
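
Written out in full (a sketch that just expands the index formula above, for $u=(u_1,\dots,u_n)$ and coordinates $x_1,\dots,x_n$), the symmetric gradient is the $n\times n$ symmetric matrix with entries
$$\epsilon_{ij}(u)=\frac12\left(\frac{\partial u_i}{\partial x_j}+\frac{\partial u_j}{\partial x_i}\right),\qquad i,j=1,\dots,n,$$
so it is square by construction. As a concrete check (my own example), for $u(x,t)=(xt,\;x^2)$ with $(x_1,x_2)=(x,t)$ this gives
$$\epsilon(u)=\begin{bmatrix} t & \tfrac12(x+2x) \\ \tfrac12(x+2x) & 0 \end{bmatrix}=\begin{bmatrix} t & \tfrac{3x}{2} \\ \tfrac{3x}{2} & 0 \end{bmatrix},$$
which matches the $2\times 2$ formula quoted in the question.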