Is the divergence-free property preserved here?

grad-curl-div, jacobian, multivariable-calculus, partial-differential-equations, real-analysis

Suppose I have a PDE

$$F(x)\cdot \nabla f(x) = 0 $$

where $F:\mathbb{R}^n\to \mathbb{R}^n$ is smooth ($F\in C^\infty$) and $f:\mathbb{R}^n\to \mathbb{R}$ is unknown, with appropriate boundary/initial conditions. Further suppose that

$$\nabla\cdot F \equiv 0$$

Consider a variable change

$$\tilde{f} := f\circ \eta^{-1}, \quad f = \tilde{f}\circ \eta $$
such that
$$|D\eta|(x) = 1, \ \forall x\in\mathbb{R}^n$$
and furthermore
$$(D\eta)^{-1} = D\eta^{-1} = (D\eta)^T$$
So $\eta$ is a smooth, invertible transformation whose Jacobian $D\eta$ is orthogonal with determinant equal to $1$. Then it follows that
$$F(x) \cdot (D\eta)^T (\nabla \tilde{f} \circ\eta) = 0$$
or equivalently
$$\bigg[[(D\eta)F]\circ \eta^{-1}(x)\bigg] \cdot \nabla \tilde{f} = 0$$
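As a concrete sanity check (my own illustration, not part of the question), the following sympy sketch takes $\eta$ to be a planar rotation, so $D\eta$ is orthogonal with determinant $1$, and $F=(-y,x)$, which is divergence-free, and verifies that the transformed field $[(D\eta)F]\circ\eta^{-1}$ is still divergence-free:

```python
import sympy as sp

x, y = sp.symbols('x y')
u, v = sp.symbols('u v')   # coordinates after the change of variables
th = sp.pi / 6             # example rotation angle (chosen for illustration)

# eta(x) = R x is smooth, invertible, with (D eta)^{-1} = (D eta)^T and det = 1
R = sp.Matrix([[sp.cos(th), -sp.sin(th)],
               [sp.sin(th),  sp.cos(th)]])

# A divergence-free field: F = (-y, x), so dF1/dx + dF2/dy = 0
F = sp.Matrix([-y, x])
assert sp.simplify(sp.diff(F[0], x) + sp.diff(F[1], y)) == 0

# Transformed field: F~ = (D eta) F evaluated at eta^{-1}(u, v)
Deta = R                              # D eta = R since eta is linear
xy_of_uv = R.T * sp.Matrix([u, v])    # eta^{-1} = R^T for a rotation
Ftilde = (Deta * F).subs({x: xy_of_uv[0], y: xy_of_uv[1]})

div_Ftilde = sp.simplify(sp.diff(Ftilde[0], u) + sp.diff(Ftilde[1], v))
print(div_Ftilde)  # 0 -> divergence-free property preserved in this example
```

Of course, one example proves nothing in general, which is exactly the question below.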
Question:

Is it necessarily true that the divergence free property
$$\nabla \cdot \bigg[[(D\eta)F]\circ \eta^{-1}\bigg] \equiv 0$$
is preserved? If not, what conditions can we impose on $\eta$ to make this hold? This post might help, but I am still struggling to prove it
Divergence of matrix-vector product

Best Answer

Let's use index notation, with summation over repeated indices:
$$ \mathbf{F}\cdot \nabla_x f = F_i(x)\frac{\partial f}{\partial x_i} = \left[F_i(x(\eta))\frac{\partial \eta_j}{\partial x_i}\right]\frac{\partial \tilde{f}}{\partial \eta_j} = 0. $$
The term in brackets I'll call $\tilde{\mathbf{F}}$, and we want to take the $\eta$-divergence of it:
\begin{multline}
\nabla_\eta\cdot \tilde{\mathbf{F}}=\frac{\partial}{\partial \eta_j}\left[F_i(x(\eta))\frac{\partial \eta_j}{\partial x_i}\right] = \frac{\partial}{\partial \eta_j}\left[F_i(x(\eta))\right]\frac{\partial \eta_j}{\partial x_i} + F_i(x(\eta))\frac{\partial}{\partial \eta_j}\left[\frac{\partial \eta_j}{\partial x_i}\right]\\
= \frac{\partial F_i}{\partial x_k}\frac{\partial x_k}{\partial \eta_j}\frac{\partial \eta_j}{\partial x_i} + F_i\frac{\partial x_k}{\partial \eta_j}\frac{\partial^2 \eta_j}{\partial x_k \partial x_i}\\
= \frac{\partial F_i}{\partial x_i} + F_i\,\frac{\partial x_k}{\partial \eta_j}\frac{\partial}{\partial x_i}\left[\frac{\partial \eta_j}{\partial x_k}\right] = \nabla_x \cdot \mathbf{F} + \left(\mathbf{J}^{-1}:\frac{\partial\mathbf{J}}{\partial x_i}\right) F_i,
\end{multline}
where the first term collapses by the chain rule, since $\dfrac{\partial x_k}{\partial \eta_j}\dfrac{\partial \eta_j}{\partial x_i}=\delta_{ki}$,

and we have used the fact that the Jacobian matrices $\mathbf{J} = [\nabla_x \eta]_{ij} = [(\partial \eta_i/\partial x_j)]$ and $\mathbf{J}^{-1} = [\nabla_\eta x]_{ij} = [(\partial x_i/\partial \eta_j)]$ are inverses. Next we use Jacobi's formula, $$ \partial \,\mathrm{det}[\mathbf{A}] = \mathrm{det}[\mathbf{A}]\left(\mathbf{A}^{-1}:\partial \mathbf{A}\right), $$ to simplify the last term, giving $$ \nabla_\eta\cdot \tilde{\mathbf{F}}= \nabla_x \cdot \mathbf{F} + \mathbf{F}\cdot\nabla \ln |J|, $$ where $J = \mathrm{det}[\mathbf{J}]$ is the Jacobian determinant. So any transform $\eta$ with a constant nonzero Jacobian determinant will preserve the divergence of $\mathbf{F}$. In particular, the orthogonal $\eta$ in the question has $J \equiv 1$, so $\nabla_\eta\cdot \tilde{\mathbf{F}} = \nabla_x\cdot\mathbf{F} \equiv 0$ and the divergence-free property is indeed preserved.
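The identity is easy to check symbolically. The sympy sketch below (my own check; the non-volume-preserving map $\eta(x,y)=(x,\,y e^{x})$ and the test field $F=(xy,\,x+y^2)$ are illustrative choices, not from the post) confirms $\nabla_\eta\cdot \tilde{\mathbf{F}} = \nabla_x \cdot \mathbf{F} + \mathbf{F}\cdot\nabla \ln |J|$:

```python
import sympy as sp

x, y, u, v = sp.symbols('x y u v')

# A change of variables with NON-constant Jacobian determinant:
# eta(x, y) = (x, y*exp(x)),  eta^{-1}(u, v) = (u, v*exp(-u))
eta = sp.Matrix([x, y*sp.exp(x)])
eta_inv = sp.Matrix([u, v*sp.exp(-u)])

J = eta.jacobian([x, y])       # D eta
detJ = sp.simplify(J.det())    # = exp(x), not constant

# An arbitrary smooth test field (not divergence-free)
F = sp.Matrix([x*y, x + y**2])

# Left-hand side: F~ = (D eta) F evaluated at eta^{-1}(u, v), then eta-divergence
Ftilde = (J * F).subs({x: eta_inv[0], y: eta_inv[1]})
lhs = sp.simplify(sp.diff(Ftilde[0], u) + sp.diff(Ftilde[1], v))

# Right-hand side: div F + F . grad ln|J|, written in x-coordinates, mapped to (u, v)
divF = sp.diff(F[0], x) + sp.diff(F[1], y)
grad_lnJ = sp.Matrix([sp.diff(sp.log(detJ), x), sp.diff(sp.log(detJ), y)])
rhs = (divF + (F.T * grad_lnJ)[0]).subs({x: eta_inv[0], y: eta_inv[1]})

print(sp.simplify(lhs - rhs))  # 0 -> identity verified for this example
```

Dropping the $y e^x$ map for a rotation (constant $J = 1$) makes the $\mathbf{F}\cdot\nabla \ln |J|$ term vanish, recovering the preservation claimed above.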
