When can a vector field be rescaled to have constant divergence?

differential-geometry, differential-topology, dynamical-systems, ordinary-differential-equations, vector-analysis

Suppose $X$ is a smooth vector field on $\mathbb{R}^n$ having divergence $$\nabla \cdot X < 0$$ everywhere. Does there always exist a positive smooth "rescaling" function $g:\Bbb R^n \to (0,\infty)$ such that $$\nabla\cdot (g X) \equiv -1?$$

(This question appears slightly related, but different.)
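
For what it's worth, by the product rule the condition can be rewritten as the linear first-order p.d.e. $\nabla g \cdot X + (\nabla \cdot X)\, g = -1$ for $g$. A quick symbolic check of this identity, with $n = 2$ chosen purely for illustration:

```python
import sympy as sp

x, y = sp.symbols('x y')
g = sp.Function('g')(x, y)
X1 = sp.Function('X1')(x, y)   # components of a generic vector field X
X2 = sp.Function('X2')(x, y)

div = lambda F1, F2: sp.diff(F1, x) + sp.diff(F2, y)

# Product rule: div(g X) = grad(g) . X + g div(X)
lhs = div(g * X1, g * X2)
rhs = sp.diff(g, x) * X1 + sp.diff(g, y) * X2 + g * div(X1, X2)
print(sp.simplify(lhs - rhs))   # expect 0
```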

Best Answer

No, this fails in all dimensions $n \geq 1$: Consider any negative, strictly decreasing ($C^1$) function $f$ of $x^1$ alone ($f(x^1) = -\exp x^1$ will do), and take $X := f \partial_{x^1}$; then $\nabla \cdot X = f' < 0$. The condition $\nabla \cdot (g X) = -1$ reads $\partial_{x^1}(g f) = -1$, and integrating in $x^1$ and rearranging gives $$g = -\frac{x^1 - j(x^2, \ldots, x^n)}{f}$$ for some function $j$. But then $g(j(x^2, \ldots, x^n), x^2, \ldots, x^n) = 0$, contradicting the positivity of $g$.
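
Here is a minimal SymPy sketch of this computation; the choices $n = 2$ and $f = -\exp x^1$ are just for illustration:

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2')
j = sp.Function('j')            # arbitrary function of the remaining coordinates
f = -sp.exp(x1)                 # negative and strictly decreasing in x1

# The rescaling forced by integrating d/dx1 (g f) = -1:
g = -(x1 - j(x2)) / f

# div(g X) reduces to d/dx1 (g f); this should print -1
print(sp.simplify(sp.diff(g * f, x1)))

# but g vanishes along the hypersurface x1 = j(x2), so it cannot be positive
print(sp.simplify(g.subs(x1, j(x2))))   # expect 0
```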

The above example also suggests that the condition on the global sign of $g$ is not a very natural one. But even if we drop this hypothesis, i.e., allow $g$ to take nonpositive values, the statement still seems to be false. (It is true, though, in the case $n = 1$; see item 1 below.) In fact, the example from your answer to the linked question seems to be a counterexample here, too: Take $n := 2$ and $$X := \sin x \,\partial_x - 2 y \,\partial_y .$$ Then, expanding the equation $$\nabla \cdot (g X) = -1$$ in terms of the standard frame gives the p.d.e. $$g_x \sin x + g \cos x - 2 y g_y - 2 g = - 1 .$$ Evaluating at $y = 0$ and denoting $h(x) := g(x, 0)$ then gives the o.d.e. $$h' \sin x + h \cos x - 2 h = -1 .$$ This o.d.e. is singular wherever $\sin x$ vanishes, and the singularities at odd integer multiples of $\pi$ are the ones that obstruct global solutions. The unique solution that is continuous at $x = \pi$ is $$h(x) = \frac{\sin(x) (x - \pi) + 2 (\cos x + 1)}{(\cos x + 1)^2} = \frac{1}{3} + \frac{1}{30} (x - \pi)^2 + O((x - \pi)^4)$$ (where we've implicitly removed the removable singularity at $x = \pi$), but this function blows up as $x$ approaches $-\pi$ or $3\pi$, so it does not extend continuously outside the interval $(-\pi, 3 \pi)$, and hence there is no function $g$ defined on all of $\Bbb R^2$ satisfying the condition.
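
The claims about $h$ can be checked symbolically; here is a minimal SymPy sketch (using nothing beyond the formulas already written down):

```python
import sympy as sp

x = sp.symbols('x')
h = (sp.sin(x) * (x - sp.pi) + 2 * (sp.cos(x) + 1)) / (sp.cos(x) + 1)**2

# h should satisfy h' sin x + h cos x - 2 h = -1, so this residual should be 0
residual = sp.diff(h, x) * sp.sin(x) + h * sp.cos(x) - 2 * h + 1
print(sp.simplify(residual))              # expect 0

# Taylor expansion at the removable singularity x = pi:
print(sp.series(h, x, sp.pi, 4))          # expect 1/3 + (x - pi)**2/30 + O((x - pi)**4)

# h blows up at the neighbouring odd multiples of pi, so it cannot extend
# continuously past x = -pi or x = 3*pi
print(sp.limit(h, x, -sp.pi, '+'))        # expect oo
print(sp.limit(h, x, 3 * sp.pi, '-'))     # expect oo
```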

On the other hand, certain adjustments of the statement are true:

  1. If $n = 1$, then $X = f(x) \partial_x$ and by hypothesis $\nabla \cdot X = f'(x) < 0$. The condition $$\nabla \cdot (g X) = -1$$ becomes $$(g f)' = -1,$$ and integrating gives $$g f = -x + C .$$ Since $f$ is strictly decreasing, it has at most one zero. If it has one, say at $x_0$, then setting $C = x_0$ we can take $g = -\frac{x - x_0}{f}$; since $f(x_0) = 0$ and $f' < 0$, the singularity of $g$ at $x_0$ is removable, and this $g$ is positive everywhere. If $f$ has no zero, we can take $g = -\frac{x - C}{f}$ for any $C$, but (consistent with the counterexample above) such a $g$ necessarily changes sign at $x = C$, so this only settles the version of the statement in which $g$ may take nonpositive values.

  2. There are always local solutions $g$ around any point $p$ at which $X$ does not vanish. More precisely, if $X_p \neq 0$ for some $p \in \Bbb R^n$, then we can choose local coordinates $(y^a)$ centered at $p$ in which $X = \partial_{y^1}$. The Euclidean volume form in these coordinates is $v \, dy^1 \wedge \cdots \wedge dy^n$ for some positive function $v$, and since in such coordinates $\nabla \cdot Y = \frac{1}{v} \sum_a \frac{\partial}{\partial y^a}(v Y^a)$, the condition $\nabla \cdot (g X) = -1$ becomes $$\frac{\partial}{\partial y^1}(g v) = - v,$$ which admits the solutions $$g = -\frac{V - j(y^2, \ldots, y^n)}{v} , \qquad V(y) := \int_0^{y^1} v(t, y^2, \ldots, y^n) \, dt ,$$ where $j$ is an arbitrary $C^1$ function (see the sketch below for a concrete example).
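
Finally, a small SymPy sanity check of both items; the choices $f = -\sinh x$ (item 1) and $X = e^{-x}\,\partial_x$ on $\Bbb R^2$ with flow-box coordinate $u = e^x$ (item 2) are purely illustrative:

```python
import sympy as sp

# Item 1 (n = 1): f = -sinh(x) is strictly decreasing with its zero at x0 = 0
x = sp.symbols('x')
f = -sp.sinh(x)
g1 = -x / f                              # the recipe with C = x0 = 0
print(sp.simplify(sp.diff(g1 * f, x)))   # expect -1, i.e. (g f)' = -1
print(sp.limit(g1, x, 0))                # expect 1: the singularity at x0 is removable

# Item 2 (local solutions), illustrated on R^2 with X = e^{-x} d/dx.
# The flow-box coordinate u = e^x straightens X to d/du; the Euclidean volume
# form dx ^ dy becomes (1/u) du ^ dy, so v = 1/u and V = int v du = log(u).
# The recipe g = -(V - j)/v then reads, back in the coordinates (x, y):
y = sp.symbols('y')
j = sp.Function('j')                     # arbitrary C^1 function of y
g2 = sp.exp(x) * (j(y) - x)

X1, X2 = sp.exp(-x), sp.Integer(0)       # components of X
div_gX = sp.diff(g2 * X1, x) + sp.diff(g2 * X2, y)
print(sp.simplify(div_gX))               # expect -1
```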