Why a vector field isn’t determined solely by its divergence

Tags: calculus, grad-curl-div, physics, vectors

I am an undergraduate physics student. I would prefer answers that share more information and clarification rather than try to be concise.

To my understanding, the Helmholtz theorem tells us that a vector function can be decomposed into the sum of the gradient of a scalar function and the curl of a vector function. Thus, if you know that the divergence of a vector function is non-zero and that the curl of the same function is zero, you can say that the function is wholly determined by the non-zero quantity. This occurs in electrostatics with the electric field. I understand this decomposition, but I would like to gain a better intuition as to why both quantities (divergence and curl) are needed to determine the vector field.
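Concretely, the decomposition I have in mind (as I understand it from Griffiths) is $$ \textbf{F} = \nabla\varphi + \nabla\times\textbf{A}, $$ so that $\nabla\cdot\textbf{F} = \nabla^{2}\varphi$ and $\nabla\times\textbf{F} = \nabla\times(\nabla\times\textbf{A})$, since the curl of a gradient and the divergence of a curl both vanish.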

Suppose we have a vector function $\textbf{D}$ with divergence given by $\nabla \cdot \textbf{D} = \rho_{f}$, and suppose its curl is non-zero: $\nabla \times \textbf{D} = \nabla \times \textbf{P} \neq \textbf{0}$. The symbols aren't important, but they represent the electric displacement ($\textbf{D}$), the polarization ($\textbf{P}$), and the free charge density ($\rho_{f}$) from Griffiths's Introduction to Electrodynamics (4th ed.).
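(For context on why this curl is generally non-zero: since $\textbf{D} = \epsilon_{0}\textbf{E} + \textbf{P}$ and $\nabla\times\textbf{E} = \textbf{0}$ in electrostatics, taking the curl of both sides gives $\nabla\times\textbf{D} = \nabla\times\textbf{P}$, and there is no reason for the curl of the polarization to vanish in general.)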

Is it accurate to draw parallels between the relationship of the derivative and integral of a scalar function and the divergence and anti-divergence (if such a thing exists) of a vector function?

For example, let $$y = 1$$ Then, $$\frac{dy}{dx} = 0$$ And, $$\int \frac{dy}{dx}\,dx = C$$ where $C$ is a constant.

Obviously, in the scalar case the antiderivative does not provide enough information for us to get back to the original function $y$. Is this what is happening in the vector-function case as well?

It seems I might be conflating boundary conditions (which, if provided in the scalar case, would give us a unique answer) with something else. Any help would be much appreciated!

Best Answer

$\def\rbf{\mathbf{R}}$For simplicity consider a smooth function $f:\rbf\to\rbf$. Suppose $f'(x)=g(x)$ for all $x$. Then for any constant $C$, one also has $(f+C)'=g$. So the derivative $f'$ does not determine the function $f$ completely. On the other hand, the derivative determines the function "partially". If one knows that $f'=g'$, then one must have $f=g+C$ by the mean value theorem. One can replace the domain $\rbf$ of $f$ with any open interval.
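Here is a quick symbolic check of that statement (a minimal sketch in SymPy; the particular $f$ is an arbitrary choice):

```python
import sympy as sp

x, C = sp.symbols('x C')

# An arbitrary smooth function and a shifted copy of it.
f = sp.sin(x) + x**2
f_shifted = f + C

# Both functions have exactly the same derivative.
assert sp.diff(f, x) == sp.diff(f_shifted, x)

# Integrating that derivative recovers f only up to an additive constant
# (SymPy drops the constant of integration).
recovered = sp.integrate(sp.diff(f, x), x)
print(sp.simplify(f - recovered))   # a constant (0 here, since the choices line up)
```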

["Divergence" is misread as "gradient" in this part.]
$\def\bg{\mathbf{g}}$Now let $f:\rbf^3\to\rbf$ be a scalar function and suppose $\nabla f(x)=\bg(x)$ for all $x$. Then $\nabla(f+C)=\bg$ for any constant $C$. So the gradient $\nabla f$ does not determine the function $f$ completely. On the other hand, if $\nabla p = \nabla q$ for two scalar functions $p$ and $q$, then similarly, $p=q+C$ for some constant $C$. The result holds if one replaces $\rbf^3$ with any nonempty connected open subset of $\rbf^3$.
In the scalar case, by the fundamental theorem of calculus, an antiderivative of the function $g$ is given by $$ f(x)=\int_0^x g(t)\;dt\;. $$ In the vector field case, an "anti-gradient" of the vector field $\bg$, assuming that $\operatorname{curl}(\bg)=0$ (so that the integral below is independent of the chosen path, since $\rbf^3$ is simply connected), can be found as the line integral $$ f(x,y,z)=\int_{(0,0,0)\rightsquigarrow(x,y,z)} \bg\cdot d\mathbf{r} $$ where $(0,0,0)\rightsquigarrow(x,y,z)$ denotes any path from $(0,0,0)$ to $(x,y,z)$.
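To make the line-integral construction concrete, here is a small symbolic check (a sketch in SymPy; the potential $x^2+yz$ and the straight path are arbitrary choices): starting from a curl-free field $\bg=\nabla f$, integrating along the straight segment from the origin to $(X,Y,Z)$ recovers $f$ up to the constant $f(0,0,0)$.

```python
import sympy as sp

x, y, z, t = sp.symbols('x y z t')    # integration variables
X, Y, Z = sp.symbols('X Y Z')         # the endpoint (X, Y, Z)

# Start from a known potential and form the curl-free field g = grad(f).
f = x**2 + y*z
g = [sp.diff(f, v) for v in (x, y, z)]          # (2x, z, y)

# Straight path from the origin to (X, Y, Z): r(t) = (tX, tY, tZ), t in [0, 1].
path = {x: t*X, y: t*Y, z: t*Z}
dr_dt = [X, Y, Z]

# Line integral of g . dr along the path gives the "anti-gradient".
integrand = sum(gi.subs(path) * di for gi, di in zip(g, dr_dt))
potential = sp.integrate(integrand, (t, 0, 1))

# 0: f is recovered (up to f(0,0,0), which is 0 for this choice of f).
print(sp.simplify(potential - f.subs({x: X, y: Y, z: Z})))
```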


[Added later.]

Let $\lambda:\rbf^3\to\rbf$ be a scalar field on $\rbf^3$ and $F:\rbf^3\to\rbf^3$ a vector field with $$ \nabla\cdot F=\lambda\;.\tag{1} $$ Then every vector field $F_1:=F+C$, where $C$ is a constant vector, also satisfies (1). So the divergence alone does not determine the vector field. This answers the question in your title.
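A quick symbolic illustration of this (a sketch in SymPy; the field $F$ is an arbitrary choice): adding any constant vector to $F$ leaves the divergence unchanged.

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
c1, c2, c3 = sp.symbols('c1 c2 c3')   # components of an arbitrary constant vector

def div(F):
    """Divergence of a vector field F = (F1, F2, F3)."""
    return sp.diff(F[0], x) + sp.diff(F[1], y) + sp.diff(F[2], z)

# An arbitrary vector field and a shifted copy F + C.
F = [x*y, sp.sin(z), x + z**2]
F_shifted = [F[0] + c1, F[1] + c2, F[2] + c3]

# Same divergence, so the divergence alone cannot single out F.
print(sp.simplify(div(F) - div(F_shifted)))   # 0
```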

A theorem says that a vector field can be constructed with both a specified divergence and a specified curl: $$ \nabla\cdot F=\lambda,\quad \nabla\times F=q\tag{2} $$ where $q$ is a given divergence-free vector field. But since adding a constant vector to $F$ changes neither side of (2), $F$ is still not determined uniquely even by its divergence and curl together. One needs an additional decay property of $F$ at infinity for uniqueness; that property can be thought of as a "boundary condition" for the system (2).
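For completeness, when $\lambda$ and $q$ decay fast enough at infinity, the usual construction (this is essentially the Helmholtz theorem) is $$ F=-\nabla\Phi+\nabla\times A,\qquad \Phi(\mathbf{r})=\frac{1}{4\pi}\int\frac{\lambda(\mathbf{r}')}{|\mathbf{r}-\mathbf{r}'|}\,dV',\qquad A(\mathbf{r})=\frac{1}{4\pi}\int\frac{q(\mathbf{r}')}{|\mathbf{r}-\mathbf{r}'|}\,dV'. $$ With the decay condition this $F$ is the unique solution of (2); without it, adding a constant vector (or, more generally, the gradient of a harmonic function) changes neither the divergence nor the curl.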