[Math] Every divergence-free vector field generated from skew-symmetric matrix

Tags: real-analysis, vector-fields

Let $[a_{i,j}(x_1,\ldots,x_n)]$ be a skew-symmetric $n\times n$ matrix of functions $a_{i,j}\in C^\infty(\mathbb{R}^n)$. The vector field $$v=\sum_{i,j}\left(\dfrac{\partial}{\partial x_i}a_{i,j}\right)\dfrac{\partial}{\partial x_j}$$ is divergence-free.
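The forward direction can be sanity-checked symbolically: the divergence $\sum_{i,j}\partial_j\partial_i a_{ij}$ vanishes because the mixed partials cancel in skew pairs. A minimal sympy sketch for $n=3$, with arbitrary (hypothetical) choices of the entries:

```python
# Sanity check of the forward direction: if a is skew-symmetric and
# v_j = sum_i d a_{ij} / d x_i, then div v = 0 (mixed partials cancel in pairs).
# The entries a12, a13, a23 are arbitrary smooth functions chosen for illustration.
import sympy as sp

x1, x2, x3 = sp.symbols('x1 x2 x3')
xs = (x1, x2, x3)

a12 = sp.sin(x1*x2) + x3**2
a13 = x1*sp.exp(x2)
a23 = x1*x2*x3
a = sp.Matrix([[0,    a12,  a13],
               [-a12, 0,    a23],
               [-a13, -a23, 0]])

# v_j = sum_i d a_{ij} / d x_i
v = [sum(sp.diff(a[i, j], xs[i]) for i in range(3)) for j in range(3)]
div_v = sp.simplify(sum(sp.diff(v[j], xs[j]) for j in range(3)))
print(div_v)  # 0
```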

Prove by induction that for every $n\geq 2$, every $C^\infty$ divergence-free vector field on $\mathbb{R}^n$ is of this form.

Consider $n=2$. Suppose the vector field is $f_1(x_1,x_2)\dfrac{\partial}{\partial x_1}+f_2(x_1,x_2)\dfrac{\partial}{\partial x_2}$. Since the vector field is divergence-free, we have $\dfrac{\partial f_1}{\partial x_1}+\dfrac{\partial f_2}{\partial x_2}=0$. Hence the $1$-form $f_2\,dx_1-f_1\,dx_2$ is closed, so by the Poincaré lemma there exists a function $g(x_1,x_2)$ whose $x_1$-derivative equals $f_2$ and whose $x_2$-derivative equals $-f_1$. Setting $a_{12}=g=-a_{21}$ (all other entries zero) gives the desired form.
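The two-dimensional construction can be checked symbolically; the field $(f_1,f_2)$ below is a hypothetical divergence-free example, and $g$ is built by integrating $f_2$ in $x_1$ and fixing the $x_2$-dependence along the line $x_1=0$:

```python
# Symbolic check of the n = 2 step: (f1, f2) is a hypothetical divergence-free
# field; g is a potential with dg/dx1 = f2 and dg/dx2 = -f1.
import sympy as sp

x1, x2, xi = sp.symbols('x1 x2 xi')
f1 = x2 + sp.cos(x1)*sp.sin(x2)
f2 = x1 - sp.sin(x1)*sp.cos(x2)
assert sp.simplify(sp.diff(f1, x1) + sp.diff(f2, x2)) == 0  # divergence-free

# g(x1, x2) = int_0^{x1} f2(xi, x2) dxi - int_0^{x2} f1(0, xi) dxi
g = (sp.integrate(f2.subs(x1, xi), (xi, 0, x1))
     - sp.integrate(f1.subs({x1: 0, x2: xi}), (xi, 0, x2)))

assert sp.simplify(sp.diff(g, x1) - f2) == 0  # dg/dx1 =  f2
assert sp.simplify(sp.diff(g, x2) + f1) == 0  # dg/dx2 = -f1
```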

But what about $n>2$? To use induction, I have to relate a divergence-free vector field on $\mathbb{R}^n$ to a divergence-free vector field on $\mathbb{R}^{n-1}$. Perhaps the following result will help:

Let $v$ be a vector field on $\mathbb{R}^n$. Show that $v$ can be written as a sum $v=f_1\dfrac{\partial}{\partial x_1}+w$ where $w$ is a divergence-free vector field.
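One way to obtain such a decomposition (a sketch, not the only choice): take $f_1(x)=\int_0^{x_1}(\operatorname{div} v)(\xi,x_2,\dots,x_n)\,d\xi$, so that $\partial_1 f_1=\operatorname{div} v$ and $w = v - f_1\,\partial/\partial x_1$ is divergence-free. Checking this with sympy on a hypothetical field in $\mathbb{R}^3$:

```python
# Sketch of the decomposition v = f1 d/dx1 + w with div w = 0:
# f1 = int_0^{x1} (div v)(xi, x2, x3) dxi gives d f1/dx1 = div v,
# so w = v - f1 e1 is divergence-free. The field v is a hypothetical example.
import sympy as sp

x1, x2, x3, xi = sp.symbols('x1 x2 x3 xi')
xs = (x1, x2, x3)
v = [x1*x2, sp.cos(x1) + x2*x3, x3**2]

div_v = sum(sp.diff(v[j], xs[j]) for j in range(3))
f1 = sp.integrate(div_v.subs(x1, xi), (xi, 0, x1))

w = [v[0] - f1, v[1], v[2]]
div_w = sp.simplify(sum(sp.diff(w[j], xs[j]) for j in range(3)))
print(div_w)  # 0
```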

Best Answer

Induction Hypothesis over $k$: Given a smooth vector field $v:\mathbb R^n \to \mathbb R^n$ such that $\operatorname{div}_{k} v := \sum_{i=1}^k \frac{\partial v_i}{\partial x_i} = 0$, there exists a smooth skew-symmetric matrix $a:\mathbb R^n \to \mathbb R^{n\times n}$ such that $v_j = \sum_{i=1}^k \frac\partial{\partial x_i} a_{ij}$ for $1 \le j \le k$.

The base case is $k=2$. (The statement fails for $k=1$: skew-symmetry forces $\sum_{i=1}^1 \frac\partial{\partial x_i} a_{i1} = \frac\partial{\partial x_1} a_{11} = 0$, while $\operatorname{div}_1 v = 0$ does not force $v_1 = 0$.) For $k=2$ the two-dimensional argument from the question applies verbatim, with $x_3,\dots,x_n$ carried along as parameters: the function $$ g = \int_0^{x_1} v_2(\xi,x_2,\dots,x_n)\,d\xi - \int_0^{x_2} v_1(0,\xi,x_3,\dots,x_n)\,d\xi $$ satisfies $\frac\partial{\partial x_1} g = v_2$ and $\frac\partial{\partial x_2} g = -v_1$, so $a_{12} = g = -a_{21}$ (all other entries zero) works.

Suppose the hypothesis is true for $k-1 \ge 2$. We prove it for $k$.

Let $$ f_1(x_1,\dots,x_n) = \int_0^{x_1} \frac{\partial}{\partial x_k} v_k(\xi,x_2,\dots,x_n) \, d\xi .$$ Since $\frac\partial{\partial x_1} f_1 = \frac\partial{\partial x_k} v_k$ and $\operatorname{div}_k v = 0$, we get $$ \frac{\partial}{\partial x_1}(v_1+f_1) + \frac{\partial}{\partial x_2}v_2 + \dots + \frac{\partial}{\partial x_{k-1}}v_{k-1} = 0 .$$

By the inductive hypothesis applied to the field $(v_1+f_1, v_2, \dots, v_{k-1}, v_k, \dots, v_n)$, there is a skew-symmetric matrix $a_{ij}$ such that $$ v_1 + f_1 = \sum_{i=1}^{k-1} \frac{\partial}{\partial x_i} a_{i1} $$ $$ v_j = \sum_{i=1}^{k-1} \frac{\partial}{\partial x_i} a_{ij} \quad \text{ for $2 \le j \le k-1$}$$

We define $$ f_2(x_1,\dots,x_n) = -\int_0^{x_1} v_k(\xi,x_2,\dots,x_{k-1},0,x_{k+1},\dots,x_n) \, d\xi - \int_0^{x_k} f_1(x_1,\dots,x_{k-1},\xi,x_{k+1},\dots,x_n) \, d\xi .$$ Then, using $\frac\partial{\partial x_1} f_1 = \frac\partial{\partial x_k} v_k$ and the fundamental theorem of calculus, $$ \frac\partial{\partial x_1} f_2 = -v_k(x_1,\dots,x_{k-1},0,x_{k+1},\dots,x_n) - \int_0^{x_k} \frac\partial{\partial x_k} v_k(x_1,\dots,x_{k-1},\xi,x_{k+1},\dots,x_n) \, d\xi = -v_k $$ and, since the first integral does not involve $x_k$, $$ \frac\partial{\partial x_k} f_2 = - f_1 .$$ Now extend the matrix $a$ by setting $a_{k1} = -a_{1k} = f_2$, and $a_{kj} = -a_{jk} = 0$ for $2 \le j \le k$. Then $$ \sum_{i=1}^{k} \frac{\partial}{\partial x_i} a_{i1} = v_1 + f_1 + \frac\partial{\partial x_k} f_2 = v_1, $$ $$ \sum_{i=1}^{k} \frac{\partial}{\partial x_i} a_{ij} = v_j \quad \text{ for $2 \le j \le k-1$}, $$ and $$ \sum_{i=1}^{k} \frac{\partial}{\partial x_i} a_{ik} = - \frac\partial{\partial x_1} f_2 = v_k. $$ Taking $k=n$ completes the proof.
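The auxiliary functions in the inductive step depend only on $v_k$, and the two identities $f_2$ must satisfy, $\frac\partial{\partial x_1} f_2 = -v_k$ and $\frac\partial{\partial x_k} f_2 = -f_1$, hold for any smooth $v_k$ without using the divergence condition. A sympy sketch with a hypothetical $v_k$, taking $k = n = 3$:

```python
# Verify d f2/dx1 = -v_k and d f2/dx_k = -f1 for an arbitrary smooth v_k
# (hypothetical example, k = n = 3); f2 is built so both identities hold.
import sympy as sp

x1, x2, x3, xi = sp.symbols('x1 x2 x3 xi')
vk = x1*x3 + sp.sin(x2)*x3**2  # arbitrary smooth choice

# f1 = int_0^{x1} (d v_k / d x_3)(xi, x2, x3) dxi
f1 = sp.integrate(sp.diff(vk, x3).subs(x1, xi), (xi, 0, x1))

# f2 = -int_0^{x1} v_k(xi, x2, 0) dxi - int_0^{x3} f1(x1, x2, xi) dxi
f2 = (-sp.integrate(vk.subs({x1: xi, x3: 0}), (xi, 0, x1))
      - sp.integrate(f1.subs(x3, xi), (xi, 0, x3)))

assert sp.simplify(sp.diff(f2, x1) + vk) == 0  # d f2/dx1 = -v_k
assert sp.simplify(sp.diff(f2, x3) + f1) == 0  # d f2/dx3 = -f1
```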
