[Math] Divergence theorem for tensor fields on manifolds

differential-geometry, riemannian-geometry, tensors

Let $M$ be a Riemannian manifold and let $T$ be a $k$-tensor field on $M$. I am looking for an analogue of the divergence theorem on manifolds.

I would expect something of the form

$$\int_U \operatorname{Div} T = \int_{\partial U} \vec n \lrcorner T$$

to be valid for some operator $\operatorname{Div}$ yet to be determined, where $U \subseteq M$ is a smoothly bounded subset of the manifold $M$, and where $\vec n \lrcorner T$ denotes the contraction of $T$ with the outward unit normal along the boundary of $U$.

I am wondering in particular about the nature of the mysterious operator $\operatorname{Div}$. A reference to a proof in the literature would suffice.

Best Answer

In order for your equation to make sense, $T$ has to be a tensor field whose divergence is a differential form (alternating covariant tensor) of degree equal to the dimension of the manifold, and the manifold has to be oriented. Moreover, to define the divergence and unit normal, you're going to need a Riemannian metric. So let's assume $(M,g)$ is an oriented Riemannian $n$-manifold, and $T$ is a smooth section of either $\Lambda^n M \otimes TM$ or $\Lambda^n M\otimes T^*M$. Since we have a Riemannian metric, these two bundles are naturally isomorphic via the musical isomorphisms, so let's choose $\Lambda^n M \otimes TM$.
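Concretely, this identification just lowers (or raises) the vector index with the metric; on decomposable elements it reads
$$ \omega \otimes X \;\longleftrightarrow\; \omega \otimes X^\flat, \qquad X^\flat := g(X, \cdot), $$
for $\omega$ an $n$-form and $X$ a vector field.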

The divergence of such a tensor is defined as $\operatorname{div} T = \operatorname{tr}(\nabla T)$, where the trace is on the last two indices -- the contravariant $TM$ index, and the covariant $T^*M$ index introduced by $\nabla$.
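In index notation, if $T$ has components $T_{i_1 \cdots i_n}{}^{j}$, this definition reads
$$ (\operatorname{div} T)_{i_1 \cdots i_n} = \nabla_j\, T_{i_1 \cdots i_n}{}^{j}, $$
with the covariant index introduced by $\nabla$ contracted against the contravariant $TM$ index.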

Any section of $\Lambda^n M\otimes TM$ can be written globally as $T = dV_g \otimes X$ for some vector field $X$, and then the fact that $dV_g$ is parallel implies $$ \operatorname{div}T = \operatorname{tr} (dV_g \otimes \nabla X) = (\operatorname{tr} \nabla X)\, dV_g = (\operatorname{div} X)\, dV_g. $$ Then the divergence theorem for vector fields on oriented Riemannian manifolds with boundary (see Theorem 16.32 in my *Introduction to Smooth Manifolds*) implies $$ \int_U \operatorname{div}T = \int_U (\operatorname{div} X)\, dV_g = \int_{\partial U} \langle X, \vec n \rangle\,dV_{\tilde g} = \int_{\partial U} \vec n \lrcorner T, $$ where $\tilde g$ is the metric induced on $\partial U$. But this is really nothing but the ordinary divergence theorem dressed up in different notation.
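As a quick sanity check, take $U \subseteq \mathbb{R}^n$ with the Euclidean metric: then $dV_g = dx^1 \wedge \cdots \wedge dx^n$, $\operatorname{div} X = \partial_i X^i$, and the identity above is literally the classical divergence theorem
$$ \int_U \partial_i X^i \, dx^1 \cdots dx^n = \int_{\partial U} \langle X, \vec n \rangle \, dA. $$
In general local coordinates, $\operatorname{div} X = \frac{1}{\sqrt{\det g}}\, \partial_i \big( \sqrt{\det g}\, X^i \big)$, which is the formula one would use to compute either side in practice.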
