$\sum_{j=1}^n \frac{\partial}{\partial x_j} (\text{cof}(Df))_{ij}=0$ for a $C^\infty$ function $f:\Bbb R^n\to \Bbb R^n$

analysis, integration, linear-algebra, real-analysis, smooth-functions

Let $f,g:\Bbb R^n \to \Bbb R^n$ be two $C^\infty$ functions. I am trying to prove the following statements:

(1) $\displaystyle\sum_{j=1}^n \frac{\partial}{\partial x_j} (\text{cof}(Df))_{ij}=0$ $(1\leq i\leq n)$, where $Df$ is the derivative of $f$ (with $ij$-entry given by $\frac{\partial f_i}{\partial x_j}$), and $\text{cof}(A)$ is the cofactor matrix of $A$.

(2) If $U$ is a bounded open connected subset of $\Bbb R^n$ having smooth boundary, and if $f=g$ on $\partial U$, then $\int_U \det(Df)dx=\int_U \det (Dg)dx$.

For (1), by definition of the cofactor matrix we have $(\text{cof}(Df))_{ij}= (-1)^{i+j} \det(M_{ij})$, where $M_{ij}$ is the matrix obtained from $Df$ by deleting the $i$th row and $j$th column. But I can't see how to proceed.
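As a sanity check (not a proof), identity (1) can be verified symbolically for a concrete map; here is a minimal SymPy sketch for $n = 2$, with an arbitrarily chosen smooth $f$:

```python
import sympy as sp

x1, x2 = sp.symbols("x1 x2")
X = [x1, x2]
# an arbitrary smooth map f : R^2 -> R^2, chosen just for the check
f = sp.Matrix([sp.sin(x1) * x2, sp.exp(x1) + x2**3])

Df = f.jacobian(X)
# cofactor matrix: cof(A)_{ij} = (-1)^{i+j} * (i,j)-minor of A
cof = sp.Matrix(2, 2, lambda i, j: (-1) ** (i + j) * Df.minor(i, j))

# identity (1): each row of cof(Df) is divergence-free
for i in range(2):
    row_div = sum(sp.diff(cof[i, j], X[j]) for j in range(2))
    assert sp.simplify(row_div) == 0
print("identity (1) holds for this f")
```

For $n = 2$ the identity reduces to the equality of mixed partials, which is why SymPy confirms it immediately.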

For (2), I think I should use some version of Stokes' theorem, but I have no idea how.

Any hints for these? Thanks in advance.

Best Answer

We'll show $(2)$ using the Gauß divergence theorem. Our goal is to represent $\det (Df)$ and $\det(Dg)$ as the divergence of a vector field. In the following I'll use $f_{x_i} := \frac{\partial f}{\partial x_i}$.

Notice that $$\beta := \det: \underbrace{\mathbb{R^n} \times \dots \times \mathbb{R}^n}_{\text{n times}} \to \mathbb{R}, (a_1, \dots, a_n) \mapsto \det(a_1, \dots, a_n) $$ is a multilinear function, therefore $$D\beta(a_1,\dots, a_n)(y_1, \dots, y_n) = \sum\limits_{j=1}^n \beta(a_1,\dots, a_{j-1}, y_j, a_{j+1}, \dots, a_n)$$
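This product rule for the multilinear determinant can be checked symbolically; the following SymPy sketch differentiates $\det$ along an arbitrarily chosen smooth matrix curve $A(t)$:

```python
import sympy as sp

t = sp.symbols("t")
n = 3
# an arbitrary smooth curve of 3x3 matrices (Vandermonde-like, nondegenerate)
A = sp.Matrix(n, n, lambda i, j: (t + i + 1) ** (j + 2))

# right-hand side: sum over columns, differentiating one column at a time
rhs = sp.S.Zero
for j in range(n):
    B = A.copy()
    B[:, j] = sp.diff(A[:, j], t)
    rhs += B.det()

# chain rule through the multilinear map beta = det
assert sp.expand(sp.diff(A.det(), t) - rhs) == 0
print("derivative of det matches the column-by-column sum")
```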

Let $\alpha_i : \mathbb{R}^n \to \underbrace{\mathbb{R^n} \times \dots \times \mathbb{R}^n}_{\text{n times}}, x \mapsto \big( f_{x_1}(x), \dots, f_{x_{i-1}}(x), f(x), f_{x_{i+1}}(x), \dots, f_{x_n}(x) \big)$.

Let $F: \mathbb{R}^n \to \mathbb{R}^n, x \mapsto (F_1(x), \dots, F_n(x))$ with $F_i := \beta \circ \alpha_i = \det\big(f_{x_1}, \dots, f_{x_{i-1}}, f, f_{x_{i+1}}, \dots, f_{x_n}\big)$

Now $$\frac{\partial F_i}{\partial x_i}(x) = D\beta\Big(\alpha_i(x)\Big)\Big(\frac{\partial \alpha_i}{\partial x_i} (x)\Big)$$

Therefore

$$\frac{\partial F_i}{\partial x_i}= \det\Big(f_{x_1x_i}, f_{x_2}, \dots, f_{x_{i-1}}, f, f_{x_{i+1}}, \dots, f_{x_n} \Big) + \det\Big(f_{x_1}, f_{x_2 x_i}, \dots, f_{x_{i-1}}, f, f_{x_{i+1}}, \dots, f_{x_n} \Big) + \dots + \det\Big(f_{x_1}, \dots, f_{x_{i-1}}, f_{x_i}, f_{x_{i+1}}, \dots, f_{x_n} \Big) + \dots + \det\Big(f_{x_1}, \dots, f_{x_{i-1}}, f, f_{x_{i+1}}, \dots, f_{x_nx_i} \Big)$$

Notice that exactly one term in this sum is $$\det\Big(f_{x_1}, \dots, f_{x_{i-1}}, f_{x_i}, f_{x_{i+1}}, \dots, f_{x_n}\Big) = \det(Df)$$

The other terms are the determinants ($i \neq j$)

$$d_{i,j} := \det\Big( f_{x_1}, \dots, f_{x_{j-1}}, f_{x_j x_i}, f_{x_{j+1}}, \dots, f_{x_{i-1}}, f, f_{x_{i+1}}, \dots, f_{x_n} \Big)$$

Since the determinant function is alternating we have $d_{i,j} = -d_{j,i}$ and this leads us (due to cancellation) to

$$\text{div}\ F = \sum\limits_{i=1}^n \frac{\partial F_i}{\partial x_i} = n \cdot \det(Df) + \sum\limits_{i=1}^n \sum\limits_{j \neq i} d_{i,j} = n \cdot \det(Df)$$
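The identity $\text{div}\ F = n \cdot \det(Df)$ can also be verified symbolically for a concrete map; a SymPy sketch for $n = 2$ (the map $f$ is an arbitrary choice):

```python
import sympy as sp

x1, x2 = sp.symbols("x1 x2")
X = [x1, x2]
f = sp.Matrix([x1**2 * x2, sp.cos(x1) + x2**2])  # arbitrary smooth f
Df = f.jacobian(X)

# F_i = det(Df with column i replaced by f itself)
F = []
for i in range(2):
    M = Df.copy()
    M[:, i] = f
    F.append(M.det())

div_F = sum(sp.diff(F[i], X[i]) for i in range(2))
# div F = n * det(Df) with n = 2
assert sp.simplify(div_F - 2 * Df.det()) == 0
print("div F = n * det(Df) verified for n = 2")
```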

Now we define $G: \mathbb{R}^n \to \mathbb{R}^n$ just like $F$ but with $g$ in place of $f$; the same computation gives $\text{div}\ G = n \cdot \det(Dg)$. Expanding $F_i$ along its $i$th column shows $F = \text{cof}(Df)^T f$, so $\langle F, \nu \rangle = \langle f, \text{cof}(Df)\,\nu \rangle$ on $\partial U$; one checks that $\text{cof}(Df)\,\nu$ depends only on the restriction $f|_{\partial U}$ (only tangential derivatives of $f$ enter), so $f = g$ on $\partial U$ implies $\langle F, \nu \rangle = \langle G, \nu \rangle$ on $\partial U$. Now we apply the Gauß divergence theorem:

$$\int_U \det(Df)\ d\lambda_n = \frac{1}{n} \int_U \text{div}\ F\ d\lambda_n = \frac{1}{n} \int_{\partial U} \langle F, \nu \rangle\ dS_{\partial U}$$ $$ = \frac{1}{n} \int_{\partial U} \langle G, \nu\rangle\ dS_{\partial U} = \frac{1}{n} \int_U \text{div}\ G d\lambda_n = \int_U \det(Dg)\ d\lambda_n$$
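As a concrete instance of $(2)$: take $U$ the open unit disk in $\mathbb{R}^2$, $f = \mathrm{id}$ (so $\int_U \det(Df)\,dx = \pi$), and $g = f + \varphi \cdot (1,1)$ with $\varphi(x,y) = 1 - x^2 - y^2$ vanishing on $\partial U$. The following SymPy sketch confirms $\int_U \det(Dg)\,dx = \pi$ as well:

```python
import sympy as sp

x, y, r, th = sp.symbols("x y r theta", real=True)
phi = 1 - x**2 - y**2              # vanishes on the unit circle
g = sp.Matrix([x + phi, y + phi])  # equals f = (x, y) on the boundary

det_Dg = g.jacobian([x, y]).det()

# integrate det(Dg) over the unit disk in polar coordinates
integrand = det_Dg.subs({x: r * sp.cos(th), y: r * sp.sin(th)}) * r
I = sp.integrate(sp.integrate(integrand, (r, 0, 1)), (th, 0, 2 * sp.pi))

# f = id has det(Df) = 1, so its integral over the disk is pi
assert sp.simplify(I - sp.pi) == 0
print("both integrals equal pi")
```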

I haven't figured out $(1)$ yet, but possibly it will again help to differentiate the determinant as a multilinear function. Sorry this is not a complete answer, but it was definitely too long for a comment.


Edit: Solution for $(1)$

Instead of deleting the $i$th row and $j$th column, taking the determinant, and multiplying by $(-1)^{i+j}$, you can equivalently take the determinant of the matrix obtained by replacing every entry of the $i$th row and the $j$th column with $0$, except the $(i,j)$ entry, which you replace by $1$. (If my explanation is unclear, take a look at the German Wikipedia article, which has a picture of this.) I'll write $\partial_i := \frac{\partial}{\partial x_i}$ from now on.
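A quick symbolic check that this zero-out construction really reproduces the cofactors (the $3 \times 3$ matrix is an arbitrary example):

```python
import sympy as sp

A = sp.Matrix([[2, 7, 1], [3, 5, 8], [4, 9, 6]])  # arbitrary 3x3 example
n = A.rows

for i in range(n):
    for j in range(n):
        # zero out row i and column j, then put a 1 at position (i, j)
        B = A.copy()
        B[i, :] = sp.zeros(1, n)
        B[:, j] = sp.zeros(n, 1)
        B[i, j] = 1
        # expanding det(B) along row i picks out exactly the (i, j) cofactor
        assert B.det() == A.cofactor(i, j)
print("zero-out construction matches the cofactor")
```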

Let $$u_l := \big( \partial_l f_1, \dots ,\partial_l f_{i-1}, 0, \partial_l f_{i+1}, \dots, \partial_l f_n \big)^T$$ $$\alpha_j := (u_1, \dots, u_{j-1}, e_i, u_{j+1}, \dots, u_n)$$

Now (similar to the differentiation for $(2)$)

$$\frac{\partial}{\partial x_j} (\text{cof}(Df))_{i,j} = \frac{\partial}{\partial x_j} (\det \circ \alpha_j) = \sum\limits_{\substack{k = 1 \\ k \neq j}}^n \det\big(u_1, \dots, \partial_j u_k, \dots, u_n\big)$$

(the term $k=j$ is excluded because $\partial_j e_i = 0$, so that determinant vanishes; in the remaining terms the $j$th column is still $e_i$)

$$\sum\limits_{j=1}^n \frac{\partial}{\partial x_j} (\text{cof}(Df))_{i,j} = \sum\limits_{j=1}^n \sum\limits_{\substack{k = 1 \\ k \neq j}}^n \det\big(u_1, \dots, \partial_j u_k, \dots, u_n\big)$$

Now we exchange the order of summation and swap the $j$th and $k$th columns (i.e., we exchange $e_i$ and $\partial_j u_k$); since the determinant is alternating, this changes the sign:

$$= -\sum\limits_{k=1}^n \sum\limits_{\substack{j = 1 \\ j \neq k}}^n \det\big(u_1, \dots, \partial_j u_k, \dots, u_n\big)$$

Now, since $\partial_j u_k = \partial_k u_j$ this leads us to

$$= -\sum\limits_{k=1}^n \sum\limits_{\substack{j = 1 \\ j \neq k}}^n \det\big(u_1, \dots, \partial_k u_j, \dots, u_n\big) = - \sum\limits_{k=1}^n \frac{\partial}{\partial x_k} (\text{cof}(Df))_{i,k}$$

So $\sum_{j=1}^n \frac{\partial}{\partial x_j} (\text{cof}(Df))_{i,j}$ equals its own negative and hence vanishes, which proves $(1)$. (Unless you find mistakes; in that case please let me know.)

I also found an alternative proof using differential forms here.