Let $f:\mathbb{R}^2\rightarrow\mathbb{R}^2$ be such that at each point in the plane, if $D$ is the derivative matrix (Jacobian) of $f$, then $D^TD=\lambda I$ for some $\lambda$ and $\det(D)\geq 0$. Notice that if we instead imagine $f$ as a function from $\mathbb{C}\rightarrow\mathbb{C}$ by identifying $f(a,b)$ with $f(a+bi)$, then this condition is equivalent to asking for $f$ to be complex differentiable. Recall that if a function is holomorphic on its entire domain, then it is analytic, and thus all higher-order derivatives exist. Thus if this condition on the derivative matrix $D$ holds, then all higher derivatives of $f$ exist.
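A quick numerical sanity check of this equivalence (my own example, not part of the question): for the holomorphic map $f(z)=z^2$, written in real coordinates as $f(x,y)=(x^2-y^2,\,2xy)$, the Jacobian should satisfy $D^TD=\lambda I$ with $\det(D)\geq 0$.

```python
import numpy as np

def jacobian_f(x, y):
    # Jacobian of f(x, y) = (x^2 - y^2, 2xy), the real form of f(z) = z^2
    return np.array([[2*x, -2*y],
                     [2*y,  2*x]])

D = jacobian_f(1.3, -0.7)
DtD = D.T @ D
lam = DtD[0, 0]                            # candidate scalar lambda
assert np.allclose(DtD, lam * np.eye(2))   # D^T D is a scalar multiple of I
assert np.linalg.det(D) >= 0               # orientation is preserved
```

Here $\lambda = 4(x^2+y^2) = |f'(z)|^2$, which illustrates why $\lambda$ is automatically nonnegative in the plane.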
I would like to know if there exist generalizations of this fact to higher dimensions. For example, say $f:\mathbb{R}^n\rightarrow\mathbb{R}^n$ is such that at each point of $\mathbb{R}^n$, if $D$ is the derivative matrix (Jacobian) of $f$, then $D^TD=\lambda I$ for some $\lambda$ and $\det(D)\geq 0$. Must all higher derivatives of $f$ exist in this case? If not, is there a different condition we can place on $D$ so that all higher derivatives of $f$ must exist?
In the case of looking for stronger conditions on $D$, we could of course pick conditions that trivially force all higher derivatives to exist, for example $D=I$ or $D=0$; however, I would like to know about nontrivial conditions if possible.
Best Answer
Here is the basic definition of Fréchet differentiability in Banach spaces. Let $V,W$ be Banach spaces over the field $\Bbb{F}$ (either $\Bbb{R}$ or $\Bbb{C}$), let $A\subset V$ be open, $a\in A$, and $f:A\to W$. We say $f$ is $\Bbb{F}$-differentiable at $a$ if there exists a continuous $\Bbb{F}$-linear map $Df_a:V\to W$ such that \begin{align} \lim_{h\to 0}\frac{\|f(a+h)-f(a)-Df_a(h)\|_W}{\|h\|_V}=0. \end{align} When $\Bbb{F}=\Bbb{R}$ we say $f$ is real-differentiable at $a$; when $\Bbb{F}=\Bbb{C}$ we say $f$ is holomorphic (complex-differentiable) at $a$.
This definition encompasses all the possible cases. For example, we can take $V=\Bbb{C}^n,W=\Bbb{C}^m$ and we have a notion of holomorphicity in this case. Note that in the case where $V=\Bbb{C}$ ($W$ can be any complex Banach space, for instance $\Bbb{C}$), this is equivalent to the existence of the limit $f'(a):=\lim\limits_{h\to 0}\frac{f(a+h)-f(a)}{h}$. In this case, the relationship between the two notions is that $f'(a)=Df_a(1)$.
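The relationship $f'(a)=Df_a(1)$ can be illustrated numerically (my example): for $f(z)=z^3$ the complex difference quotient approaches $f'(a)=3a^2$, and crucially the limit is the same along every direction $h\to 0$.

```python
# For f(z) = z**3, the complex difference quotient (f(a+h) - f(a))/h
# should approach f'(a) = 3a**2 regardless of the direction of h.
f = lambda z: z**3
a = 1.0 + 2.0j
target = 3 * a**2

for direction in [1.0, 1.0j, (1.0 + 1.0j) / abs(1.0 + 1.0j)]:
    h = 1e-6 * direction
    quotient = (f(a + h) - f(a)) / h
    assert abs(quotient - target) < 1e-4   # direction-independent limit
```

Direction-independence of this limit is exactly what complex-linearity of $Df_a$ encodes, and is what fails for a map that is merely real-differentiable.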
Let us record the following almost trivial theorem: if $V,W$ are complex Banach spaces, $A\subset V$ open, and $f:A\to W$, then $f$ is holomorphic at $a\in A$ if and only if $f$ is real-differentiable at $a$ and the real derivative $Df_a$ is complex-linear (in which case the real and complex derivatives coincide).
This should be clear from the definition, because the only difference between the real and complex cases is whether the linearity being considered is over $\Bbb{R}$ or over $\Bbb{C}$. Now, your question concerns a function defined over the reals and asks about its holomorphicity. To be precise, we should first specify in what way we're identifying $\Bbb{R}^{2n}$ with a complex vector space like $\Bbb{C}^n$. For this reason, we note the following definition: a complex structure on a real vector space $V$ is a real-linear map $J:V\to V$ such that $J^2=-\text{id}_V$.
Here are some simple facts: given a complex structure $J$ on $V$, declaring $(\alpha+i\beta)\cdot v:=\alpha v+\beta J(v)$ makes $V$ into a complex vector space; conversely, on any complex vector space, $v\mapsto iv$ is a complex structure on the underlying real vector space; and a finite-dimensional real vector space admits a complex structure if and only if its dimension is even.
So, let us now take $V=\Bbb{R}^{2n}$ and $W=\Bbb{R}^{2m}$. Fix complex structures $J$ on $V$ and $J'$ on $W$, so that we can consider $V,W$ as either real or complex Banach spaces. Let $A\subset V$ be open and $a\in A$ and $f:A\to W$ a given mapping. Then, by the trivial theorem above, we have that $f$ is holomorphic at $a$ if and only if $f$ is real-differentiable at $a$, and $Df_a$ is a complex-linear transformation. But now, being complex-linear is equivalent to the condition that $J'\circ Df_a= Df_a\circ J$ (i.e., we can "pull out" scalar multiplication by $i$).
If we consider different complex structures, the effect is simply to pre- and post-compose $f$ with real-linear isomorphisms of the domain and target.
For example, suppose $J:\Bbb{R}^{2n}\to\Bbb{R}^{2n}$ is defined by \begin{align} J(x_1,\dots, x_n,y_1,\dots, y_n):= (-y_1,\dots, -y_n,x_1,\dots, x_n), \end{align} and $J'$ defined similarly on $\Bbb{R}^{2m}$. Then, $f$ is holomorphic at $a$ (wrt complex vector space structure defined by $J,J'$) if and only if $f$ is real-differentiable at $a$ and the matrix representation (relative to standard ordered bases of $\Bbb{R}^{2n}$ and $\Bbb{R}^{2m}$) satisfies \begin{align} \begin{pmatrix} 0&-I_m\\ I_m&0 \end{pmatrix}\cdot [Df_a] &= [Df_a]\cdot \begin{pmatrix} 0&-I_n\\ I_n&0 \end{pmatrix}. \end{align}
If on the other hand we consider the complex structure defined by $M:\Bbb{R}^{2n}\to\Bbb{R}^{2n}$, \begin{align} M(x_1,y_1,\dots, x_n,y_n):= (-y_1,x_1,\dots, -y_n,x_n), \end{align} and likewise defining $M'$ on $\Bbb{R}^{2m}$, then $f$ is holomorphic at $a$ (wrt the complex vector space structure defined by $M,M'$) if and only if $f$ is real-differentiable at $a$ and \begin{align} \underbrace{\begin{pmatrix} J_1 & & \\ & \ddots &\\ & & J_1 \end{pmatrix}}_{\text{$m$ times }} \cdot [Df_a] &= [Df_a]\cdot \underbrace{\begin{pmatrix} J_1 & & \\ & \ddots &\\ & & J_1 \end{pmatrix}}_{\text{$n$ times}}, \end{align} where $J_1=\begin{pmatrix}0&-1\\1&0\end{pmatrix}$.
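The commutation condition can be checked concretely (this example is mine): take $n=2$, $m=1$ and the holomorphic map $f(z_1,z_2)=z_1z_2$, i.e. $f(x_1,y_1,x_2,y_2)=(x_1x_2-y_1y_2,\ x_1y_2+y_1x_2)$ in the interleaved coordinates used to define $M$ and $M'$.

```python
import numpy as np

J1 = np.array([[0.0, -1.0],
               [1.0,  0.0]])

def Df(x1, y1, x2, y2):
    # Jacobian of f(z1, z2) = z1*z2; columns ordered (x1, y1, x2, y2)
    return np.array([[x2, -y2, x1, -y1],
                     [y2,  x2, y1,  x1]])

M  = np.kron(np.eye(2), J1)   # block-diagonal J1, repeated n = 2 times
Mp = J1                       # m = 1, so M' is a single J1 block

D = Df(0.3, -1.1, 2.0, 0.5)
assert np.allclose(Mp @ D, D @ M)   # M'.Df = Df.M, i.e. f is holomorphic
```

Note the Jacobian consists of $2\times 2$ blocks of the form $\begin{pmatrix}p&-q\\q&p\end{pmatrix}$, which is exactly the real-matrix representation of multiplication by the complex number $p+iq$.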
Now, having established the definitions, note that if we instead start with a function $f:A\subset \Bbb{C}^n\to\Bbb{C}^m$ then $f$ is holomorphic at $a$ if and only if each of its component functions $f_1,\dots, f_m$ is holomorphic at $a$ (according to the first definition). Thus, in what follows, we may as well assume $m=1$. The conditions above boil down to the appropriate Cauchy-Riemann equations \begin{align} \frac{\partial f}{\partial x_k}(a)+i\frac{\partial f}{\partial y_k}(a)&=0 \qquad \left(1\leq k \leq n\right). \end{align} (the different matrices arose above simply because of the manner in which we identified $\Bbb{R}^{2n}$ with $\Bbb{C}^n$).
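These Cauchy-Riemann equations can be verified symbolically for a concrete holomorphic function (example mine): $f(z_1,z_2)=z_1z_2$ with $z_k=x_k+iy_k$.

```python
import sympy as sp

# Check that df/dx_k + i*df/dy_k = 0 for k = 1, 2 when
# f(z1, z2) = z1*z2 is expanded in real coordinates.
x1, y1, x2, y2 = sp.symbols('x1 y1 x2 y2', real=True)
f = (x1 + sp.I*y1) * (x2 + sp.I*y2)

for xk, yk in [(x1, y1), (x2, y2)]:
    assert sp.simplify(sp.diff(f, xk) + sp.I * sp.diff(f, yk)) == 0
```

In the standard notation $\partial f/\partial\bar{z}_k=\frac{1}{2}\left(\frac{\partial}{\partial x_k}+i\frac{\partial}{\partial y_k}\right)f$, the system says precisely that $f$ is annihilated by all the $\partial/\partial\bar{z}_k$.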
So, now we can try to prove that holomorphic on $A$ implies analytic on $A$ by mimicking the one-dimensional proof: use a generalized form of Cauchy's integral formula (applied in each variable separately, over a polydisc) to write $f$ as a power series in $n$ variables, the coefficients being given by appropriate integrals. From here, several of the basic results generalize simply by going over the one-dimensional proofs and modifying them appropriately.
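As a one-variable numerical illustration of the coefficient formula driving this argument (my example, not from the answer): for $f(z)=e^z$, the contour integral $c_k=\frac{1}{2\pi i}\oint_{|z|=1}\frac{f(z)}{z^{k+1}}\,dz$ should recover the Taylor coefficient $1/k!$.

```python
import numpy as np
from math import factorial

def taylor_coeff(f, k, n_points=2000):
    # Discretize the unit circle z = e^{it}; since dz = i*z dt,
    # (1/2*pi*i) * f(z) * z^{-(k+1)} dz  reduces to  (1/2*pi) * f(z) * z^{-k} dt,
    # and the mean over equally spaced t is the trapezoidal rule.
    t = np.linspace(0.0, 2.0 * np.pi, n_points, endpoint=False)
    z = np.exp(1j * t)
    return np.mean(f(z) * z**(-k))

for k in range(6):
    assert abs(taylor_coeff(np.exp, k) - 1.0 / factorial(k)) < 1e-10
```

The trapezoidal rule converges spectrally fast for periodic integrands, which is why even this crude discretization reproduces the coefficients essentially to machine precision.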