[Math] Show that satisfaction of Cauchy-Riemann Equations in polar coordinates implies analyticity

complex-analysis

Suppose that $U(r,\theta),V(r, \theta)$ are continuously differentiable functions on some polar rectangle $R = \{(r, \theta) \colon r \in (a,b), \theta \in (\theta_1, \theta_2) \} \subseteq \mathbb{R}^2.$ Furthermore, assume that $U$ and $V$ satisfy the polar Cauchy-Riemann equations in $R$:

$$rU_r = V_\theta, \qquad U_\theta = -rV_r.$$

If we now view $R$ as a subset of $\mathbb{C}$ rather than $\mathbb{R}^2$, we can define the function $f : R \to \mathbb{C}$ by $f(re^{i\theta}) = U(r, \theta) + iV(r,\theta).$ Prove that $f$ is analytic on $R$.

I am linking this problem to a previous post: Proof of Cauchy Riemann Equations in Polar Coordinates. I believe I am asking a similar question. However, to the best of my knowledge, the answers to the linked post actually establish the converse of my statement above. That is, they show that analyticity of $f$ implies that these polar Cauchy-Riemann equations are satisfied.

Here's what I have so far: I do know that a function $f(x + iy) = U(x,y) + iV(x,y)$ is analytic when its real and imaginary parts are continuously differentiable and satisfy the rectangular Cauchy-Riemann equations $U_x = V_y, U_y = -V_x$. The proof I have seen of this fact comes from Stein, and the key to the argument is to expand $U$ and $V$ via Taylor's formula for $C^1$ functions. That is, for a point $(x_0, y_0) \in \mathbb{R}^2$, we can write:

$$U(x,y) = U(x_0,y_0) + U_x(x_0,y_0)(x - x_0) + U_y(x_0, y_0)(y - y_0) + R(x,y),$$

and a similar formula for $V(x,y)$. Here, $R(x,y)$ is a remainder term with $\frac{R(x,y)}{|(x,y) - (x_0,y_0)|} \to 0$ as $(x,y) \to (x_0,y_0)$. I'm wondering if there is some way I can adapt this proof from Stein to the polar case?
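Concretely, the way I understand the conclusion of that rectangular argument (this is my own summary rather than a quotation from Stein; all partial derivatives are evaluated at $(x_0,y_0)$ and $z_0 = x_0 + iy_0$): the Cauchy-Riemann equations collapse the two expansions into a single complex-linear term,

$$
\begin{aligned}
f(z)-f(z_0) &= (U_x+iV_x)(x-x_0)+(U_y+iV_y)(y-y_0)+o(|z-z_0|)\\
&= (U_x+iV_x)(x-x_0)+(-V_x+iU_x)(y-y_0)+o(|z-z_0|)\\
&= (U_x+iV_x)\bigl[(x-x_0)+i(y-y_0)\bigr]+o(|z-z_0|)\\
&= (U_x+iV_x)(z-z_0)+o(|z-z_0|),
\end{aligned}
$$

so the difference quotient converges and $f'(z_0)=U_x(x_0,y_0)+iV_x(x_0,y_0)$. It is this last collapsing step that I don't see how to reproduce directly in polar coordinates.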

Hints or solutions are greatly appreciated.

Best Answer

Here's the Taylor expansion approach. First, we need a linear algebra fact:

Any $\mathbb R$-linear map $T:\mathbb C\to\mathbb C$ can be written as $Tz=\alpha z+\beta\bar z$ for some (unique) $\alpha,\beta\in \mathbb C$.

Uniqueness should be clear, and existence follows by first writing $T(x+iy)=\gamma x+\delta y$ with complex $\gamma,\delta$ and then replacing $x=(z+\bar z)/2$, $y=(z-\bar z)/(2i)$.
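To spell that substitution out (this is just the routine computation behind the existence claim, nothing specific to the problem):

$$
Tz=\gamma\,\frac{z+\bar z}{2}+\delta\,\frac{z-\bar z}{2i}
=\frac{\gamma-i\delta}{2}\,z+\frac{\gamma+i\delta}{2}\,\bar z,
$$

so $\alpha=(\gamma-i\delta)/2$ and $\beta=(\gamma+i\delta)/2$. These are the Wirtinger-type combinations: for the differential of $f$, $\alpha$ plays the role of $\partial f/\partial z$ and $\beta$ of $\partial f/\partial\bar z$, which is why $\beta=0$ is exactly complex differentiability.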

This linear algebra fact is worth remembering, as it simplifies various computations in complex analysis.

Back to the problem. The first-order real Taylor expansion of $f$ at a point $z\in\mathbb C$ takes the form

$$ f(z+h)=f(z)+\alpha h+\beta \bar h+o(|h|), $$

and we want to show that $\beta=0$ here. The equations you are given say that $r f_r + i f_\theta = 0$. Write $z+h=r e^{i\theta}$, and take the $r$- and $\theta$-derivatives of $\alpha h+\beta \bar h$:

$$ \begin{align} f_r &= \alpha e^{i\theta} + \beta e^{-i\theta}, \\ f_\theta & = \alpha ir e^{i\theta} - \beta i r e^{-i\theta}, \\ rf_r+if_\theta & = 2\beta r e^{-i\theta}. \end{align} $$

Since $r>0$, this forces $\beta=0$, as desired.
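For completeness, here is how the two steps asserted above check out, in the notation of the question (a routine verification, added for the reader's convenience). First, the polar Cauchy-Riemann equations do give $rf_r+if_\theta=0$:

$$
rf_r+if_\theta=(rU_r+irV_r)+i(U_\theta+iV_\theta)=(rU_r-V_\theta)+i(rV_r+U_\theta)=0,
$$

using $rU_r=V_\theta$ and $U_\theta=-rV_r$. Second, once $\beta=0$ at every $z\in R$, the expansion reads $f(z+h)=f(z)+\alpha h+o(|h|)$, so

$$
\lim_{h\to 0}\frac{f(z+h)-f(z)}{h}=\alpha
$$

exists at every point of $R$, which is exactly the analyticity of $f$ on $R$.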
