Does the Analytic Continuation of a Series Have a Branch Cut if the Coefficient Function Has a Pole?

analytic-continuation, divergent-series, integration

I suspect the answer to the title question is 'no', but I'm hoping to find an explicit counterexample. I also require that $\sum f(n) x^n$ have a finite radius of convergence; otherwise the question would be trivial, since we could pick $f$ so that the sum is entire.

My thoughts

The claim is at least plausible: $\sum_{n=1}^\infty \frac{x^n}{n^k}$ has a branch cut for every $k \in \mathbb{N}$, and since the Lerch transcendent also has branch cuts, $\sum_{n=1}^\infty \frac{x^n}{(n+a)^k}$ produces a branch cut as well. This suggests that meromorphic functions $f(n)$ might create branch cuts in general, since we could write such an $f$ as a holomorphic function plus terms of the form $\sum_{j} \frac{a_j}{(z+b_j)^{k_j}}$, each of which perhaps contributes a branch cut.
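The branch cuts in the $\frac{1}{n^k}$ examples can be checked numerically. The sketch below (my addition, using mpmath's `polylog`, which continues $\mathrm{Li}_k(x)=\sum_{n\ge1} x^n/n^k$ past the unit disk) evaluates the continuation just above and just below the cut $[1,\infty)$ and shows the values jump:

```python
import mpmath as mp

# Branch-cut check for Li_k(x) = sum_{n>=1} x^n / n^k along the cut [1, oo).
# For k = 1 the continuation is exactly -log(1 - x); for k = 2 mpmath's
# polylog supplies the analytic continuation outside the unit disk.
eps = mp.mpf('1e-8')
x = mp.mpf(2)

for k in (1, 2):
    above = mp.polylog(k, x + 1j * eps)
    below = mp.polylog(k, x - 1j * eps)
    print(k, (above - below).imag)  # nonzero jump => branch cut

# k = 1 sanity check against the closed form -log(1 - x)
print(mp.polylog(1, x + 1j * eps), -mp.log(1 - (x + 1j * eps)))
```

The jumps come out to $2\pi$ for $k=1$ and $2\pi\ln x$ for $k=2$, consistent with the known discontinuities of the polylogarithm across $[1,\infty)$.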


This question is born out of the following very non-rigorous method I have been using to obtain the values of branch cuts. To illustrate the method, take the simple series $F(x)=\sum_{n=1}^\infty \frac{(-1)^n }{n} (x-1)^n = -\ln(x)$. We can represent this sum as a contour integral, specifically,
$$F(x) = -\int_{1/2- i \infty}^{1/2 + i \infty} \frac{\csc(\pi z)}{2i} \frac{(x-1)^z}{z}\,dz$$
(the minus sign accounts for the clockwise orientation picked up when the upward contour is closed to the right: the residue of $\csc(\pi z)$ at $z=n$ is $\frac{(-1)^n}{\pi}$, so the clockwise loop recovers $-\sum_{n\ge1}\frac{(-1)^n}{n}(x-1)^n$, and the overall minus restores $F(x)$).
Now, when the original series converges, we close this contour to the right (for $0<x<1$ we technically need to rotate the contour infinitesimally to the right to make it converge, but this is not all that important). However, at exactly $x=0$ the integrand stops decaying in the right half-plane and instead decays in the left half-plane: for $x>0$ the integrand becomes small in the right half-plane, while for $x<0$ it becomes small in the left half-plane.
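The contour representation can be sanity-checked numerically. This sketch (my addition; the helper name `F_contour` is mine) integrates along the vertical line $z = \tfrac12 + it$, where the $\csc(\pi z)$ factor decays like $e^{-\pi|t|}$, so ordinary quadrature converges; the minus sign reflects the clockwise orientation when the upward contour is closed to the right:

```python
import mpmath as mp

# F(x) = -(1/(2i)) * int_{1/2 - i oo}^{1/2 + i oo} csc(pi z) (x-1)^z / z dz,
# parametrized by z = 1/2 + i t, dz = i dt, so the prefactor becomes -1/2.
def F_contour(x):
    def integrand(t):
        z = mp.mpf('0.5') + 1j * t
        return mp.csc(mp.pi * z) * mp.power(x - 1, z) / z
    return -mp.quad(integrand, [-mp.inf, mp.inf]) / 2

x = mp.mpf('1.5')
print(F_contour(x))  # should match the series value
print(-mp.log(x))    # F(x) = -ln(x) inside the radius of convergence
```

For $x=1.5$ both lines agree to high precision, confirming the representation (under the stated sign convention) inside the disk of convergence.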

Therefore, for $x<0$ we are forced to switch the direction of the contour so that it closes to the left. If there are no residues besides those created by $\csc(\pi z)$, then switching the direction of the contour has no effect on the sum (see here: What is the relationship between $\sum_{n=0}^\infty f(n) x^n$ and $-\sum_{n=1}^\infty f(-n) x^{-n}$?). In this case, however, we have the extra residue created by the pole of $\frac{1}{z}$, which contributes $\pm \pi i$ depending on how we choose to represent $(-1)^z$ (i.e. as $e^{\pi i z}$ or $e^{-\pi i z}$). So when we are forced to switch the direction of the contour, we pick up an extra residue, and this residue is what causes the branch cut.

If this reasoning extends generally, then whenever $f(n)$ has a pole and the radius of convergence of $\sum f(n) x^n$ is finite, we would expect a branch cut: evaluating $\sum f(n) x^n$ outside its radius of convergence forces us to switch the direction of the contour, and the swap picks up the extra pole of $f(n)$, which creates the branch cut. Though, to reiterate, I don't actually expect this to be true in general. I think the argument breaks down under certain circumstances, so I'd like to find instances where it fails in order to understand how it breaks down.
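The $\pm \pi i$ ambiguity is visible directly in the continued function $F(x) = -\ln(x)$: for $x<0$, approaching the negative axis from above or below (the two choices of $(-1)^z$) gives values differing by $2\pi i$. A minimal illustration of this jump:

```python
import cmath

# F(x) = -log(x) is the continuation of the series beyond its disk of
# convergence. For x < 0 the value depends on the side from which we
# approach the cut, mirroring the choice (-1)^z = e^{+i pi z} vs e^{-i pi z}.
eps = 1e-9
x = -2.0
upper = -cmath.log(x + 1j * eps)   # imag part approx -pi
lower = -cmath.log(x - 1j * eps)   # imag part approx +pi
print(upper.imag, lower.imag)
print((lower - upper).imag)        # approx 2*pi: the extra residue's contribution
```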

Best Answer

A counterexample is given by $f(z)=\frac{1}{(z+1)\Gamma(-z)}+1$. This function is meromorphic with its only pole at $z=-1$: the reciprocal $\frac{1}{\Gamma(-z)}$ is entire and vanishes at $z=0,1,2,\dots$, so $f(n)=1$ at every positive integer. Hence $\sum_{n=1}^\infty f(n)x^n=\frac{x}{1-x}$, which has radius of convergence $1$ and continues to a function with a simple pole at $x=1$ and no branch cut.
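The counterexample can be checked numerically (a sketch of mine using mpmath; `rgamma` is the entire reciprocal gamma function, and the helper name `f` matches the answer's notation):

```python
import mpmath as mp

# The proposed counterexample: f(z) = 1/((z+1) * Gamma(-z)) + 1.
# rgamma(-z) = 1/Gamma(-z) is entire and vanishes at z = 0, 1, 2, ...
def f(z):
    return mp.rgamma(-z) / (z + 1) + 1

# f(n) = 1 at every positive integer, so sum f(n) x^n = x/(1-x)
print([f(n) for n in range(1, 6)])

x = mp.mpf('0.5')
partial = mp.nsum(lambda n: f(n) * x**n, [1, mp.inf])
print(partial, x / (1 - x))   # both approximately 1.0
```

So $f$ has a pole, the series has a finite radius of convergence, yet the continuation $\frac{x}{1-x}$ has no branch cut.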
