Asymptotics of Integral for Large x – Real Analysis

asymptotics, ca.classical-analysis-and-odes, real-analysis

I'm interested in the asymptotics of
$$\int_0^\infty \frac{x^{2z}}{\Gamma(1+z)}\,dz$$
as $x\to\infty$. I expect the result to behave similarly to $e^{x^2}=\sum_{k\ge 0}\frac{x^{2k}}{k!}$. However, I'm not quite sure how to develop the leading asymptotics of the integral. My first thought was that, for large $x$, the integral should be dominated by the contribution from a small ball around the maximum of the integrand $f(z)=\frac{e^{2z\log x}}{\Gamma(1+z)}$.
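
As a quick numerical sanity check (a sketch using `mpmath`; the split of the integration range, the precision, and the sample values of $x$ are ad hoc choices of mine), the ratio of the integral to $e^{x^2}$ does seem to approach $1$:

```python
# Sketch: numerically compare the integral with e^{x^2}.
# The integration range is split at z = x^2, near where the integrand peaks;
# this split, the precision, and the sample x values are arbitrary choices.
import mpmath as mp

mp.mp.dps = 30  # working precision

def integral(x):
    x = mp.mpf(x)
    return mp.quad(lambda z: x**(2*z) / mp.gamma(1 + z), [0, x**2, mp.inf])

for x in [2, 4, 6]:
    print(x, mp.nstr(integral(x) / mp.exp(mp.mpf(x)**2), 10))
```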

To find this maximum, I computed
$$f'(z)=\left(2\log(x)-\psi^{(0)}(1+z)\right)f(z)$$
with $\psi^{(0)}(z)$ the logarithmic derivative of $\Gamma(z)$. Since $f$ never vanishes, the maximum must occur at the $z\in(0,\infty)$ satisfying $\psi^{(0)}(1+z)=2\log x$. Since $x\to\infty$ (so this critical point tends to infinity as well), I may use the asymptotics $\psi(z)=\log z +O(1/z)$ for large positive $z$ and solve $2\log x= \log(1+z) +O(1/z)$. Exponentiating gives $x^2=1+z+O(1)$, so $\operatorname{arg\,max}_z f(z)\sim x^2$ as $x\to\infty$. Let $z_0=x^2$. The integral should then be dominated by
$$\frac{e^{2x^2\log x}}{\Gamma(1+x^2)}\int_{z_0-\epsilon}^{z_0+\epsilon} e^{2(z-z_0)\log x} \frac{\Gamma(1+z_0)}{\Gamma(1+z)}\,dz.$$
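
As a quick numerical check that the maximizer is indeed near $x^2$ (again a sketch with `mpmath`; the sample values of $x$ and the starting guess are arbitrary):

```python
# Sketch: solve psi(1 + z) = 2*log(x) numerically and compare the root with x^2.
import mpmath as mp

mp.mp.dps = 25

for x in [10, 100, 1000]:
    x = mp.mpf(x)
    root = mp.findroot(lambda z: mp.psi(0, 1 + z) - 2*mp.log(x), x**2)
    print(x, mp.nstr(root, 10), mp.nstr(root / x**2, 10))
```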
However, I'm not sure what size to take $\epsilon$ as a function of $x$. I do know that the expression outside of the integral is asymptotic to
$$(2\pi)^{-1/2} \frac{e^{x^2}}{x}.$$
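Indeed, this follows from Stirling's formula $\Gamma(1+n)\sim\sqrt{2\pi n}\,(n/e)^n$ with $n=x^2$:
$$\frac{e^{2x^2\log x}}{\Gamma(1+x^2)}=\frac{x^{2x^2}}{\Gamma(1+x^2)}\sim\frac{x^{2x^2}}{\sqrt{2\pi}\,x\,(x^2/e)^{x^2}}=\frac{e^{x^2}}{\sqrt{2\pi}\,x}.$$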
I'm not sure how to deal with the actual integral though. Input is much appreciated.

Edit: Math Stack Exchange cross-post

Best Answer

$\newcommand\Ga\Gamma$Let $g(x)$ denote your integral. Then $$g(x)\sim e^{x^2} \tag{1}\label{1} $$ as $x\to\infty$.

Proof:
$$g(x)=\int_0^\infty dz\,\frac{x^{2z}}{\Ga(1+z)}. \tag{1.5}\label{1.5} $$
So, for $x>0$, differentiating under the integral sign and then using $z/\Ga(1+z)=1/\Ga(z)$ and the substitution $z=1+t$,
$$g'(x)=\int_0^\infty dz\,\frac{2zx^{2z-1}}{\Ga(1+z)} =\int_0^\infty dz\,\frac{2x^{2z-1}}{\Ga(z)} \\ =2\int_{-1}^\infty dt\,\frac{x^{2t+1}}{\Ga(1+t)} =2x\int_{-1}^\infty dz\,\frac{x^{2z}}{\Ga(1+z)}.$$

So (and this is the key), $g$ is a solution of the ODE
$$g'(x)=2xg(x)+h(x),$$
where
$$h(x):=2x\int_{-1}^0 dz\,\frac{x^{2z}}{\Ga(1+z)}.$$
Also, $g(0)=0$. So, solving this linear ODE with the integrating factor $e^{-x^2}$,
$$g(x)=e^{x^2}G(x), \tag{2}\label{2} $$
where
$$G(x):=\int_0^x du\, e^{-u^2}h(u). \tag{3}\label{3} $$

As $x\to\infty$, by the substitution $v=u^2$ and Tonelli's theorem,
$$G(x)\to\int_0^\infty du\, e^{-u^2}h(u) =\int_0^\infty du\, e^{-u^2}2u\int_{-1}^0 dz\,\frac{u^{2z}}{\Ga(1+z)} \\ =\int_0^\infty dv\, e^{-v}\int_{-1}^0 dz\,\frac{v^z}{\Ga(1+z)} =\int_{-1}^0 \frac{dz}{\Ga(1+z)}\int_0^\infty dv\, v^z e^{-v} \\ =\int_{-1}^0 \frac{dz}{\Ga(1+z)}\,\Ga(1+z)=1. $$
Now \eqref{1} follows from \eqref{2}. $\quad\Box$
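
As a numerical spot-check of the key ODE $g'(x)=2xg(x)+h(x)$ (a sketch with `mpmath`; the sample point $x=2$, the split of the integration range, and the precision are ad hoc choices), one can compare a numerical derivative of $g$ with the right-hand side:

```python
# Sketch: spot-check g'(x) = 2*x*g(x) + h(x) at x = 2 via numerical differentiation.
import mpmath as mp

mp.mp.dps = 40

def g(x):
    x = mp.mpf(x)
    return mp.quad(lambda z: x**(2*z) / mp.gamma(1 + z), [0, x**2, mp.inf])

def h(x):
    x = mp.mpf(x)
    return 2*x*mp.quad(lambda z: x**(2*z) / mp.gamma(1 + z), [-1, 0])

x0 = mp.mpf(2)
print(mp.nstr(mp.diff(g, x0), 10))        # numerical g'(x0)
print(mp.nstr(2*x0*g(x0) + h(x0), 10))    # right-hand side of the ODE
```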

Remark: Note that $-\Ga'(1)$ is Euler's constant $\gamma>0$. Because of this and because $\Ga$ is log-convex on $(0,\infty)$, we see that $\Ga$ is decreasing on $(0,1]$. So, for $x\ge1$ we have $$h(x)<\frac{2x}{\Ga(1)}\int_{-1}^0 dz\,x^{2z}<2x.$$ So, by \eqref{2}--\eqref{3}, for $x\ge1$ the relative error of the asymptotic approximation \eqref{1} is $$0<R(x):=\int_x^\infty du\, e^{-u^2}h(u) <\int_x^\infty du\, e^{-u^2}2u=e^{-x^2},$$ which goes to $0$ very fast as $x\to\infty$.
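
For a numerical illustration (again a sketch with `mpmath`; the sample values of $x$ and the precision are ad hoc choices), one can compare $R(x)$ with the bound $e^{-x^2}$:

```python
# Sketch: compare R(x) = int_x^inf e^{-u^2} h(u) du with the bound e^{-x^2}.
import mpmath as mp

mp.mp.dps = 25

def h(u):
    u = mp.mpf(u)
    return 2*u*mp.quad(lambda z: u**(2*z) / mp.gamma(1 + z), [-1, 0])

for x in [1, 2, 3]:
    R = mp.quad(lambda u: mp.exp(-u**2) * h(u), [x, mp.inf])
    print(x, mp.nstr(R, 8), mp.nstr(mp.exp(-mp.mpf(x)**2), 8))
```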

This bound on the relative error seems hard (if possible at all) to obtain by the Laplace method or Watson's lemma applied to the original integral expression \eqref{1.5} of $g(x)$, which gives an asymptotic expansion in integral powers of $x$.

However, one can apply Watson's lemma to the expression \eqref{2} of $g(x)$ to get the following asymptotic expansion of the absolute error of approximation \eqref{1}, in integral powers of $\ln x$: $$E(x):=e^{x^2}-g(x) =e^{x^2}\int_x^\infty du\, e^{-u^2}h(u) \sim\sum_{k\ge0}\frac{(-1)^k c_k}{(2\ln x)^{k+1}} \tag{4}\label{4}$$ as $x\to\infty$, where $c_k:=\frac{d^k}{dz^k}\frac1{\Ga(1+z)}\big|_{z=0}$, so that $c_0=1$. $\quad\Box$
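
For a numerical illustration of \eqref{4} (a sketch with `mpmath`; the sample values of $x$, the precision rule, and the split of the integration range are ad hoc choices), one can compare $E(x)$ with the first few partial sums of the series:

```python
# Sketch: compare E(x) = e^{x^2} - g(x) with partial sums of the expansion (4).
# E(x) arises from cancelling two huge numbers, so the precision is raised with x.
import math
import mpmath as mp

def g(x):
    x = mp.mpf(x)
    return mp.quad(lambda z: x**(2*z) / mp.gamma(1 + z), [0, x**2, mp.inf])

def c(k):
    # c_k = k-th derivative of 1/Gamma(1+z) at z = 0 (c_0 = 1, c_1 = Euler's gamma, ...)
    return mp.diff(lambda z: 1/mp.gamma(1 + z), 0, k)

def E_partial(x, m):
    L = 2*mp.log(x)
    return mp.fsum((-1)**k * c(k) / L**(k + 1) for k in range(m + 1))

for x in [5, 10]:
    mp.mp.dps = int(x**2 / math.log(10)) + 25  # enough digits to survive the cancellation
    E = mp.exp(mp.mpf(x)**2) - g(x)
    print(x, [mp.nstr(E / E_partial(x, m), 6) for m in range(3)])
```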


For an illustration, below are the graphs $\{(x,g(x)/e^{x^2})\colon0\le x\le 4\}$ (solid black), $\{(x,E(x)/E_0(x))\colon5\le x\le100\}$ (solid red), $\{(x,E(x)/E_1(x))\colon5\le x\le100\}$ (solid green), and $\{(x,E(x)/E_2(x))\colon5\le x\le100\}$ (solid blue), where (cf. \eqref{4}) $E_m(x):=\sum_{k=0}^m\frac{(-1)^k c_k}{(2\ln x)^{k+1}}$:

*(plots omitted)*
