Definite integral of Gaussian times $e^{-e^{-x}}$

integration

I had the following problem

$$
\int \limits_{- \infty}^\infty \mathrm{d} x \exp \left( - \frac{x^2}{2 \sigma} - 2 x \right) \exp \left( - E e^{-x} \right), \quad \sigma, E > 0
$$

By substituting $t = x - \log E$ I got rid of the constants in the double exponential and arrived at this form (times some constant, which is not important right now)
$$
\int \limits_{- \infty}^\infty \mathrm{d} t \exp \left( - \alpha t^2 - \beta t - e^{-t} \right)
$$
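
A quick numerical sanity check of this reduction (a Python/SciPy sketch; the explicit values $\alpha = 1/(2\sigma)$, $\beta = 2 + \log E / \sigma$ and the pulled-out constant are my own bookkeeping, so worth double-checking):

```python
import numpy as np
from scipy.integrate import quad

# Reduction check (my own bookkeeping): with t = x - log(E) one gets
# alpha = 1/(2*sigma), beta = 2 + log(E)/sigma and an overall constant
# C = exp(-(log(E)**2/(2*sigma) + 2*log(E))).
sigma, E = 1.3, 2.5
L = np.log(E)
alpha, beta = 1 / (2 * sigma), 2 + L / sigma
C = np.exp(-(L**2 / (2 * sigma) + 2 * L))

with np.errstate(over="ignore"):  # the double-exponential tail overflows harmlessly to 0
    original, _ = quad(lambda x: np.exp(-x**2 / (2 * sigma) - 2 * x - E * np.exp(-x)),
                       -np.inf, np.inf)
    reduced, _ = quad(lambda t: np.exp(-alpha * t**2 - beta * t - np.exp(-t)),
                      -np.inf, np.inf)

print(original, C * reduced)  # the two numbers should agree
```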

I can't make progress from here. I also tried $e^{-t} = u$ ($t = - \log u$)
$$
\int \limits_0^\infty \frac{\mathrm{d} u}{u} \exp \left( - \alpha \log^2 u + \beta \log u - u \right) = \int \limits_0^\infty \mathrm{d} u \exp \left( - \alpha \log^2 u + (\beta-1) \log u - u \right)
$$
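
(A quick numerical check of this change of variables, with the limits becoming $0$ to $\infty$ since $u = e^{-t} > 0$; the $\alpha$, $\beta$ below are arbitrary test values.)

```python
import numpy as np
from scipy.integrate import quad

alpha, beta = 0.5, 1.0
with np.errstate(over="ignore"):  # e^{-t} overflow in the far tail is harmless
    t_form, _ = quad(lambda t: np.exp(-alpha * t**2 - beta * t - np.exp(-t)),
                     -np.inf, np.inf)
u_form, _ = quad(lambda u: np.exp(-alpha * np.log(u)**2 + (beta - 1) * np.log(u) - u),
                 0, np.inf)
print(t_form, u_form)  # both forms should agree
```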

I tried splitting the integrand with $e^{-u}$ as one factor and the rest as the other, but both choices in the integration by parts yield horrible things (one yields the error function, the other an even more complicated function).

Is there a substitution that would reveal that this integral depends only on some combination of the constants, so that it's a one-parameter problem, or is this truly a two-parameter problem?

It being a one-parameter problem would be neat, but even a substitution that turns it into a known function of some combination of parameters times a numerical constant given by some integral (similar to how the Gaussian integral is $1/\sqrt{\alpha}$ times $\sqrt{\pi}$) would be pretty rad.

I also tried the following expansion
$$
e^{-u} = \sum_{n = 0}^\infty \frac{(-1)^n u^n}{n!}
$$

and therefore
$$
\int \limits_0^\infty \mathrm d u \exp \left( - \alpha \log^2 u + (\beta-1) \log u - u \right) "=" \sqrt{\frac{\pi}{\alpha}} \sum_{n = 0}^\infty \frac{(-1)^n}{n!} \exp \left( \frac{(\beta + n)^2}{4 \alpha} \right)
$$

but the right-hand side obviously does not converge, since $\exp \left( (\beta + n)^2 / (4 \alpha) \right)$ grows much faster than $n!$ :/ There is also perhaps a way to expand $\exp \left( - e^{-t} \right)$ (from the previous form of the integral) into an infinite series and go from there (hoping that the resulting series converges after integration), but I don't know how to do that.
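
(To see the divergence numerically: on a log scale the $n$-th term's magnitude, $\exp\left((\beta+n)^2/(4\alpha)\right)/n!$, keeps growing; the $\alpha$, $\beta$ below are arbitrary.)

```python
import numpy as np
from scipy.special import gammaln

# log of the n-th term magnitude: (beta + n)**2/(4*alpha) - log(n!)
# The quadratic in n beats log(n!) ~ n*log(n), so the terms blow up.
alpha, beta = 0.5, 1.0
for n in [0, 5, 10, 20, 50]:
    print(n, (beta + n)**2 / (4 * alpha) - gammaln(n + 1))  # strictly increasing
```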

I guess at this point it's more or less clear that there is no combination of special and elementary functions that would describe the result of this integration, but even so I'd be interested in the asymptotics as $\beta \to \infty$ while $\alpha$ is held constant. I don't know how to approach such a task.

Best Answer

The asymptotics for the function
$$ I(a,b)=\int_{-\infty}^\infty \exp\left(-(a\,t^2+b\,t+e^{-t})\right) dt $$
can easily be handled by the classic saddle point method, for $a>0$, $b>0$. Define
$$ h(t)=a\,t^2+b\,t+e^{-t} \quad \text{so} \quad h'(t)=2a\,t+b-e^{-t}. $$
The saddle point is where $h'(t_0)=0$, which can be explicitly solved in terms of the Lambert W function,
$$ (1) \quad t_0=-\frac{b}{2a}+W \quad \text{where} \quad W=\text{ProductLog}\left[0,\frac{e^{b/(2a)}}{2a}\right] $$
in Mathematica notation. Expand $h(t)$ around the saddle point, i.e., $h(t) \approx h_0 + h_2(t-t_0)^2$; in particular,
$$ (2) \quad h_0 = e^{-t_0}-\frac{b^2}{4a}+a\,W^2 \quad \text{and} \quad h_2=a+ \tfrac{1}{2}e^{-t_0}. $$
It is easily shown that
$$ \int_{-\infty}^\infty \exp\left(-(h_0+ h_2(t-t_0)^2)\right) dt = e^{-h_0}\sqrt{\frac{\pi}{h_2}}. $$
Thus the approximation to $I(a,b)$ is the previous formula with the particular values of the parameters found in (1) and (2).

There are technicalities to get a rigorous proof, like showing that the terms beyond the quadratic expansion contribute only a small amount. I was lazy and just checked it numerically, and got good agreement (~1%) for modest $a$ and $b$. This means the formula has a uniform character, not just validity for large $b$. If you have a large $b$, it might be useful to use the known asymptotics of the Lambert W function,

$$ W(e^x) \sim x-(1-1/x)\log{x}, \quad x=\frac{b}{2a}-\log(2a). $$
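
For what it's worth, here is a minimal sketch of such a numerical check (my own script, not the one used above; it assumes `scipy.special.lambertw` for the principal branch $W_0$ and `scipy.integrate.quad` for the reference value):

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import lambertw


def I_exact(a, b):
    """Direct numerical evaluation of I(a, b) = ∫ exp(-(a t^2 + b t + e^{-t})) dt."""
    with np.errstate(over="ignore"):  # e^{-t} tail overflows harmlessly to inf -> exp(-inf) = 0
        val, _ = quad(lambda t: np.exp(-(a * t**2 + b * t + np.exp(-t))),
                      -np.inf, np.inf)
    return val


def I_saddle(a, b):
    """Saddle-point approximation e^{-h0} * sqrt(pi / h2), with t0 from (1) and h0, h2 from (2)."""
    W = lambertw(np.exp(b / (2 * a)) / (2 * a)).real   # principal branch W_0
    t0 = -b / (2 * a) + W
    h0 = np.exp(-t0) - b**2 / (4 * a) + a * W**2
    h2 = a + np.exp(-t0) / 2
    return np.exp(-h0) * np.sqrt(np.pi / h2)


for a, b in [(0.5, 1.0), (1.0, 3.0), (2.0, 10.0)]:
    exact, approx = I_exact(a, b), I_saddle(a, b)
    print(f"a={a}, b={b}: quad={exact:.6g}, saddle={approx:.6g}, "
          f"rel. err. = {abs(approx - exact) / exact:.2%}")

# Quick look at the quoted Lambert W asymptotic (keep x modest so e^x fits in a double):
for x in [5.0, 20.0, 100.0]:
    print(x, lambertw(np.exp(x)).real, x - (1 - 1 / x) * np.log(x))
```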
