I need the integral
$$f(x)=\lim_{\epsilon\rightarrow0}\int_{\epsilon}^{\infty}\frac{\exp(-u)-\exp(-u^{\alpha}x)}{u}~\mathrm{d}u$$
with $\alpha>0$ in a (statistical learning) application. It looks a lot like
$$\ln(x)=\lim_{\epsilon\rightarrow0}\int_{\epsilon}^{\infty}\frac{\exp(-u)-\exp(-ux)}{u}~\mathrm{d}u$$
which is just $\alpha=1$.
Using
$$I_{u}(x)= \int_{1}^{x}e^{-u\chi}~\mathrm{d}\chi=\left[-\frac{1}{u}e^{-u\chi}\right]_{1}^{x}=-\frac{e^{-ux}-e^{-u}}{u}=\frac{e^{-u}-e^{-ux}}{u},$$
I can prove the case for $\alpha=1$ by
$$\lim_{\epsilon\rightarrow0}\int_{\epsilon}^{\infty}I_{u}(x)~\mathrm{d}u =\lim_{\epsilon\rightarrow0}\int_{\epsilon}^{\infty}\int_{1}^{x}e^{-u\chi}~\mathrm{d}\chi~\mathrm{d}u=\lim_{\epsilon\rightarrow0}\int_{1}^{x}\int_{\epsilon}^{\infty}e^{-u\chi}~\mathrm{d}u~\mathrm{d}\chi \\
=\lim_{\epsilon\rightarrow0}\int_{1}^{x}\left[-\frac{1}{\chi}e^{-u\chi}\right]_{\epsilon}^{\infty}~\mathrm{d}\chi=\lim_{\epsilon\rightarrow0}\int_{1}^{x}\frac{1}{\chi}e^{-\epsilon\chi}~\mathrm{d}\chi \\
=\int_{1}^{x}\left(\lim_{\epsilon\rightarrow0}\frac{1}{\chi}e^{-\epsilon\chi}\right)~\mathrm{d}\chi=\int_{1}^{x}\frac{1}{\chi}~\mathrm{d}\chi=\ln x$$
(swapping the order of integration is justified by Fubini's theorem, since the integrand is continuous and absolutely integrable over the region),
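As a sanity check on the $\alpha=1$ case, the truncated integral can be evaluated numerically. A quick sketch using SciPy quadrature, with `eps` playing the role of the cutoff $\epsilon$ (the function name `frullani` is mine):

```python
import numpy as np
from scipy.integrate import quad

def frullani(x, eps=1e-8):
    """Truncated integral of (exp(-u) - exp(-u*x)) / u over [eps, inf)."""
    val, _ = quad(lambda u: (np.exp(-u) - np.exp(-u * x)) / u,
                  eps, np.inf, limit=200)
    return val

# For small eps this should be very close to ln(x).
print(frullani(5.0), np.log(5.0))
```

The integrand is bounded near $u=0$ (it tends to $x-1$), so the cutoff error is only $O(\epsilon)$.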
but how does that argument transfer to general $\alpha$?
Best Answer
Differentiating under the integral sign,
$$f'(x)=\frac{\mathrm{d} f(x)}{\mathrm{d}x}=\int_{0}^{\infty} \frac{\partial }{\partial x}\left(\frac{e^{-u}-e^{-u^{\alpha}x}}{u}\right) \mathrm{d}u=\int_{0}^{\infty} u^{\alpha-1}e^{-u^{\alpha}x}~\mathrm{d}u.$$
Substituting $z=u^{\alpha}x$ (so $\mathrm{d}z=\alpha u^{\alpha-1}x~\mathrm{d}u$), we get
$$f'(x)=\frac{1}{\alpha x}\int_{0}^{\infty} e^{-z}~\mathrm{d}z=\frac{1}{\alpha x}.\tag{1}$$
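The value in (1) can be checked numerically. A minimal sketch using SciPy's `quad` (the function name is mine):

```python
import numpy as np
from scipy.integrate import quad

def deriv_integral(x, alpha):
    """Integral of u^(alpha-1) * exp(-u^alpha * x) over (0, inf)."""
    val, _ = quad(lambda u: u**(alpha - 1) * np.exp(-(u**alpha) * x), 0, np.inf)
    return val

# Should agree with 1/(alpha*x): here 1/(3*2) = 1/6.
print(deriv_integral(2.0, 3.0))
```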
Writing $y=f(x)$, equation (1) gives the differential equation
$$\frac{\mathrm{d} y}{\mathrm{d}x}=\frac{1}{\alpha x}$$
with the boundary condition
$$y(1)=\int_{0}^{\infty} \frac{e^{-u}-e^{-u^{\alpha}}}{u}~\mathrm{d}u=-\frac{(\alpha-1)\gamma}{\alpha},$$
valid for all $\alpha>0$, where $\gamma$ is the Euler–Mascheroni constant. (To see this, substitute $v=u^{\alpha}$ in the second term: the truncated integral becomes $\int_{\epsilon}^{\infty}\frac{e^{-u}}{u}~\mathrm{d}u-\frac{1}{\alpha}\int_{\epsilon^{\alpha}}^{\infty}\frac{e^{-v}}{v}~\mathrm{d}v$; since $\int_{\epsilon}^{\infty}\frac{e^{-u}}{u}~\mathrm{d}u=-\gamma-\ln\epsilon+O(\epsilon)$, the $\ln\epsilon$ terms cancel and the limit is $-\gamma+\frac{\gamma}{\alpha}=-\frac{(\alpha-1)\gamma}{\alpha}$.)
Integrating (1) with this boundary condition,
$$f(x)=\frac{\ln x}{\alpha}-\frac{(\alpha-1)\gamma}{\alpha}.$$
In particular, for $\alpha=1$ this recovers $f(x)=\ln x$.
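The closed form can be verified against direct quadrature of the original integral. A sketch (function names are mine; `eps` is the truncation cutoff):

```python
import numpy as np
from scipy.integrate import quad

GAMMA = np.euler_gamma  # Euler–Mascheroni constant

def f_numeric(x, alpha, eps=1e-10):
    """Truncated integral of (exp(-u) - exp(-u^alpha * x)) / u over [eps, inf)."""
    val, _ = quad(lambda u: (np.exp(-u) - np.exp(-(u**alpha) * x)) / u,
                  eps, np.inf, limit=200)
    return val

def f_closed(x, alpha):
    """Closed form derived above: ln(x)/alpha - (alpha-1)*gamma/alpha."""
    return np.log(x) / alpha - (alpha - 1) * GAMMA / alpha

for alpha in (1.0, 2.0, 3.0):
    print(alpha, f_numeric(4.0, alpha), f_closed(4.0, alpha))
```

The two columns should agree to several decimal places for any $x>0$ and $\alpha>0$.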