On different integral representations of $(1+x^2)^{-1/2}$ via Bessel functions

bessel-functions, definite-integrals, fourier-analysis, laplace-transform, special-functions

The function in the title of this question has (at least) two different integral representations in terms of Bessel functions. First, we have a Fourier expansion$$\frac{1}{\sqrt{1+x^2}}= \frac{2}{\pi} \int_0^\infty K_0(k) \cos(kx)\, dk,$$with $K_0$ being the modified Bessel function of the second kind. Second, we have a Laplace expansion$$\frac{1}{\sqrt{1+x^2}}= \int_0^\infty J_0(k) \exp(-k|x|)\, dk,$$where $J_0$ is the Bessel function of the first kind.
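Both representations can be checked numerically. Here is a quick sketch using SciPy (the helper names `fourier_rep` and `laplace_rep` are mine, not standard):

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import j0, k0

def fourier_rep(x):
    # (2/pi) * ∫_0^∞ K0(k) cos(kx) dk  -- K0 has an integrable log
    # singularity at 0 and decays like e^{-k}, so quad handles it.
    val, _ = quad(lambda k: k0(k) * np.cos(k * x), 0, np.inf)
    return 2.0 / np.pi * val

def laplace_rep(x):
    # ∫_0^∞ J0(k) e^{-k|x|} dk  -- for x != 0 the exponential factor
    # tames the slowly decaying oscillations of J0.
    val, _ = quad(lambda k: j0(k) * np.exp(-k * abs(x)), 0, np.inf)
    return val

for x in (0.5, 1.0, 2.0):
    exact = 1.0 / np.sqrt(1.0 + x * x)
    print(x, exact, fourier_rep(x), laplace_rep(x))
```

Both columns agree with $1/\sqrt{1+x^2}$ to the quadrature tolerance, so the question is not *whether* the Laplace representation holds away from $x=0$, but how to read it *at* $x=0$.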

I am very comfortable with the first (Fourier) expansion: both sides of the equation look smooth at $x=0$, and all is well. The second (Laplace) expansion, on the other hand, seems to imply non-differentiability at $x=0$, contrary to the LHS of the equation.

My question: Is the second expansion in fact valid, and, if so, how should one interpret the different behavior at $x=0$ on the two sides of the equation?

EDIT: It seems that the essence of my question has nothing to do with Bessel functions; rather, it is a question about differentiating a Laplace transform. Is it possible to differentiate under the integral sign, possibly at the cost of introducing additional terms of a distributional nature (delta functions) that take care of possible singularities at $x=0$?

Best Answer

You've got two questions about the absolute Laplace transform, which is defined as $$\mathcal L_a[f(k)](x)=\int^\infty_{0}f(k)e^{-k|x|}\,dk$$ for continuous $f$ with $f(k)=o(e^{\delta k})$ as $k\to\infty$ for every $\delta>0$.

Question 1: How to interpret the different behaviour on both sides of the equation under differentiation at $x=0$?

In $\mathbb R$, if there exists a punctured neighbourhood of $x=0$ on which $\mathcal L_a[f](x)$ and $\displaystyle{\sum^\infty_{n=0}a_n x^{2n}}$ coincide, then both expressions have the same behaviour under differentiation at $x=0$, in the sense that $$\int^\infty_{0}f(k) \left( \frac{\partial}{\partial x} \right)_{\text{left}} e^{-k|x|}dk\bigg\vert_{x\to0^-} =\int^\infty_{0}f(k) \left( \frac{\partial}{\partial x} \right)_{\text{right}} e^{-k|x|}dk\bigg\vert_{x\to0^+} =\frac{d}{dx}\sum^\infty_{n=0}a_n x^{2n}\bigg\vert_{x=0}=0 $$ where the subscripts denote the one-sided derivatives.

Sketch of proof:

The third equality is trivial.

To prove that the right-sided derivative is zero, we want to switch the order of integration and differentiation. Here we will utilise the 'extended Leibniz integral rule':

For $$\frac{d}{dx}\int^\infty_c f(x,t)dt=\int^\infty_c \frac{\partial}{\partial x} f(x,t)dt\qquad x\in(a,b)$$ to hold, the following conditions are sufficient:

  1. $f(x,t)$ and $\displaystyle{\frac{\partial}{\partial x} f(x,t)}$ are continuous in the region $c\le t<\infty$, $a\le x\le b$.

  2. $\displaystyle{\lim_{N\to\infty}\int^N_c \frac{\partial}{\partial x}f(x,t)dt} $ converges uniformly for $x\in(a,b)$.

  3. $\displaystyle{\lim_{N\to\infty}\int^N_c f(x,t)dt} $ converges for $x\in(a,b)$.

It is straightforward to prove that the three conditions are satisfied for $0<x<r$ (where $r$ is the radius of convergence of the Taylor series). Thus $$\begin{align} \int^\infty_{0}f(k) \left( \frac{\partial}{\partial x} \right)_{\text{right}} e^{-k|x|}dk\bigg\vert_{x\to0^+} &=\left( \frac{d}{dx} \right)_{\text{right}}\int^\infty_{0}f(k)e^{-kx}dk\bigg\vert_{x\to0^+} \\ &=\left( \frac{d}{dx} \right)_{\text{right}}\sum^\infty_{n=0}a_n x^{2n}\bigg\vert_{x\to0^+} \\ &=0 \end{align} $$
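For the concrete case $f=J_0$ of the question, one can watch this happen numerically: for $x>0$, differentiating under the integral sign gives $\int_0^\infty(-k)J_0(k)e^{-kx}\,dk=-x/(1+x^2)^{3/2}$, which tends to $0$ as $x\to0^+$. A sketch (the helper name `dlaplace` is mine):

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import j0

def dlaplace(x):
    # Differentiate under the integral sign for x > 0:
    # ∫_0^∞ (-k) J0(k) e^{-kx} dk
    val, _ = quad(lambda k: -k * j0(k) * np.exp(-k * x), 0, np.inf, limit=200)
    return val

for x in (1.0, 0.5, 0.2):
    exact = -x / (1.0 + x * x) ** 1.5   # d/dx of 1/sqrt(1+x^2)
    print(x, dlaplace(x), exact)
```

The two columns match, and both shrink towards $0$ as $x$ decreases, consistent with the vanishing right-sided derivative. (Very small $x$ is avoided here only because the oscillatory tail then converges slowly for the quadrature routine.)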

Similarly, the left-sided derivative is also zero.

Note: It is a little more complicated to show that condition 2 is satisfied.

We aim to prove that for $x>0$, $$\lim_{N\to\infty}\int^N_{0}f(k) \left( \frac{\partial}{\partial x} \right)_{\text{right}} e^{-k|x|}dk=\lim_{N\to \infty}\int^N_{0}-kf(k) e^{-kx}dk\quad\text{converges uniformly.}$$

To this end, we make use of the Cauchy criterion:

for sufficiently large $m>n>N$ (large enough that $|kf(k)|<e^{\delta k}$, which is possible since $kf(k)=o(e^{\delta k})$ as well), $$\begin{align} \left|\int^m_{n}-kf(k) e^{-kx}dk\right| &<\int^m_{n}\left|kf(k) e^{-kx}\right|dk \\ &<\int^m_{n}e^{\delta k} e^{-kx}dk \\ &<2\cdot\frac{e^{(\delta-x)n}}{x-\delta} \\ &<2\cdot\frac{e^{(\delta-x)N}}{x-\delta} \\ &<2\cdot\frac{e^{-\Delta N}}{\Delta} \quad \text{for } x>\delta+\Delta, \Delta>0\\ \end{align} $$

Choosing $N=\frac1{\Delta}\ln\frac 2{\epsilon\Delta}$ shows uniform convergence for $x>\delta+\Delta$, which justifies the exchange of differentiation and integration there. Since $\delta$ and $\Delta$ can be chosen arbitrarily small, condition 2 is satisfied for all $x>0$.
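The key inequality in the chain, $\int_n^m e^{(\delta-x)k}\,dk < 2\,e^{(\delta-x)n}/(x-\delta)$ for $x>\delta$, is elementary, but it is easy to sanity-check numerically. A minimal sketch, with $\delta$ and $x$ chosen arbitrarily for illustration:

```python
import numpy as np
from scipy.integrate import quad

delta, x = 0.1, 0.5                     # any pair with x > delta works
for n in (5, 10, 20):
    m = 10 * n                          # stand-in for "m > n large"
    tail, _ = quad(lambda k: np.exp((delta - x) * k), n, m)
    bound = np.exp((delta - x) * n) / (x - delta)
    print(n, tail, bound, tail < bound)
```

The exact tail is $\bigl(e^{(\delta-x)n}-e^{(\delta-x)m}\bigr)/(x-\delta)$, so it sits below the bound and decays exponentially in $n$, which is exactly what the Cauchy criterion needs.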


Question 2: Is it possible to differentiate under the integral sign, possibly at the cost of introducing additional terms of distributional nature?

Yes.

Suppose $\mathcal L_a[f](x)$ and $\displaystyle{\sum^\infty_{n=0}a_n x^{2n}}$ coincide in a punctured neighbourhood of $x=0$.

Then, indeed, in the sense of distributions, $$\int^\infty_0 kf(k)dk=0$$ and thus $$\frac{d}{dx}\mathcal L_a[f(k)](x)\bigg\vert_{x=0}=-\text{sgn}(x)\int^\infty_0 kf(k)e^{-k|x|}dk\bigg\vert_{x=0}=-\text{sgn}(0)\int^\infty_0 kf(k)dk=0$$

Proof:

It is well-known that $$\int^\infty_0 \delta'(s)e^{-sk}ds=k$$

Therefore, $$\begin{align} \int^\infty_0 kf(k)dk &=\int^\infty_0 \int^\infty_0 \delta'(s)e^{-sk} f(k) \, ds \, dk \\ &=\int^\infty_0 \int^\infty_0 \delta'(s)e^{-sk} f(k) \, dk \, ds \qquad (1)\\ &=\int^\infty_0 \delta'(s)\left(\int^\infty_0 f(k)e^{-sk} dk\right)ds \\ &=\int^\infty_0 \delta'(s)\sum^\infty_{n=0}a_n s^{2n} ds \qquad (2)\\ &=-\int^\infty_0 \delta(s)\left(\sum^\infty_{n=0}a_n s^{2n}\right)' ds \\ &=-\left(\sum^\infty_{n=0}a_n s^{2n}\right)'_{s=0} \\ &=0 \end{align} $$ $(1)$: Changing the order of integration is justified by Fubini's theorem.
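For the question's $f=J_0$, this distributional statement can be made concrete via Abel regularisation: $\int_0^\infty kJ_0(k)e^{-sk}\,dk = s/(1+s^2)^{3/2}$, which tends to $0$ as $s\to0^+$. A SymPy sketch (assuming SymPy's `laplace_transform` evaluates the $J_0$ transform in closed form, which recent versions do):

```python
import sympy as sp

s, k = sp.symbols('s k', positive=True)

# Laplace transform of J0: ∫_0^∞ J0(k) e^{-sk} dk = 1/sqrt(1+s^2)
L = sp.laplace_transform(sp.besselj(0, k), k, s, noconds=True)

# ∫_0^∞ k J0(k) e^{-sk} dk = -dL/ds, expected s/(1+s^2)^(3/2)
g = sp.simplify(-sp.diff(L, s))

print(L)
print(g)
print(sp.limit(g, s, 0, '+'))   # the regularised value of ∫_0^∞ k J0(k) dk
```

The regularised integral vanishing as $s\to0^+$ is exactly the statement $\int_0^\infty kJ_0(k)\,dk=0$ in the distributional sense used above.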

$(2)$: Due to the formula $\displaystyle{\int^\infty_{-\infty}\delta'(x)\varphi(x)dx=-\int^\infty_{-\infty}\delta(x)\varphi'(x)dx}$.