[Math] Characteristic function and moment generating function: differentiating under the integral

characteristic-functions, moment-generating-functions, probability-theory, real-analysis

In order to justify the interchange of the derivative and integral when differentiating a characteristic function, one can use the dominated convergence theorem:

$$\frac{d}{dt} \int e^{itx} P(dx) = \lim_{h \to 0} \frac{1}{h} \int (e^{ihx}-1) e^{itx} P(dx).$$

Since $|e^{ihx}-1| \le |hx|$, we have

$$\frac{1}{|h|} \int |e^{ihx}-1| P(dx) \le \int |x| P(dx),$$
so if we assume the random variable is in $L^1$, we may push the derivative under the integral. Similarly, if the random variable is in $L^k$, then we can push the $k$th derivative under the integral.
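As a numerical sanity check of this interchange (a sketch, not part of the argument): take $X \sim N(0,1)$, whose characteristic function $\varphi(t) = e^{-t^2/2}$ is known in closed form, and compare $\varphi'(t)$ against $i \int x e^{itx} \, P(dx)$ computed by a Riemann sum. The grid, density, and tolerance below are our own choices for illustration.

```python
import numpy as np

# For X ~ N(0,1), phi(t) = exp(-t^2/2); pushing d/dt under the integral
# gives phi'(t) = i * E[X e^{itX}].  We compare the two sides numerically.

grid = np.linspace(-12.0, 12.0, 200001)
dx = grid[1] - grid[0]
density = np.exp(-grid**2 / 2) / np.sqrt(2 * np.pi)   # N(0,1) pdf

def phi_prime_exact(t):
    return -t * np.exp(-t**2 / 2)                     # d/dt exp(-t^2/2)

def i_E_X_exp_itX(t):
    integrand = grid * np.exp(1j * t * grid) * density  # x e^{itx} P(dx)
    return 1j * np.sum(integrand) * dx                  # Riemann sum

t = 0.7
assert abs(phi_prime_exact(t) - i_E_X_exp_itX(t)) < 1e-6
```

The agreement to high precision reflects exactly the $L^1$ hypothesis: $\int |x| \, P(dx) < \infty$ for the Gaussian, so dominated convergence licenses the interchange.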

I am trying to find an analogous statement for moment generating functions, but I am having trouble generalizing the above argument. Under what conditions can we do this for MGFs? Any hints would be appreciated, but I would prefer an argument that uses dominated convergence rather than Leibniz's integral rule.

Best Answer

Denote by

$$M(t) := \int e^{tx} \, \mathbb{P}(dx)$$

the moment generating function of the measure $\mathbb{P}$.

Suppose that there exist $t_0 \in \mathbb{R}$ and $\epsilon>0$ such that $M(t)<\infty$ for all $t \in [t_0-\epsilon,t_0+\epsilon]$. Then

  1. If $t_0>0$ and $\int_{(-\infty,0)} |x| \, \mathbb{P}(dx)<\infty$, then $M$ is differentiable at $t=t_0$.
  2. If $t_0<0$ and $\int_{(0,\infty)} |x| \, \mathbb{P}(dx) <\infty$, then $M$ is differentiable at $t=t_0$.
  3. If $t_0=0$, then $M$ is differentiable at $t = t_0 = 0$.

Proof:

  1. Choose $\epsilon \in (0,1)$ sufficiently small such that $(t_0-\epsilon,t_0+\epsilon) \subseteq (0,\infty)$ (shrinking $\epsilon$ preserves the finiteness of $M$ on $[t_0-\epsilon,t_0+\epsilon]$) and fix $h \in (-\epsilon/2,\epsilon/2)$, $h \neq 0$. It follows from the mean value theorem that $$\left|\frac{e^{(t_0+h)x} -e^{t_0 x}}{h} \right| \leq |x| e^{\zeta x} \tag{1}$$ for some intermediate value $\zeta$ between $t_0$ and $t_0+h$; in particular, $\zeta \in (t_0-\epsilon/2, t_0+\epsilon/2) \subseteq (0,\infty)$. If $x \geq 0$, we get $$\left|\frac{e^{(t_0+h)x} -e^{t_0 x}}{h} \right| \leq x e^{(t_0+\epsilon/2)x}.$$ Since $t_0+\epsilon>t_0+\epsilon/2$, we can choose $C>0$ (not depending on $h$, $x$) such that $x \leq C e^{(\epsilon/2)x}$ for all $x \geq 0$, and therefore $$\left|\frac{e^{(t_0+h)x} -e^{t_0 x}}{h} \right| \leq C e^{(t_0+\epsilon)x} \tag{2}$$ for all $x \geq 0$. For $x \leq 0$, $(1)$ yields, since $\zeta>0$ implies $e^{\zeta x} \leq 1$, $$\left|\frac{e^{(t_0+h)x} -e^{t_0 x}}{h} \right| \leq |x|. \tag{3}$$ Combining $(2)$ and $(3)$, we get $$\left|\frac{e^{(t_0+h)x} -e^{t_0 x}}{h} \right| \leq w(x)$$ for $$w(x) := \begin{cases} C e^{x(t_0+\epsilon)}, & x \geq 0, \\ |x|, & x < 0. \end{cases}$$ By our assumptions ($M(t_0+\epsilon)<\infty$ and $\int_{(-\infty,0)} |x| \, \mathbb{P}(dx)<\infty$), $w$ is an integrable dominating function. Applying the dominated convergence theorem proves the differentiability.
  2. Apply statement 1 to the measure $\mathbb{Q}(B) := \mathbb{P}(-B)$.
  3. Choose $h \in (0,\epsilon/2)$. Using $(1)$ for $t_0 = 0$, we get $$\left| \frac{e^{hx}-1}{h} \right| \leq |x| e^{\zeta x}$$ for some intermediate value $\zeta=\zeta(h) \in (0,h)$. Hence, $$\left| \frac{e^{hx}-1}{h} \right| \leq |x| (1_{\{x \leq 0\}} + e^{hx} 1_{\{x>0\}}).$$ Using that $|x| \leq C(e^{-\epsilon x/2}+e^{\epsilon x/2})$ for some constant $C=C(\epsilon)$ and that $0 < h < \epsilon/2$, we get $$\left| \frac{e^{hx}-1}{h} \right| \leq 2C \left(e^{\epsilon x} + e^{-\epsilon x}\right).$$ The right-hand side is integrable because $M(\epsilon)<\infty$ and $M(-\epsilon)<\infty$, so we may again apply the dominated convergence theorem to interchange limit and integration. A very similar argument works for $h \in (-\epsilon/2,0)$. This gives the differentiability of $M$ at $t=0$.
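The theorem can be checked numerically in a simple case (a sketch, not part of the proof): for $X \sim N(0,1)$ the MGF $M(t) = e^{t^2/2}$ is finite on all of $\mathbb{R}$, so every case applies, and the dominated convergence argument yields $M'(t_0) = \int x e^{t_0 x} \, \mathbb{P}(dx)$. The grid, density, and tolerances below are our own illustrative choices.

```python
import numpy as np

# For X ~ N(0,1), M(t) = exp(t^2/2) is finite everywhere.  We compare a
# central difference quotient of M against the "derivative pushed under
# the integral", int x e^{t0 x} P(dx), both computed by a Riemann sum.

grid = np.linspace(-12.0, 12.0, 200001)
dx = grid[1] - grid[0]
density = np.exp(-grid**2 / 2) / np.sqrt(2 * np.pi)   # N(0,1) pdf

def M(t):
    return np.sum(np.exp(t * grid) * density) * dx    # MGF via Riemann sum

def M_prime_under_integral(t):
    return np.sum(grid * np.exp(t * grid) * density) * dx

t0, h = 0.5, 1e-6
finite_diff = (M(t0 + h) - M(t0 - h)) / (2 * h)       # difference quotient
assert abs(finite_diff - M_prime_under_integral(t0)) < 1e-4
assert abs(M(t0) - np.exp(t0**2 / 2)) < 1e-8          # sanity: M(t)=e^{t^2/2}
```

The Gaussian also satisfies the side conditions in statements 1 and 2 ($\int |x| \, \mathbb{P}(dx) < \infty$ on both tails), which is why the check succeeds at every $t_0$.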