Leibniz rule for expected value

convergence-divergence, expected-value, improper-integrals, integration, leibniz-integral-rule

Let $Z(x)$ be a continuous random variable and let $f(x)$ be a probability density function defined on $[0,\infty)$. I want to apply the Leibniz rule to the expected value
$$E[Z(x)] = \int_{0}^\infty Z(x) f(x)\,dx$$

in order to evaluate
$$\frac{dE[Z(x)]}{dx} = \int_{0}^\infty Z(x) \frac{df(x)}{dx}\, dx.$$

As the integral is improper, following this post I would need to assume that $f(x)$ and $\frac{df(x)}{dx}$ are continuous and show that $\int_{0}^\infty f(x)\, dx$ converges for some $x_0 \in [0,\infty)$ and that $\int_{0}^\infty\frac{df(x)}{dx}\, dx$ converges uniformly for all $x_0 \in [0,\infty)$.

So, firstly, is this reasoning correct? And secondly, are there any helpful pointers to get me started on showing the convergence?

Best Answer

First, let's look at the version of the Leibniz rule that you referenced (from this post). The formula is of the form $$ \frac{\mathrm{d}}{\mathrm{d}t}\int_{-\infty}^{\infty} f(x,t) \,\mathrm{d}x = \int_{-\infty}^{\infty}\frac{\partial}{\partial t}f(x,t) \,\mathrm{d}x. $$ Notice in particular that the variable $t$ that we are differentiating with respect to is not the same as the variable $x$ that we are integrating with respect to. So in order to formulate a version of the formula for expected values, we need our random variable to depend on the variable $t$ in some way. I will therefore write $Z_t$ instead of $Z(x)$.

Version 1

Since $Z_t$ now depends on $t$ we would expect that the density $f_{Z_t}$ also depends on $t$ so we will write $f_{Z_t}(x) = f(x,t)$. Assuming that this function is differentiable and that Leibniz rule applies, we get $$ \frac{d}{dt}(E[Z_t]) = \frac{d}{dt} \int_0^\infty x f(x,t) \: dx = \int_0^\infty x \frac{\partial}{\partial t}f(x,t) \: dx.$$

Example 1

Suppose that $Z_t$ is an exponentially distributed random variable with rate parameter $\lambda = t$. Then $f(x,t) = te^{-tx}$, so

\begin{align*}\frac{d}{dt}(E[Z_t]) &= \int_0^\infty x\frac{\partial}{\partial t}(te^{-tx}) \: dx \\ &=\int_0^\infty xe^{-tx} - x^2te^{-tx} \: dx \\ &= \frac{1}{t^2} - \frac{2}{t^2} \\ &= - \frac{1}{t^2}, \end{align*} where we used $\int_0^\infty x^n e^{-tx}\, dx = n!/t^{n+1}$. This can easily be verified, since $E[Z_t] = \frac{1}{t}$.
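As a sanity check beyond the original argument, here is a minimal Python sketch (assuming NumPy and SciPy are available) that evaluates the right-hand side of the Leibniz formula numerically; the value $t = 2$ is an arbitrary choice:

```python
# Numerical check of Example 1: for Z_t ~ Exp(rate = t),
# the Leibniz formula should give d/dt E[Z_t] = -1/t^2.
import numpy as np
from scipy.integrate import quad

t = 2.0  # arbitrary positive rate parameter

# Integrand of the right-hand side: x * d/dt(t e^{-tx}) = x e^{-tx} - x^2 t e^{-tx}
def integrand(x):
    return x * np.exp(-t * x) - x**2 * t * np.exp(-t * x)

rhs, _ = quad(integrand, 0, np.inf)  # integrate over [0, inf)

print(rhs)        # approximately -0.25
print(-1 / t**2)  # exact derivative of E[Z_t] = 1/t, i.e. -0.25
```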

Version 2

A common situation is when $Z_t$ depends on another random variable $X$ through a relation $Z_t = g(X,t)$ for a function $g$. Again assuming that $g$ is differentiable and that the Leibniz formula applies, we may write $$\frac{d}{dt}(E[g(X,t)]) = \frac{d}{dt} \int_{-\infty}^\infty g(x,t) f_X(x) \: dx = \int_{-\infty}^\infty \frac{\partial}{\partial t} (g(x,t)) f_X(x) \: dx,$$ which can be written more compactly as $$\frac{d}{dt}(E[g(X,t)]) = E\left[\frac{\partial}{\partial t} g(X,t)\right].$$

Example 2

In particular, assume $Z_t = e^{tX}$ and define $M_X(t) = E[Z_t]$. Then $$M_X'(t) = E\left[ \frac{\partial}{\partial t} e^{tX}\right] = E[Xe^{tX}],$$ and in particular $M_X'(0) = E[X]$. The function $M_X$ is called the moment generating function of $X$.
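To illustrate this numerically (a sketch under my own assumptions: $X$ normal with mean $1.5$ and standard deviation $0.7$, neither of which comes from the answer above), one can compute $M_X$ by integrating $e^{tx}$ against the density and compare a central difference at $t = 0$ with $E[X]$:

```python
# Numerical check of Example 2: M_X'(0) = E[X] for X ~ Normal(mu, sigma).
import numpy as np
from scipy.integrate import quad

mu, sigma = 1.5, 0.7  # arbitrary choice of distribution parameters

def density(x):
    # Normal(mu, sigma) probability density function
    return np.exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

def M(t):
    # Moment generating function M_X(t) = E[e^{tX}], by direct integration
    val, _ = quad(lambda x: np.exp(t * x) * density(x), -np.inf, np.inf)
    return val

h = 1e-5
print((M(h) - M(-h)) / (2 * h))  # central difference estimate of M_X'(0): about 1.5
print(mu)                        # E[X] = mu = 1.5
```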
