[Math] Expectation in measure theory

Tags: measure-theory, probability-theory, self-learning

I'm reading a book on measure-theoretic probability, where the author defines the expectation of a random variable $X$ on a probability space $(\Omega,\mathscr H,\mathbb{P})$ as $\int_\Omega X\,d\mathbb{P}$. I'm trying to reconcile this with the definition found in calculus-based probability books, $\int_{-\infty}^\infty xf_X(x)\,dx$, where there is an extra $x$ in the integrand. My book doesn't make this connection explicitly, so here is my attempt.

$$\begin{array}{ccc}
(\Omega,\mathscr H,\mathbb{P}) & \stackrel{X}{\longrightarrow} & (\mathbb{R},\mathscr B(\mathbb{R})) \\
 & & \downarrow{\scriptstyle\, Y(x)=x} \\
 & & (\mathbb{R},\mathscr B(\mathbb{R}))
\end{array}$$

Let $P=\mathbb{P}\circ X^{-1}$ be the distribution of $X$. Since $Y$ is a random variable on $(\mathbb{R},\mathscr B(\mathbb{R}),P)$, we can define the expectation $\mathbb{E}(Y)$. Because $Y$ is the identity, $P\circ Y^{-1}(A)=P(A)$ for all $A\in\mathscr B(\mathbb{R})$, so $X$ and $Y$ have the same distribution, and hence $\mathbb{E}(X)=\mathbb{E}(Y)$, i.e. $\int_\Omega X(\omega)\,\mathbb{P}(d\omega)=\int_\mathbb{R}Y(x)\,P(dx)=\int_\mathbb{R}x\,dP$. The Lebesgue measure $\lambda$ on $(\mathbb{R},\mathscr B(\mathbb{R}))$ is $\sigma$-finite, and assuming $P$ is absolutely continuous with respect to $\lambda$ (this holds precisely when $X$ admits a density; it fails, for instance, for discrete $X$), the Radon–Nikodym theorem gives a density $f_X=\frac{dP}{d\lambda}$, so $\int_\mathbb{R}x\,dP=\int_\mathbb{R}x\frac{dP}{d\lambda}\,d\lambda=\int_\mathbb{R}xf_X\,d\lambda=\int_\mathbb{R}xf_Y\,d\lambda.$
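As a numerical sanity check of the identity $\int_\Omega X\,d\mathbb{P}=\int_\mathbb{R}xf_X(x)\,dx$, here is a sketch in Python. The exponential distribution and the rate $\lambda=2$ are my own illustrative choices, with $X$ realized on $\Omega=(0,1)$ under Lebesgue measure via the inverse CDF:

```python
import math

lam = 2.0  # rate parameter; illustrative choice, Exp(lam) has E[X] = 1/lam

# Realize X on Omega = (0,1) with Lebesgue measure P, via the inverse CDF
def X(omega):
    return math.log(1.0 / (1.0 - omega)) / lam

n = 200_000

# Left side: integral of X over Omega against P (midpoint rule on (0,1))
lhs = sum(X((i + 0.5) / n) for i in range(n)) / n

# Right side: integral of x * f_X(x) dx with density f_X(x) = lam * exp(-lam*x)
h, cutoff = 1e-3, 20.0
rhs = sum((k * h) * lam * math.exp(-lam * k * h) * h
          for k in range(int(cutoff / h)))

print(lhs, rhs)  # both approximate 1/lam = 0.5
```

Both quadratures approximate $1/\lambda$, which is consistent with the abstract integral over $\Omega$ and the density-weighted integral over $\mathbb{R}$ being equal.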

Is this right?

Best Answer

On the probability space $((0,1), \mathscr B(0,1), \text{Leb})$, a random variable with distribution function $F_X$ can be realized via the Skorokhod representation

$$X(\omega) := \sup \{x \in \mathbb R \mid F_X(x) < \omega\}$$

It can be shown that in the probability space $(\mathbb R, \mathscr B (\mathbb R), \mathcal L_X)$, the identity random variable

$$Y(\omega) = \omega$$

has distribution $F_Y = F_X$.

As an example, take the exponential distribution with rate $\lambda$.

In $((0,1), \mathscr B(0,1), \text{Leb})$, the representation looks like:

$$X(\omega) = \frac{1}{\lambda}\ln(\frac{1}{1-\omega})$$

or, equivalently (the map $\omega \mapsto 1-\omega$ preserves Lebesgue measure on $(0,1)$),

$$X(1-\omega) = \frac{1}{\lambda}\ln\left(\frac{1}{\omega}\right)$$
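This inverse-CDF map can be checked by simulation: applied to a uniform $\omega$, it should produce $\mathrm{Exp}(\lambda)$ samples. A sketch (the rate $\lambda = 1.5$, the test point $x = 1$, and the sample size are arbitrary choices for illustration):

```python
import math
import random

lam = 1.5        # rate parameter (arbitrary illustrative choice)
random.seed(0)   # fixed seed for reproducibility

def X(omega):
    # the inverse-CDF map from above: (1/lam) * ln(1/(1 - omega))
    return math.log(1.0 / (1.0 - omega)) / lam

samples = [X(random.random()) for _ in range(100_000)]

# Empirical CDF at x = 1 should be close to F_X(1) = 1 - exp(-lam)
x = 1.0
empirical = sum(s <= x for s in samples) / len(samples)
theoretical = 1.0 - math.exp(-lam * x)
print(empirical, theoretical)
```

The empirical CDF of the transformed uniforms matches $1-e^{-\lambda x}$ up to Monte Carlo error, as the representation predicts.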

Generally, i.e. for an arbitrary distribution function $F_X$, on $((0,1), \mathscr B(0,1), \text{Leb})$ it looks like:

$$X(\omega) := \sup \{x \in \mathbb R | F_X(x) < \omega\}$$

Specifically, in $(\mathbb R, \mathscr B (\mathbb R), \mathcal L_X)$, where $\mathcal L_X$ is the measure with distribution function $F_X(x) = 1-e^{-\lambda x}$, it is the identity:

$$X(t) := t$$

The expectations computed in each of these representations agree:

$$\lambda^{-1} = \int_0^1 \frac{1}{\lambda}\ln(\frac{1}{1-\omega}) d\omega = \int_0^1 \frac{1}{\lambda}\ln(\frac{1}{\omega}) d\omega$$

$$\lambda^{-1} = \int_{\Omega} X d \mathbb P = \int_{\Omega} \sup \{x \in \mathbb R | F_X(x) < \omega\} d \mathbb P(\omega)$$

$$\lambda^{-1} = \int_{\mathbb R} t \,dF_X(t) = \int_0^\infty t \,d(1-e^{-\lambda t}) = \int_{\mathbb R} t f_X(t) \,dt = \int_0^\infty t \lambda e^{-\lambda t} \,dt$$
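The two $\omega$-integrals in the first display can also be checked numerically; since $\omega \mapsto 1-\omega$ preserves Lebesgue measure on $(0,1)$, both should give $1/\lambda$. A sketch (the rate $\lambda = 3$ is an arbitrary choice):

```python
import math

lam = 3.0   # rate parameter (arbitrary illustrative choice)
n = 100_000

# Midpoint rule on (0,1) for both integrands from the first display
g1 = sum(math.log(1.0 / (1.0 - (i + 0.5) / n)) for i in range(n)) / (lam * n)
g2 = sum(math.log(1.0 / ((i + 0.5) / n)) for i in range(n)) / (lam * n)

print(g1, g2, 1.0 / lam)  # all three approximately equal
```

The logarithmic singularity at the endpoint is integrable, so the midpoint rule converges without special handling.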