Probability Theory – Moments of a Random Variable in Terms of CDF

expectation, probability theory, statistics

Consider a random variable $X$ with distribution function $F(x)$, and suppose we want to calculate the $r$th moment of $X$, $\mathbb E X^r$. I read that the desired moment can be calculated as follows:

$$
\begin{align}
\mathbb E X^r &= \int_0^\infty x^r \, dF(x) - \int_0^\infty (-x)^r \, dF(-x) \\[5pt]
&=r \int_0^\infty x^{r-1} \left[ 1-F(x)+(-1)^rF(-x) \right] dx.
\end{align}
$$

Could anyone explain why this is true? Also, when is it useful to compute moments this way? Thank you!
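
One way to convince yourself of the identity before reading a proof is to check it numerically. Here is a minimal sketch, assuming SciPy is available; the $\mathrm{Normal}(1, 2)$ test distribution is an arbitrary choice, picked only because it takes both signs, so both tail terms contribute:

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

# Arbitrary test distribution: X ~ Normal(mu=1, sigma=2),
# which takes both signs, so both tail terms matter.
X = stats.norm(loc=1, scale=2)

for r in (1, 2, 3, 4):
    # Right-hand side: r * int_0^inf x^(r-1) [1 - F(x) + (-1)^r F(-x)] dx
    integrand = lambda x, r=r: r * x**(r - 1) * (1 - X.cdf(x) + (-1)**r * X.cdf(-x))
    rhs, _err = quad(integrand, 0, np.inf)
    lhs = X.moment(r)  # E[X^r] computed directly by SciPy
    print(f"r={r}: E[X^r] = {lhs:.6f}, tail integral = {rhs:.6f}")
```

Both columns should agree to within `quad`'s tolerance; for this distribution the exact moments are $1, 5, 13, 73$ for $r = 1, \dots, 4$.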

Best Answer

Here's a sketch for the continuous case:

For any nonnegative continuous real random variable $X$ and any integer $r\ge 1$,
$$X^r = \int_0^X rx^{r-1}\,dx = \int_0^\infty r x^{r-1}[X>x]\,dx,$$
where
$$[X>x] = \begin{cases} 1 & \text{if } X>x, \\ 0 & \text{if } X\le x. \end{cases}$$

Therefore, taking expectations, swapping $E$ and the integral by Tonelli's theorem, and using the fact that $E[X>x] = P(X>x)$,
$$E(X^r) = r \int_0^\infty x^{r-1}P(X>x)\,dx.$$

Now let $X$ be any continuous random variable (not necessarily nonnegative), and write $X = Y - Z$, where $Y=X^+$ and $Z=X^-$ are the positive and negative parts of $X$. Since $YZ = 0$, every cross term in the binomial expansion of $(Y-Z)^r$ vanishes, so the Binomial Theorem gives
$$X^r = (Y - Z)^r = Y^r + (-Z)^r = Y^r + (-1)^r Z^r.$$

Because both $Y$ and $Z$ are nonnegative random variables, the formula above applies to each of them, and for $x>0$ we have $P(Y>x)=P(X>x)$ and $P(Z>x)=P(X<-x)$. Hence
$$\begin{align} E(X^r) &= E(Y^r) + (-1)^r E(Z^r)\\ &= r\int_0^\infty y^{r-1}P(Y>y)\,dy + (-1)^r r\int_0^\infty z^{r-1}P(Z>z)\,dz\\ &= r\int_0^\infty y^{r-1}P(X>y)\,dy + (-1)^r r\int_0^\infty z^{r-1}P(X<-z)\,dz\\ &= r\int_0^\infty x^{r-1}\big(P(X>x) + (-1)^r P(X<-x)\big)\,dx\\ &= r\int_0^\infty x^{r-1}\big(1-F(x) + (-1)^r F(-x)\big)\,dx, \end{align}$$
where the last step uses the continuity of $X$, so that $P(X<-x)=P(X\le -x)=F(-x)$.
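
To see the nonnegative-case formula in action, here is a minimal numerical sketch, again assuming SciPy; the $\mathrm{Exponential}(1)$ example is an arbitrary choice, picked because $E(X^r) = r!$ exactly (`sf` is SciPy's survival function, i.e. $P(X>x)$):

```python
from math import factorial
import numpy as np
from scipy import stats
from scipy.integrate import quad

# Nonnegative example: X ~ Exponential(1), where E[X^r] = r! exactly.
X = stats.expon()

for r in (1, 2, 3):
    # r * int_0^inf x^(r-1) P(X > x) dx, with P(X > x) = X.sf(x) = e^(-x)
    rhs, _err = quad(lambda x, r=r: r * x**(r - 1) * X.sf(x), 0, np.inf)
    print(f"r={r}: r! = {factorial(r)}, layer-cake integral = {rhs:.6f}")
```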
