Proving a property of continuous random variables

calculus, integration, probability, probability distributions, random variables

I am trying to prove the following property for a continuous random variable $X\ge0$: $\int_0^\infty P(X>x)\,dx=E[X]$, where $E[X]$ is the expectation of $X$.

Integrating by parts with $u=1-F_X(x)$ and $dv=dx$:

$\int_0^\infty P(X>x)\,dx=\int_0^\infty (1-P(X\le x))\,dx=x(1-F_X(x))\Big|_0^\infty+\int_0^\infty xf_X(x)\,dx = x(1-F_X(x))\Big|_0^\infty+E[X]$

Here $F_X(x)$ is the cumulative distribution function of $X$ and $f_X(x)$ is its pdf.
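As a numerical sanity check of the identity $\int_0^\infty P(X>x)\,dx=E[X]$ (not part of the proof), one can pick a concrete distribution. The sketch below assumes $X\sim\text{Exponential}(1)$, where $P(X>x)=e^{-x}$ and $E[X]=1$:

```python
import math

# Sanity check of  E[X] = ∫₀^∞ P(X > x) dx  for X ~ Exponential(1).
# Here P(X > x) = e^{-x} and E[X] = 1 (assumed example distribution).

def tail(x):
    """Survival function P(X > x) of Exponential(1)."""
    return math.exp(-x)

def integrate(f, a, b, n=100_000):
    """Composite trapezoidal rule on [a, b] with n subintervals."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        total += f(a + i * h)
    return total * h

# Truncate the improper integral at x = 50; the neglected tail is
# at most e^{-50} ≈ 2e-22, far below the quadrature error.
approx = integrate(tail, 0.0, 50.0)
print(approx)  # close to 1.0 = E[X]
```

Swapping in any other nonnegative distribution's survival function and mean gives the same agreement.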

Question

How do I show that $x(1-F_X(x))|_0^\infty = 0$?
I have tried applying L'Hôpital's rule, i.e.

$x(1-F_X(x))\Big|_0^\infty=\lim_{n\to\infty}n(1-F_X(n))-0=\lim_{n\to\infty}\dfrac{1-F_X(n)}{\dfrac{1}{n}}=\lim_{n\to\infty}\dfrac{f_X(n)}{n^{-2}}=\lim_{n\to\infty}n^2 f_X(n)$

by applying L'Hôpital in the last step, but I am still stuck showing that this limit goes to $0$.

PS: I have used the fact that, by the fundamental theorem of calculus, the derivative of the distribution function equals the pdf.

Best Answer

Given that $E(X)=\int_0^\infty x\,dF(x)$ exists, you have $$\lim_{t\to\infty}\int_t^\infty x\,dF(x)= 0$$

Again,

\begin{align} x(1-F(x))&=xP(X>x) \\&=x\int_x^\infty dF(y) \\&\le \int_x^\infty y\,dF(y) \end{align}

So $\lim_{x\to\infty}x(1-F(x))\le 0$, and since $x(1-F(x))\ge 0$ for all $x\ge 0$, the limit must equal $0$.

Therefore a necessary condition for $E(X)$ to exist is that $$\lim_{x\to \infty} x(1-F(x))=0$$
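To see the necessary condition $x(1-F(x))\to 0$ concretely, the snippet below assumes $X\sim\text{Exponential}(1)$ (an illustrative choice, not part of the answer), where $1-F(x)=e^{-x}$ so the product is $xe^{-x}$:

```python
import math

# Illustration (assumed example): for X ~ Exponential(1),
# 1 - F(x) = e^{-x}, so x(1 - F(x)) = x e^{-x}, which tends to 0.
for x in [1, 5, 10, 20, 50]:
    print(f"x = {x:>2}:  x(1 - F(x)) = {x * math.exp(-x):.3e}")
```

The exponential tail decays faster than any polynomial grows, which is exactly what the bound $x(1-F(x))\le\int_x^\infty y\,dF(y)$ guarantees whenever $E(X)$ is finite.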
