Probability Theory – Expectation of a Mixed Random Variable Given Only the CDF

Tags: expected value, probability, probability theory

I'm interested in the following question:

Given only the cumulative distribution function $F(x)$ of a mixed random variable $X$, how does one proceed to calculate the expectation $E(X)$?

By mixed I mean a variable that is neither continuous nor discrete. For example, the CDF could be
$$F(x)=\begin{cases}0, & x\in(-\infty,-1)\\
\frac13+\frac x3, & x\in [-1,0)\\
\frac12+\frac x3, & x\in [0,1)\\
1, & x\in [1,+\infty)\end{cases},$$
though it could be more complicated. Note that it is neither piecewise constant nor continuous (there is a jump at $x=0$, for example).

If $X$ were absolutely continuous, I suppose the simplest approach would be to differentiate $F$ to get the density and then integrate against it for the expectation.
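In symbols (assuming a density $f=F'$ exists):
$$\mathrm E(X)=\int_{-\infty}^{+\infty}x\,f(x)\,\mathrm dx.$$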

If it were discrete, one could easily recover the distribution from the CDF itself, by reading off the sizes and locations of the jumps, and then take the weighted sum for the expectation.
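In symbols (assuming jumps of size $p_i=F(x_i)-F(x_i^-)$ at points $x_i$, and no continuous part):
$$\mathrm E(X)=\sum_i x_i\,p_i.$$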

However, I have no idea how to go about calculating the expectation of a mixed variable.

I should note that I'm not looking for the solution to the above example specifically, but for a general method of answering the question at the top of the post.

Best Answer

Here's a careful derivation of the formula in Gautam Shenoy's answer:

If $X$ is a non-negative random variable, the well-known identity (a consequence of Tonelli's theorem, obtained by writing $X=\int_0^{X}\mathrm dt$ and swapping the order of integration)
$$\mathrm E(X)=\int_0^{+\infty}\mathrm P(X\gt t)\,\mathrm dt=\int_0^{+\infty}\mathrm P(X\geqslant t)\,\mathrm dt\tag1$$
expresses the expectation of $X$ in terms of its CDF:
$$\mathrm E(X)=\int_0^{+\infty}[1 - F(t)]\,\mathrm dt.\tag2$$

To extend (2) to the general case, where $X$ may take negative values, we can write
$$\mathrm E(X)=\mathrm E(X^+)-\mathrm E(X^-),\tag3$$
where the positive part and negative part of $X$ are defined by
$$X^+:=\begin{cases} X& \text{if $X>0$}\\ 0&\text{otherwise}\end{cases}\tag4$$
and
$$X^-:=\begin{cases} -X& \text{if $X<0$}\\ 0&\text{otherwise.}\end{cases}\tag5$$
(Note that (3) is valid provided at least one of $\mathrm E(X^+)$ and $\mathrm E(X^-)$ is finite.)

Since both $X^+$ and $X^-$ are non-negative, we can apply (1) to each. Observe that for every $t>0$,
$$\mathrm P(X^+>t)=\mathrm P(X>t)=1-F(t)\tag6$$
and
$$\mathrm P(X^-\geqslant t)=\mathrm P(X\leqslant -t)=F(-t).\tag7$$
Plugging these into (1) and using (3) gives
$$\mathrm E(X)=\int_0^\infty[1-F(t)]\,\mathrm dt-\int_0^\infty F(-t)\,\mathrm dt.\tag8$$
The change of variable $t\mapsto -t$ in the second integral yields the equivalent form
$$\mathrm E(X)=\int_0^\infty[1-F(t)]\,\mathrm dt-\int_{-\infty}^0 F(t)\,\mathrm dt.\tag9$$
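As a concrete check, (9) can be applied directly to the example CDF from the question: both integrands vanish outside $[-1,1]$, since $F=0$ below $-1$ and $F=1$ above $1$, so
$$\mathrm E(X)=\int_0^1\left[\tfrac12-\tfrac t3\right]\mathrm dt-\int_{-1}^0\left[\tfrac13+\tfrac t3\right]\mathrm dt=\tfrac13-\tfrac16=\tfrac16.$$

The same number can be recovered numerically. Below is a minimal sketch in Python (standard library only; the helper name `expectation_from_cdf` and the truncation bounds `lo`, `hi` and grid size `n` are illustrative choices, not from any particular package) that approximates both integrals in (9) by midpoint Riemann sums:

```python
def F(x):
    # CDF of the mixed random variable from the question
    if x < -1:
        return 0.0
    if x < 0:
        return 1/3 + x/3
    if x < 1:
        return 1/2 + x/3
    return 1.0

def expectation_from_cdf(F, lo=-10.0, hi=10.0, n=100_000):
    # Approximates E(X) = int_0^inf [1 - F(t)] dt - int_{-inf}^0 F(t) dt
    # by midpoint Riemann sums, assuming the tails outside [lo, hi]
    # contribute negligibly.
    h_pos = hi / n
    pos = sum(1 - F((k + 0.5) * h_pos) for k in range(n)) * h_pos
    h_neg = -lo / n
    neg = sum(F(lo + (k + 0.5) * h_neg) for k in range(n)) * h_neg
    return pos - neg

print(expectation_from_cdf(F))  # prints roughly 0.16667, i.e. 1/6
```

Note that midpoint sampling sidesteps the jump points themselves, and no density or jump decomposition is ever needed: the formula consumes the CDF directly, which is exactly what makes it convenient for mixed distributions.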
