How can a positive random variable $X$, which never takes on the value $+\infty$, have expected value $\mathbb{E}[X] = +\infty$?
Infinite Expected Value of a Random Variable – Explanation
Tags: infinity, probability, random-variables
Related Solutions
Since the density function $f(x)$ is nonnegative, the integral formula for the expectation is really the difference of two integrals with nonnegative integrands (and hence nonnegative value): $$E[X] = \int_{-\infty}^{\infty} xf(x)\mathrm dx = \int_0^{\infty} xf(x)\mathrm dx - \int_{-\infty}^0 \vert x\vert f(x)\mathrm dx. $$ When both integrals are finite, their difference is finite too. If one of the integrals diverges but the other is finite, then some people say $E[X]$ exists but is unbounded while others deny the existence of $E[X]$ and say that $E[X]$ is undefined. (Perhaps this is why many theorems in probability avoid ambiguity by restricting themselves to random variables with finite means instead of random variables whose means exist.) If both integrals diverge, then the integral formula for $E[X]$ gives a result of the form $\infty - \infty$ and everybody agrees that $E[X]$ is undefined.
In summary, if $\int \vert x \vert f(x) dx$ is finite, then $\int x f(x) dx$ is also finite, and the value of the latter integral is called the expectation or expected value or mean of the random variable $X$ and denoted as $E[X]$, that is, $$E[X] = \int_{-\infty}^{\infty} x f(x) dx.$$
Added Note: To my mind, the difference between saying that "$E[X] = \int xf(x) dx$ if the integral is finite" (as Sami wants to) and "$E[X] = \int xf(x) dx$ if $\int |x|f(x)\mathrm dx$ is finite" is that the second statement reminds the casual reader to check something instead of jumping to unwarranted conclusions. Many students have mistakenly calculated that a Cauchy random variable with density $[\pi(1+x^2)]^{-1}$ has expected value $0$ on the grounds that the integrand $x\cdot[\pi(1+x^2)]^{-1}$ in the integral for $E[X]$ is an odd function, and the integral is over an interval symmetric about the origin. But they would have discovered the error of their ways if they had carefully checked whether $$\int_{-\infty}^{\infty} \vert x \vert \frac{1}{\pi(1+x^2)} dx = 2 \int_0^{\infty} x\frac{1}{\pi(1+x^2)} dx $$ is finite.
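A quick numerical illustration of that check (a sketch, not part of the original answer): by the antiderivative $\int_0^T x/(\pi(1+x^2))\,\mathrm dx = \ln(1+T^2)/(2\pi)$, the truncated integral grows without bound as $T$ increases, so the absolute-value integral diverges.

```python
import math

def truncated_integral(T):
    """Closed form of the truncated integral
    ∫_0^T x / (π(1 + x²)) dx = ln(1 + T²) / (2π)."""
    return math.log(1 + T**2) / (2 * math.pi)

# The values keep growing (roughly like ln(T)/π), so the
# full integral, and hence E|X| for the Cauchy density, is infinite.
for T in (10, 10**3, 10**6, 10**9):
    print(T, truncated_integral(T))
```

Since the truncated integral behaves like $\ln(T)/\pi$, no finite value of $E[X]$ can be assigned by symmetry alone.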
Let $Y$ be a generic random variable, let $X$ be a binary random variable such that $P(X=1)=p$ and $P(X=2)=1-p$, and define $Z = Y^X$.
$$ E[Z] = E[Y^X|X=1]P(X=1) + E[Y^X|X=2]P(X=2) = E[Y|X=1]p + E[Y^2|X=2](1-p) $$
If $X$ and $Y$ are independent, then $E[Y|X=1]=E[Y]$ and $E[Y^2|X=2]=E[Y^2]$, which means that the above equation reduces to:
$$ E[Z] = E[Y]p + E[Y^2](1-p)$$
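A Monte Carlo sanity check of this identity (an illustrative sketch; taking $Y$ standard normal and $p = 0.3$ are my assumptions, not part of the original):

```python
import random

random.seed(0)
p, N = 0.3, 200_000

# For standard normal Y: E[Y] = 0 and E[Y^2] = 1, so the identity
# predicts E[Z] = 0*p + 1*(1 - p) = 1 - p = 0.7.
total = 0.0
for _ in range(N):
    x = 1 if random.random() < p else 2   # X = 1 w.p. p, X = 2 w.p. 1 - p
    y = random.gauss(0, 1)                # Y drawn independently of X
    total += y ** x                       # Z = Y^X
estimate = total / N
print(estimate)   # should be close to 1 - p = 0.7
```

The independence of $X$ and $Y$ in the simulation is exactly what licenses replacing the conditional moments with $E[Y]$ and $E[Y^2]$.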
Best Answer
Let $X$ be a random variable that is equal to $2^n$ with probability $2^{-n}$ (for positive integer $n$). Then $${\mathbb E} X = \sum_{n=1}^\infty 2^{-n} \cdot 2^n = \sum_{n=1}^\infty 1 = \infty.$$
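A partial-sum computation makes the divergence concrete (a small sketch; `partial_expectation` is a hypothetical helper that truncates the series at $N$ terms):

```python
from fractions import Fraction

def partial_expectation(N):
    """Sum of 2^{-n} * 2^n for n = 1..N.
    Each term equals exactly 1, so the partial sum is N."""
    return sum(Fraction(1, 2**n) * 2**n for n in range(1, N + 1))

print(partial_expectation(10))   # 10
print(partial_expectation(50))   # 50 — the partial sums grow without bound
```

Note that $X$ itself is always finite (it equals some $2^n$), yet every additional term contributes a full unit to the expectation, which is how a finite-valued random variable ends up with $\mathbb{E}X = \infty$.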
The Cauchy distribution is an example of a continuous distribution that does not have an expectation.