Probability Theory – Meaning of Non-Existence of Expectation

measure-theory, probability-theory

While reading another post, I began wondering about the definition of the existence of the expectation of a random variable.

  1. From Kai Lai Chung,

    We say a random variable $X$ has a finite or infinite expectation (or expected value) according as $E(X)$ is a finite number or not. In the expected case we shall say that the expectation of $X$ does not exist.

    I was wondering what is meant by "the expected case" in the last sentence. Is this generally regarded as the meaning of non-existence of expectation?

  2. From Wikipedia:

    Let $X$ be a discrete random variable. Then the expected value of this random variable is the infinite sum $$\operatorname{E}[X] = \sum_{i=1}^\infty x_i\, p_i, $$ provided that this series converges absolutely (that is, the sum must remain finite if we were to replace all $x_i$'s with their absolute values). If this series does not converge absolutely, we say that the expected value of $X$ does not exist.

    I was wondering

    • whether the meaning of nonexistence of expectation here is consistent with the one given by Kai Lai Chung;
    • whether it is consistent with the nonexistence of the Lebesgue integral in Rudin's book, where he says that the Lebesgue integral of a real-valued Borel-measurable function does not exist if and only if the integrals of its positive part and its negative part are both infinite, which allows the integral to exist even when it is infinite;
    • whether, if the expectation is infinite, the expectation is regarded as nonexistent.

Best Answer

With usual notation, decompose $X$ as $X=X^+ - X^-$ (also note that $|X|=X^+ + X^-$). $X$ is said to have finite expectation (or to be integrable) if both ${\rm E}(X^+)$ and ${\rm E}(X^-)$ are finite. In this case ${\rm E}(X) = {\rm E}(X^+) - {\rm E}(X^-)$. Moreover, if ${\rm E}(X^+) = +\infty$ (respectively, ${\rm E}(X^-) = +\infty$) and ${\rm E}(X^-)<\infty$ (respectively, ${\rm E}(X^+)<\infty$), then ${\rm E}(X) = +\infty$ (respectively, ${\rm E}(X) = -\infty$). So, $X$ is allowed to have infinite expectation.
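As a concrete check of this decomposition (a minimal sketch; the helper name and the St. Petersburg-type example are my own illustrations, not from the answer), one can track the partial sums of ${\rm E}(X^+)$ and ${\rm E}(X^-)$ for a discrete distribution:

```python
import itertools

def partial_parts(atoms, n_terms):
    """Partial sums of E(X^+) and E(X^-) over the first n_terms atoms
    (x_i, p_i) of a discrete distribution."""
    pos = neg = 0.0
    for x, p in itertools.islice(atoms, n_terms):
        if x >= 0:
            pos += x * p   # contributes to E(X^+)
        else:
            neg += -x * p  # contributes to E(X^-)
    return pos, neg

# Finite case: X uniform on {-1, +1} has E(X^+) = E(X^-) = 1/2, so E(X) = 0.
pos, neg = partial_parts(iter([(-1.0, 0.5), (1.0, 0.5)]), 2)

# Infinite case: P(X = 2^k) = 2^{-k} (a St. Petersburg-type variable).
# Every atom contributes 2^k * 2^{-k} = 1 to E(X^+), so E(X^+) = +infinity
# while E(X^-) = 0: the expectation exists and equals +infinity.
st_pete = ((2.0 ** k, 2.0 ** -k) for k in itertools.count(1))
pos_sp, neg_sp = partial_parts(st_pete, 50)  # grows linearly in n_terms
```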

Whenever ${\rm E}(X)$ exists (finite or infinite), the strong law of large numbers holds. That is, if $X_1,X_2,\ldots$ is a sequence of i.i.d. random variables with finite or infinite expectation, and $S_n = X_1+\cdots + X_n$, then $n^{-1}S_n \to {\rm E}(X_1)$ almost surely. The infinite-expectation case follows from the finite case by the monotone convergence theorem.
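A quick simulation (illustrative only; the distributions, seed, and sample size are my choices, not from the answer) shows both regimes of the strong law: the finite-mean average settles down, while the infinite-mean average drifts off:

```python
import random

rng = random.Random(0)  # fixed seed for reproducibility
n = 100_000

# Finite mean: U ~ Uniform(0,1), E(U) = 1/2; n^{-1} S_n settles near 0.5.
mean_unif = sum(rng.random() for _ in range(n)) / n

# Infinite mean: X = U^{-2} has a Pareto-like tail of index 1/2, so
# E(X) = +infinity and n^{-1} S_n -> +infinity almost surely.
mean_heavy = sum(rng.random() ** -2 for _ in range(n)) / n
```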

If, on the other hand, ${\rm E}(X^+) = +\infty $ and ${\rm E}(X^-) = +\infty $, then $X$ does not admit an expectation. In this case, one of the following must occur (a result by Kesten; see Theorem 1 in the paper The strong law of large numbers when the mean is undefined, by K. Bruce Erickson): 1) almost surely, $n^{-1}S_n \to +\infty$; 2) almost surely, $n^{-1}S_n \to -\infty$; 3) almost surely, $\limsup n^{ - 1} S_n = + \infty$ and $\liminf n^{ - 1} S_n = - \infty$.
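For a standard Cauchy variable (which falls under case 3) one can watch the running averages refuse to settle. A rough sketch, with sampling via the inverse-CDF $\tan$ transform and checkpoints chosen arbitrarily:

```python
import math
import random

rng = random.Random(1)

# Standard Cauchy samples via the inverse CDF: tan(pi * (U - 1/2)).
def cauchy(rng):
    return math.tan(math.pi * (rng.random() - 0.5))

# Record n^{-1} S_n at a few checkpoints. For a Cauchy, n^{-1} S_n is itself
# standard Cauchy for every n, so the running average never converges.
checkpoints = {1_000, 10_000, 50_000, 100_000}
running = []
s = 0.0
for i in range(1, 100_001):
    s += cauchy(rng)
    if i in checkpoints:
        running.append(s / i)
```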

EDIT: Since you mentioned the recent post "Are there any random variables so that ${\rm E}[X]$ and ${\rm E}[Y]$ exist but ${\rm E}[XY]$ doesn't?", it is worth stressing the difference between "$X$ has an expectation" and "$X$ is integrable". By definition, $X$ is integrable if $|X|$ has finite expectation (recall that $|X|=X^+ + X^-$). So, for example, the random variable $X=1/U$, where $U \sim {\rm uniform}(0,1)$, is not integrable, yet has (infinite) expectation (indeed, $\int_0^1 {x^{ - 1} \,{\rm d}x} = \infty $).

Further, it is worth noting the following. A random variable $X$ is integrable (i.e., ${\rm E}|X|<\infty$) if and only if $$ \int_\Omega {|X|\,{\rm dP}} = \int_{ - \infty }^\infty {|x|\,{\rm d}F(x)} < \infty . $$ A random variable has an expectation if and only if $$ \int_\Omega {X^ + \,{\rm dP}} = \int_{ - \infty }^\infty {\max \{ x,0\} \,{\rm d}F(x)} = \int_0^\infty {x\,{\rm d}F(x)} < \infty $$ or $$ \int_\Omega {X^ - \,{\rm dP}} = \int_{ - \infty }^\infty {-\min \{ x,0\} \,{\rm d}F(x)} = \int_{ - \infty }^0 {|x|\,{\rm d}F(x)} < \infty. $$ In any of these cases, the expectation of $X$ is given by $$ {\rm E}(X) = \int_0^\infty {x\,{\rm d}F(x)} - \int_{ - \infty }^0 {|x|\,{\rm d}F(x)} \in [-\infty,\infty]. $$

Finally, $X$ does not admit an expectation if and only if both $\int_\Omega {X^ + \,{\rm dP}} = \int_0^\infty {x\,{\rm d}F(x)}$ and $\int_\Omega {X^ - \,{\rm dP}} = \int_{ - \infty }^0 {|x|\,{\rm d}F(x)} $ are infinite. Thus, for example, a Cauchy random variable with density function $f(x) = \frac{1}{{\pi (1 + x^2 )}}$, $x \in \mathbb{R}$, though symmetric, does not admit an expectation, since both $\int_0^\infty {xf(x)\,{\rm d}x}$ and $\int_{ - \infty }^0 {|x|f(x)\,{\rm d}x}$ are infinite.
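The two examples above can be checked against the closed forms of their truncated integrals, $\int_\varepsilon^1 x^{-1}\,{\rm d}x = -\ln \varepsilon$ and $\int_0^M \frac{x}{\pi(1+x^2)}\,{\rm d}x = \frac{\ln(1+M^2)}{2\pi}$ (the function names below are mine; the formulas follow by elementary calculus):

```python
import math

# X = 1/U with U ~ Uniform(0,1): the truncated integral of the positive part,
# integral_eps^1 (1/x) dx = -ln(eps), grows without bound as eps -> 0, so
# E(X) = +infinity: X is not integrable, but its expectation exists.
def reciprocal_tail(eps):
    return -math.log(eps)

# Standard Cauchy: integral_0^M x / (pi (1 + x^2)) dx = ln(1 + M^2) / (2 pi).
# By symmetry both tails diverge, so the Cauchy has no expectation at all.
def cauchy_tail(M):
    return math.log(1.0 + M * M) / (2.0 * math.pi)
```

Both quantities grow only logarithmically, which is why the divergence is easy to miss numerically: `cauchy_tail(1e6)` is still only about 4.4.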
