Prove $X_n$ fulfills the Weak Law of Large Numbers


Use the following facts:

  • $\forall_{\epsilon>0}\ \mathbb{P}(X \geq \epsilon) \leq \frac{\mathbb{E}X}{\epsilon}$ for $X \geq 0$ (Markov's inequality, also known as Chebyshev's first inequality)
  • When $\lim_n n \mathbb{P}(|X| > n) =0$ and $X_n^* = X \mathbb{1}_{\{|X|\leq n\}}$ then $\lim_n \frac{\mathrm{Var(X_n^*)}}{n} = 0$ (proven here)
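As a quick sanity check (not part of the question), the first fact can be verified numerically; in fact it holds sample-wise for the empirical distribution of any nonnegative data. A minimal sketch, using the Exp(1) distribution as an arbitrary nonnegative example:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=100_000)  # nonnegative i.i.d. samples

for eps in (0.5, 1.0, 2.0, 5.0):
    # empirical P(X >= eps) versus the Markov bound E[X] / eps
    p_tail = np.mean(x >= eps)
    bound = x.mean() / eps
    # Markov's inequality holds exactly for the empirical measure when x >= 0
    assert p_tail <= bound
    print(f"eps={eps}: P(X>=eps)={p_tail:.4f} <= E[X]/eps={bound:.4f}")
```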

to prove:

Suppose $\lim_n n \mathbb{P}(|X| > n) = 0$ and the $X_n$ are i.i.d. with the same distribution as $X$. Prove that $\{X_n\}$ fulfills the Weak Law of Large Numbers by showing:
$$
\frac{S_n - n\mathbb{E}X\mathbb{1}_{\{|X|\leq n \}}}{n} \to 0 \text{ in probability}
$$

where $S_n = \sum_{i=1}^n X_i$.
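As an aside, a small simulation can illustrate what the statement says. The sketch below (my own illustration, not part of the question) uses Exp(1) in place of a heavy-tailed $X$; since $\mathbb{E}|X|<\infty$ here, the condition $n\mathbb{P}(|X|>n)\to 0$ holds automatically, and the truncated mean can be computed in closed form:

```python
import math
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
x = rng.exponential(scale=1.0, size=n)  # i.i.d. samples of X ~ Exp(1)

# For Exp(1): E[X 1{|X| <= n}] = 1 - exp(-n) * (n + 1); at n = 100_000 the
# correction term underflows to 0, so the truncated mean is effectively 1.0.
trunc_mean = 1.0 - math.exp(-n) * (n + 1)

centered = (x.sum() - n * trunc_mean) / n
print(f"(S_n - n*E[X 1(|X|<=n)])/n = {centered:.5f}")  # small for large n
```

For this light-tailed example the centered average is within a few multiples of $n^{-1/2}$ of zero; under the weaker condition $n\mathbb{P}(|X|>n)\to 0$ alone, convergence can be much slower.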

So far I have found out:

  • $0 = \lim_n n \mathbb{P}(|X|\geq n) \leq \mathbb{E}X$ but that gives me nothing useful

  • If we define $X_n^* = X \mathbb{1}_{\{|X|\leq n\}}$ then
    $$
    \frac{S_n - n\mathbb{E}X\mathbb{1}_{\{|X|\leq n \}}}{n} = \frac{S_n - n\mathbb{E}X_n^*}{n} \geq \frac{S_n - n\cdot n\mathbb{P}(|X| \leq n)}{n} = \frac{S_n - n^2}{n}
    $$

    which is also rather pointless.

I appreciate any hints or solutions.

Best Answer

Write $X_i^* = X_i\mathbb{1}_{\{|X_i|\leq n\}}$ and $S_n^* = \sum_{i=1}^n X_i^*$. On the event $\{|X_1|\leq n,\ldots, |X_n|\leq n\}$ we have $S_n = S_n^*$, so

\begin{eqnarray} \begin{split} & \mathbb P\left(\left| \frac{S_n - n\mathbb{E}X\mathbb{1}_{\{|X|\leq n \}}}{n}\right| \geq \varepsilon\right)\cr & = \mathbb P\left(\left| \frac{S_n^* - n\mathbb{E}X\mathbb{1}_{\{|X|\leq n \}}}{n}\right|\geq \varepsilon, |X_1|\leq n,\ldots, |X_n|\leq n\right) \cr & + \mathbb P\left(\left| \frac{S_n - n\mathbb{E}X\mathbb{1}_{\{|X|\leq n \}}}{n}\right|\geq \varepsilon, |X_i|> n \text{ for some } i=1,\ldots, n\right) \cr & \leq \mathbb P\left(\left| \frac{S_n^* - n\mathbb{E}X\mathbb{1}_{\{|X|\leq n \}}}{n}\right|\geq \varepsilon\right)+n\mathbb P\left(|X_1|> n\right). \end{split}\tag{1} \end{eqnarray}

The second term tends to zero by assumption. For the first term, apply Chebyshev's inequality; the $X_i^*$ are i.i.d. with the same distribution as $X\mathbb{1}_{\{|X|\leq n\}}$, so the second fact gives
$$ \mathbb P\left(\left| \frac{S_n^* - n\mathbb{E}X\mathbb{1}_{\{|X|\leq n \}}}{n}\right|\geq \varepsilon\right) \leq \frac{\sum_{i=1}^n \operatorname{Var}(X_i^*)}{n^2\varepsilon^2} = \frac{\operatorname{Var}(X_1^*)}{n\varepsilon^2} \to 0. $$

Note also that this proves the WLLN under conditions where the expectation $\mathbb{E}X$ need not exist. The only assumption is the condition $n\mathbb P(|X| >n)\to 0$, which is weaker than $\mathbb E|X|<\infty$.
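To see that the condition is strictly weaker, here is a concrete example (my own illustration, not from the original answer): take $\mathbb P(|X|>x) = e/(x\ln x)$ for $x\geq e$. Then $n\,\mathbb P(|X|>n) = e/\ln n \to 0$, yet $\mathbb E|X| \geq \int_e^\infty \frac{e}{x\ln x}\,dx = e\,[\ln\ln x]_e^\infty = \infty$. A short numerical check:

```python
import math

def tail(x):
    """Survival function P(|X| > x) = e / (x * ln x), valid for x >= e."""
    return math.e / (x * math.log(x))

# n * P(|X| > n) = e / ln(n), which tends to 0 (slowly)
for n in (10**2, 10**4, 10**8, 10**16):
    print(f"n = 1e{round(math.log10(n))}: n*P(|X|>n) = {n * tail(n):.4f}")

# ... but the partial integrals of the tail, e * ln(ln(N)), grow without
# bound, so E|X| = infinity and the usual L^1 assumption fails
for N in (10**2, 10**8, 10**32, 10**128):
    partial = math.e * math.log(math.log(N))
    print(f"integral of tail up to N = 1e{round(math.log10(N))}: {partial:.2f}")
```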

The hint to use Chebyshev's inequality did not imply that the expectation of $X$ exists. It is just a tool that can be applied in this problem to those random variables that do have an expectation, namely the truncated variables $X_i^*$.

Addition: The inequality in (1) follows from $\mathbb P(A\cap B)\leq \mathbb P(A)$ (and likewise $\leq \mathbb P(B)$), together with the union bound. So $$ \mathbb P\left(\left| \frac{S_n^* - n\mathbb{E}X\mathbb{1}_{\{|X|\leq n \}}}{n}\right|\geq \varepsilon, |X_1|\leq n,\ldots, |X_n|\leq n\right)\leq \mathbb P\left(\left| \frac{S_n^* - n\mathbb{E}X\mathbb{1}_{\{|X|\leq n \}}}{n}\right|\geq \varepsilon\right) $$ and $$ \mathbb P\left(\left| \frac{S_n - n\mathbb{E}X\mathbb{1}_{\{|X|\leq n \}}}{n}\right|\geq \varepsilon, |X_i|> n \text{ for some } i=1,\ldots, n\right) $$ $$\leq \mathbb P\left(\bigcup_{i=1}^n\{|X_i|> n\}\right)\leq \sum_{i=1}^n \mathbb P\left(|X_i|> n\right)=n\mathbb P(|X_1|>n). $$
