Chebyshev’s and Markov’s inequalities and convergence a.s.


Let $\{X_n\}$ be a sequence of i.i.d. random variables with mean $\mu = 1$ and variance $\sigma^2 = 3$.
Show whether the following convergence holds almost surely:
$$
\frac{1}{n}\sum_{i=1}^n X_i \longrightarrow 1
$$

Chebyshev's inequality:

$$
\mathcal{P}\left(\left|\sum_{i=1}^n X_i - 1\right| > \epsilon n\right)
\begin{aligned}[t]
& \leq \frac{1}{n^2\epsilon^2}\, V\!\left(\sum_{i=1}^n X_i\right) \\
& = \frac{3n}{n^2\epsilon^2} \\
& = \frac{3}{n\epsilon^2} \\
& = \frac{3}{\epsilon^2}\sum \frac{1}{n} = \infty
\end{aligned}
$$

Therefore, by the Borel–Cantelli lemma, there is no almost sure convergence.
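As a sanity check on the bound itself (an addition, not from the original post), a quick Monte Carlo estimate of $\mathcal P(|\sum_{i=1}^n X_i - n| > \epsilon n)$ can be compared with the Chebyshev bound $3/(n\epsilon^2)$; the normal distribution below is an arbitrary choice, since Chebyshev uses only the mean and variance:

```python
import numpy as np

# Monte Carlo sketch: estimate P(|S_n - n| > eps*n) for i.i.d. X_i with
# mean 1 and variance 3, and compare with the Chebyshev bound 3/(n*eps^2),
# capped at 1.  The normal distribution is an assumption for illustration.
rng = np.random.default_rng(0)
eps = 0.5
results = {}
for n in (10, 100, 1000):
    # 20,000 independent copies of S_n = X_1 + ... + X_n
    s_n = rng.normal(1.0, np.sqrt(3.0), size=(20_000, n)).sum(axis=1)
    empirical = np.mean(np.abs(s_n - n) > eps * n)
    bound = min(3.0 / (n * eps**2), 1.0)
    results[n] = (empirical, bound)
    print(f"n={n:5d}  empirical={empirical:.4f}  bound={bound:.4f}")
```

The empirical tail probability stays below the bound and both shrink with $n$, which is consistent with convergence in probability but says nothing by itself about almost sure convergence.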

However, using Markov's inequality:
$$
\mathcal{P}\left(\left|\sum_{i=1}^n X_i\right| > \epsilon n\right)
\begin{aligned}[t]
& \leq \frac{1}{n\epsilon}\, E\!\left(\sum_{i=1}^n X_i\right) \\
& = \frac{n}{n\epsilon} \\
& = \frac{1}{\epsilon} \\
& = \sum \frac{1}{\epsilon} < \infty
\end{aligned}
$$

In this case, using Markov's inequality and the Borel–Cantelli lemma, there is almost sure convergence.
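A quick numerical check (an addition, not from the original post) shows why the uncentered Markov bound is uninformative here: for $\epsilon < 1$ the event $\{|\sum_{i=1}^n X_i| > \epsilon n\}$ is not rare at all, since $S_n/n$ concentrates near the mean $1$, so its probability tends to $1$ while the bound stays at the constant $1/\epsilon$:

```python
import numpy as np

# Sketch (normal samples with mean 1, variance 3 are an assumption):
# for eps = 0.5 the event {|S_n| > eps*n} has probability near 1,
# because S_n/n -> 1 > eps.  Markov's constant bound 1/eps = 2 is
# therefore vacuous and carries no information about convergence.
rng = np.random.default_rng(1)
eps = 0.5
n = 1000
s_n = rng.normal(1.0, np.sqrt(3.0), size=(10_000, n)).sum(axis=1)
empirical = np.mean(np.abs(s_n) > eps * n)
print(empirical)  # essentially 1.0 for this eps and n
```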

Can someone spot any mistake or, more specifically, explain why I should use Chebyshev or Markov here and why they give me different results?

Best Answer

Both your approaches look peculiar. I would have thought you wanted to say:

  • $\sum\limits_{i=1}^n X_i$ has mean $n$ and variance $3n$

  • so $\frac{1}{n}\sum\limits_{i=1}^n X_i$ has mean $1$ and variance $\frac{3}{n}$,

  • with the result that Chebyshev's inequality would give: $$\mathbb P\left(\left|\sum\limits_{i=1}^n X_i - n\right|>n\epsilon\right)=\mathbb P\left(\left|\frac{1}{n}\sum\limits_{i=1}^n X_i - 1\right|>\epsilon\right) \leq \frac{3}{n\epsilon^2}$$ and $\frac{3}{n\epsilon^2} \to 0$ as $n$ increases, so you get convergence in probability of the sample mean to the population mean, i.e. the weak law of large numbers.
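This shrinking bound can be illustrated numerically (an addition to the answer, with the normal distribution assumed purely for simulation): the fraction of runs in which the sample mean misses $1$ by more than $\epsilon$ falls with $n$, in line with $3/(n\epsilon^2) \to 0$:

```python
import numpy as np

# Sketch: empirical frequency of {|mean of n samples - 1| > eps} for
# i.i.d. draws with mean 1 and variance 3.  The frequency shrinks as n
# grows, matching the Chebyshev bound 3/(n*eps^2) -> 0 (weak law).
rng = np.random.default_rng(2)
eps = 0.25
freqs = {}
for n in (50, 500, 5000):
    means = rng.normal(1.0, np.sqrt(3.0), size=(10_000, n)).mean(axis=1)
    freqs[n] = np.mean(np.abs(means - 1.0) > eps)
    print(f"n={n:5d}  P(|mean - 1| > {eps}) ~ {freqs[n]:.4f}")
```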

This does not give the strong law of large numbers (convergence almost surely) since $\sum\limits_{n=1}^\infty \frac3{n\epsilon^2}$ is not finite. Instead you need a more sophisticated approach such as this or this or this.
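For completeness, one standard route (an addition to the answer, assuming additionally that $\mathbb E[X_1^4] < \infty$) is to apply Chebyshev at fourth order, whose bound *is* summable:

$$
\mathbb P\left(\left|\frac{1}{n}\sum_{i=1}^n X_i - 1\right|>\epsilon\right)
\le \frac{\mathbb E\left[\left(\sum_{i=1}^n (X_i-1)\right)^4\right]}{n^4\epsilon^4}
= \frac{n\,\mathbb E[(X_1-1)^4]+3n(n-1)\sigma^4}{n^4\epsilon^4}
= O\!\left(\frac{1}{n^2}\right),
$$

so the series over $n$ converges and the Borel–Cantelli lemma yields almost sure convergence of the sample mean to $1$.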