Using Markov’s inequality to show that probability converges


Let $(X_n)_{n \geq 2}$ be a sequence of independent random variables such that

$X_n = n $ with probability $\frac{1}{2n \log(n)}$

$X_n = -n$ with probability $\frac{1}{2n \log(n)}$

$X_n = 0$ with probability $1-\frac{1}{n \log(n)}$

Let $S_n := X_2 + \dots + X_{n+1}$.

Why does it hold that for every $\epsilon > 0$, $\mathbb P\left( \left| \frac{S_n}{n} - \mathbb E\left[\frac{S_n}{n}\right] \right| > \epsilon\right) \to 0$?

I thought that Markov's inequality applied to the second moment, $\mathbb P\left( \left| \frac{S_n}{n} \right| > \epsilon\right) \leq \frac{\mathbb E\left[ \left| \frac{S_n}{n} \right|^2\right]}{\epsilon^2}$, would help, but does it really?

Thanks for any help.

Best Answer

Use Chebyshev's inequality. Each $X_k$ is symmetric, so $\mu_n := \mathbb E[S_n/n] = 0$, and by independence
$$\operatorname{Var}(S_n) = \sum_{k=2}^{n+1} \operatorname{Var}(X_k) = \sum_{k=2}^{n+1} k^2 \cdot \frac{1}{k\log(k)} = \sum_{k=2}^{n+1} \frac{k}{\log(k)}.$$
Hence
$$P(|S_n/n-\mu_n|>\epsilon)\leq \frac{\operatorname{Var}(S_n/n)}{\epsilon^2}=\frac{\sum_{k=2}^{n+1} k/\log(k)}{\epsilon^2 n^2}\to 0,$$
since $\sum_{k=2}^{n+1} \frac{k}{\log(k)} = O\!\left(\frac{n^2}{\log(n)}\right) = o(n^2)$.
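As a quick numerical sanity check (not part of the proof), one can evaluate $\operatorname{Var}(S_n/n) = \frac{1}{n^2}\sum_{k=2}^{n+1} k/\log(k)$ exactly and watch the Chebyshev bound shrink; the helper name `var_Sn_over_n` below is made up for illustration:

```python
import math

def var_Sn_over_n(n):
    """Exact variance of S_n/n: by independence,
    Var(X_k) = E[X_k^2] = k^2 / (k log k) = k / log k for k = 2, ..., n+1."""
    return sum(k / math.log(k) for k in range(2, n + 2)) / n**2

# The variance, and hence the Chebyshev bound Var / eps^2,
# decays like 1 / log n -- slowly, but it does go to 0:
for n in (10, 1000, 100000):
    print(n, var_Sn_over_n(n))
```

The slow $1/\log(n)$ decay explains why convergence in probability holds here even though the variances of the individual $X_k$ grow.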