[Math] Convergence in Probability for sum of $X_i^2$

Tags: convergence-divergence, inequality, probability

[Problem statement was posted as an image. From the surrounding context, it asks: for i.i.d. $X_i$ with finite variance, show that $Y_n = \frac{1}{n}\sum_{i=1}^n X_i^2$ converges in probability as $n \to \infty$, with the limit expressed in terms of $\textbf{E}(X_1)$ and $\text{var}(X_1)$.]

So I know that I have to find $\textbf{E}(Y_n)$ and $\text{var}(Y_n)$ and then use Chebyshev's inequality to answer this question, but I'm having trouble deriving them. I think I have $\textbf{E}(Y_n)$: by linearity of expectation,
$$ \textbf{E}(Y_n) = \textbf{E}(X_1^2) = \text{var}(X_1) + (\textbf{E}(X_1))^2.$$

Hopefully this is correct. I do not know how to get $\text{var}(Y_n)$ after this, though, since it requires finding $\textbf{E}(Y_n^2)$.
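For what it's worth, one standard route (under the extra assumption $\textbf{E}(X_1^4) < \infty$, so that $\text{var}(X_1^2)$ is finite) avoids computing $\textbf{E}(Y_n^2)$ directly: since the $X_i$ are independent, so are the $X_i^2$, and variance is additive over independent summands. A sketch:

```latex
\text{var}(Y_n)
= \frac{1}{n^2}\sum_{i=1}^n \text{var}(X_i^2)
= \frac{\text{var}(X_1^2)}{n} \;\longrightarrow\; 0
\quad \text{as } n \to \infty.
```

With this, Chebyshev's inequality gives the convergence in probability directly.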

I think the end result should look something like
$$P(|Y_n-\textbf{E}(Y_n)|>\epsilon) \le \frac{\text{var}(Y_n)}{\epsilon^2}.$$

And then I would write $\textbf{E}(Y_n)$ and $\text{var}(Y_n)$ in terms of $\textbf{E}(X_1)$ and $\text{var}(X_1)$, as the question asks, and finally show that the right-hand side tends to $0$ as $n \rightarrow \infty$.
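As a numerical sanity check (not a proof), here is a quick simulation of $Y_n = \frac{1}{n}\sum_{i=1}^n X_i^2$. The choice of standard normal $X_i$ is just an illustrative assumption; for that choice $\textbf{E}(X_1^2) = \text{var}(X_1) + (\textbf{E}(X_1))^2 = 1 + 0 = 1$, so $Y_n$ should concentrate near $1$ as $n$ grows:

```python
import random

random.seed(0)

def y_n(n):
    """Sample Y_n = (1/n) * sum_{i=1}^n X_i^2 for i.i.d. standard normal X_i."""
    return sum(random.gauss(0.0, 1.0) ** 2 for _ in range(n)) / n

# E(X_1^2) = var(X_1) + (E X_1)^2 = 1 for standard normals,
# so the printed values should approach 1 as n increases.
for n in (100, 10_000, 1_000_000):
    print(n, y_n(n))
```

The fluctuations shrink like $\sqrt{\text{var}(X_1^2)/n}$, which is exactly the $\text{var}(Y_n) \to 0$ behavior the Chebyshev argument needs.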

Best Answer

I think this is the easiest approach.

We want to apply the strong law of large numbers, which says that for i.i.d. $\xi_n$, $\frac{1}{n}\sum_{i=1}^n \xi_i \rightarrow E\xi_1$ a.s. as long as $E|\xi_1| < \infty$. Let $\xi_n = X_n^2$; then we can apply the law as long as $EX_1^2 < \infty$. But $\text{Var}(X_1) = EX_1^2 - (EX_1)^2$, so $EX_1^2 = \text{Var}(X_1) + (EX_1)^2 < \infty$, and the strong law gives $\frac{1}{n} \sum_{i=1}^n X_i^2 \rightarrow EX_1^2 = \text{Var}(X_1) + (EX_1)^2$ a.s. Since almost sure convergence implies convergence in probability, we are done.
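Spelled out, the chain of reductions (assuming only that the $X_i$ are i.i.d. with $\text{Var}(X_1) < \infty$) is:

```latex
E|X_1^2| = EX_1^2 = \text{Var}(X_1) + (EX_1)^2 < \infty
\;\Longrightarrow\;
\frac{1}{n}\sum_{i=1}^n X_i^2 \xrightarrow{\text{a.s.}} EX_1^2
\;\Longrightarrow\;
\frac{1}{n}\sum_{i=1}^n X_i^2 \xrightarrow{\;P\;} \text{Var}(X_1) + (EX_1)^2.
```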
