Some doubts about proof of Strong Law of Large Numbers

Tags: expected value, law of large numbers, probability theory, proof explanation, strong convergence

I quote Jacod-Protter.

Theorem:
Let $\left(X_n\right)_{n\geq1}$ be independent and identically distributed random variables defined on the same probability space. Let $$\mu=\mathbb{E}\{X_j\},\qquad\sigma^2=\sigma_{X_j}^2<\infty.$$ Let $S_n=\sum\limits_{j=1}^{n}X_j$. Then $$\lim\limits_{n\to\infty}\frac{S_n}{n}=\lim\limits_{n\to\infty}\frac{1}{n}\sum\limits_{j=1}^{n}X_j=\mu\hspace{0.3cm}\text{ a.s. and in }L^2$$

Part of the proof:
(Assume $S_n=\sum\limits_{j=1}^{n}X_j$ and $Y_n=\frac{S_n}{n}$. After some steps, assuming without loss of generality that $\mu=0$, one gets $\lim\mathbb{E}\{Y_n^2\}=0$; that is, $Y_n$ converges to $0$ in $L^2$.)

Since $Y_n$ converges to $0$ in $L^2$, there is a subsequence converging to $0$ a.s.
However we want the original sequence to converge a.s. To do this, we find a subsequence converging a.s. and then treat the terms in between successive terms of the subsequence.
Since $\mathbb{E}\{Y_n^2\}=\frac{\sigma^2}{n}$, let's choose the subsequence $n^2$; then $$\sum\limits_{n=1}^{\infty}\mathbb{E}\{Y_{n^2}^2 \}=\sum\limits_{n=1}^{\infty}\frac{\sigma^2}{n^2}<\infty$$
therefore we also know that $\sum\limits_{n=1}^{\infty}Y_{n^2}^{2}<\infty$ a.s., hence the tail of this convergent series converges to $0$; we conclude
\begin{equation}
\lim\limits_{n\to\infty}Y_{n^2}=0 \hspace{0.5cm}\text{a.s.}\tag{1}
\end{equation}


Next let $n\in\mathbb{N}$. Let $p(n)$ be the integer such that $p(n)^2\le n<\left(p\left(n\right)+1\right)^2$. Then $$Y_n-\frac{p(n)^2}{n}Y_{p(n)^2}=\frac{1}{n}{\displaystyle\sum\limits_{j=p(n)^2+1}^{n}X_j}$$ and $$\mathbb{E}\left\{\left(Y_n-\frac{p(n)^2}{n}Y_{p(n)^2}\right)^2\right\}=\frac{n-p(n)^2}{n^2}\sigma^2$$

$(…)$

Since, by $(1)$, $\lim\limits_{n\to\infty}Y_{p(n)^2}=0 \text{ a.s.}$ and since $\dfrac{p(n)^2}{n}\rightarrow 1$, $\lim\limits_{n\to\infty}Y_n=0\text{ a.s.}$ as well.
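As a quick numerical sanity check of the theorem (my addition, not part of the quoted proof; the uniform distribution, seed, and sample size are arbitrary choices), one can simulate $Y_n = S_n/n$ for mean-zero i.i.d. variables and observe that it is small for large $n$:

```python
import random

# Sanity check (not a proof): for i.i.d. mean-zero X_j,
# Y_n = S_n / n should be close to 0 once n is large.
random.seed(0)

n = 200_000
s = 0.0
for _ in range(n):
    s += random.uniform(-1.0, 1.0)  # i.i.d., mean 0, variance 1/3

y_n = s / n
print(abs(y_n))  # typically on the order of sqrt(1/(3n)), i.e. about 0.001 here
```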

I have some doubts about the above-quoted proof. (Below I quote each part that raises a doubt, then describe the doubt in detail, and finally summarize it as a question.)

1. $\text{"}$ therefore we also know that $\sum\limits_{n=1}^{\infty}Y_{n^2}^{2}<\infty$ a.s., hence the tail of this convergent series converges to $0$; we conclude
\begin{equation}
\lim\limits_{n\to\infty}Y_{n^2}=0 \hspace{0.5cm}\text{a.s. "}
\end{equation}


As far as I know, if $\sum\limits_{n=1}^{\infty}Y_{n^2}^{2}<\infty$ a.s., then $\lim\limits_{n\to\infty}Y_{n^2}^2=0 \hspace{0.2cm}\text{a.s.}$. Hence, why is the above conclusion on $Y_{n^2}$ (that is $\lim\limits_{n\to\infty}Y_{n^2}=0 \hspace{0.2cm}\text{a.s.}$) and NOT on $Y_{n^2}^2$ as I would expect (that is $\lim\limits_{n\to\infty}Y_{n^2}^2=0 \hspace{0.2cm}\text{a.s.}$)?

2. $\text{"}$ $\mathbb{E}\left\{\left(Y_n-\frac{p(n)^2}{n}Y_{p(n)^2}\right)^2\right\}=\dfrac{n-p(n)^2}{n^2}\sigma^2$ $\text{"}$

Why does it hold true? I know that, by linearity of expectation, $\mathbb{E}\left\{\left(Y_n-\frac{p(n)^2}{n}Y_{p(n)^2}\right)^2\right\}=\mathbb{E}\{Y_n^2\}+\mathbb{E}\left\{\dfrac{p(n)^4}{n^2}Y_{p(n)^2}^2\right\}-2\mathbb{E}\left\{Y_n\dfrac{p(n)^2}{n}Y_{p(n)^2}\right\}$
I know how to show that $\mathbb{E}\{Y_n^2\}=\dfrac{\sigma^2}{n}$ and $\mathbb{E}\left\{\dfrac{p(n)^4}{n^2}Y_{p(n)^2}^2\right\}=\dfrac{p(n)^2\sigma^2}{n^2}$.
However, I cannot show that $\mathbb{E}\left\{Y_n\dfrac{p(n)^2}{n}Y_{p(n)^2}\right\}=\dfrac{p(n)^2\sigma^2}{n^2}$ so as to get to the above conclusion $$\mathbb{E}\left\{\left(Y_n-\frac{p(n)^2}{n}Y_{p(n)^2}\right)^2\right\}=\dfrac{n-p(n)^2}{n^2}\sigma^2$$
Could you please show me why it does hold that $\mathbb{E}\left\{Y_n\dfrac{p(n)^2}{n}Y_{p(n)^2}\right\}=\dfrac{p(n)^2\sigma^2}{n^2}$?

3. $\text{"}$ $\dfrac{p(n)^2}{n}\rightarrow 1$ $\text{"}$

I understand the above-quoted result intuitively, but I cannot prove it rigorously.
Could you please show me how to prove that $\dfrac{p(n)^2}{n}\rightarrow 1$ as $n\to\infty$?
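A quick numerical illustration of this limit (my addition; note that `math.isqrt(n)` computes exactly the $p(n)$ defined above, and the sample values of $n$ are arbitrary):

```python
import math

def ratio(n: int) -> float:
    """Return p(n)^2 / n, where p(n) = isqrt(n) satisfies p(n)^2 <= n < (p(n)+1)^2."""
    p = math.isqrt(n)
    assert p * p <= n < (p + 1) ** 2
    return (p * p) / n

for n in (10, 1_000, 10**12 + 7):
    print(n, ratio(n))
# The gap 1 - ratio(n) is at most (2*sqrt(n) + 1)/n, roughly 2/sqrt(n),
# so the ratio tends to 1 as n grows.
```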

Best Answer

For (1), we are using the following fact from real analysis:

If $(a_n)$ is a sequence of real numbers with $\lim a_n^2=0$, then also $\lim a_n=0$.

For (2), the quantity $$ A_n:=Y_n-\frac{p(n)^2}{n}Y_{p(n)^2}=\frac{1}{n}{\displaystyle\sum\limits_{j=p(n)^2+1}^{n}X_j}\tag1 $$ is a constant ($\frac1n$) times the sum of IID random variables (the $X_j$) each with zero mean; so $A_n$ has zero mean. Conclude$$E(A_n^2)\stackrel{(a)}=\operatorname{Var}(A_n)\stackrel{(b)}=\frac1{n^2}\sum \operatorname{Var}(X_j)\stackrel{(c)}=\frac1{n^2}\sum\sigma^2,\tag2$$ where $(a)$ is the fact $E(A_n)=0$, $(b)$ is independence, and $(c)$ is the fact that each $X_j$ has the same variance $\sigma^2$. It remains to count the number of terms in the sum, which is $n-p(n)^2$.
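A Monte Carlo check of (2) (my addition; the choice of uniform $X_j$, for which $\sigma^2=1/3$, and the values of $n$ and the trial count are arbitrary):

```python
import math
import random

# Check E(A_n^2) = (n - p(n)^2) * sigma^2 / n^2 for
# A_n = (1/n) * sum_{j = p(n)^2 + 1}^{n} X_j, with X_j uniform on [-1, 1].
random.seed(1)

n = 30
p = math.isqrt(n)          # p = 5, so the sum has n - p^2 = 5 terms
sigma2 = 1.0 / 3.0
trials = 200_000

acc = 0.0
for _ in range(trials):
    a = sum(random.uniform(-1.0, 1.0) for _ in range(n - p * p)) / n
    acc += a * a

empirical = acc / trials
theoretical = (n - p * p) * sigma2 / n**2
print(empirical, theoretical)  # both close to 0.00185
```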


EDIT: If you wish to show $\mathbb{E}\left\{Y_n\dfrac{p(n)^2}{n}Y_{p(n)^2}\right\}=\dfrac{p(n)^2\sigma^2}{n^2}$, write $V_n:=\dfrac{p(n)^2}{n}Y_{p(n)^2}$ so that $$Y_n\dfrac{p(n)^2}{n}Y_{p(n)^2}=Y_nV_n=(Y_n-V_n)V_n + V_n^2.\tag3$$ Note that $Y_n-V_n$ involves only $X_j$ for $j>p(n)^2$ while $V_n$ involves $X_j$ with $j\le p(n)^2$, hence $Y_n-V_n$ is independent of $V_n$. Since $V_n$ has mean zero, the expectation of (3) is $0+{\mathbb E}(V_n^2)$ which you've shown to equal $\dfrac{p(n)^2\sigma^2}{n^2}$.
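The answer above does not address (3); a standard squeeze argument (my addition, not part of the original answer) fills the gap. Since $p(n)=\lfloor\sqrt{n}\rfloor$, we have $\sqrt{n}-1<p(n)\le\sqrt{n}$, hence
$$\frac{(\sqrt{n}-1)^2}{n}<\frac{p(n)^2}{n}\le 1,\qquad\text{where}\qquad \frac{(\sqrt{n}-1)^2}{n}=1-\frac{2}{\sqrt{n}}+\frac{1}{n}\xrightarrow[n\to\infty]{}1,$$
and the squeeze theorem gives $\dfrac{p(n)^2}{n}\to 1$.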
