Strong Law of Zero Mean i.i.d Random Variables with a Bounded Sequence of Non-Random Constants

alternative-proof, law-of-large-numbers, probability-theory

This question regards Theorem 1.8.6 on page 52 of Durrett, which is stated as follows:

(The strong law of large numbers) Let $X_{1}, X_{2},\cdots$ be i.i.d random variables with $E|X_{i}|<\infty$. Let $E(X_{i})=\mu$ and $S_{n}=X_{1}+\cdots+X_{n}$. Then $S_{n}/n\longrightarrow \mu$ a.s.

The proof requires at least two lemmas and Kolmogorov's One Series Theorem; I have read through them without problems.

However, I am thinking about a small variation of this law. What if we set $E(X_{i})=0$, take a bounded sequence of non-random constants $c_{n}$, and let $S_{n}:=c_{1}X_{1}+\cdots+c_{n}X_{n}$? Does $S_{n}/n$ still converge to $0$ almost surely? That is:

Let $X_{1},X_{2},\cdots$ be i.i.d integrable random variables with $E(X_{i})=0$. If $c_{n}$ is a bounded sequence of non-random constants, and we set $S_{n}:=c_{1}X_{1}+\cdots+c_{n}X_{n}$, show that $S_{n}/n\longrightarrow 0$ a.s.
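Before hunting for a proof, a quick Monte Carlo sanity check makes the claim plausible. The sketch below uses illustrative choices that are my own assumptions, not part of the problem: $X_{i}$ uniform on $[-1,1]$ (so $EX_{i}=0$) and $c_{n}=\sin n$ (bounded and non-random).

```python
import math
import random

def weighted_average(n, seed=0):
    """Return S_n / n where S_n = c_1 X_1 + ... + c_n X_n,
    with X_k ~ Uniform[-1, 1] (mean zero) and c_k = sin(k)."""
    rng = random.Random(seed)
    s = 0.0
    for k in range(1, n + 1):
        x = rng.uniform(-1.0, 1.0)  # E X = 0, |X| <= 1
        s += math.sin(k) * x        # |c_k| <= 1, non-random
    return s / n

for n in (100, 10_000, 100_000):
    print(n, weighted_average(n))
```

As $n$ grows, the printed values of $S_{n}/n$ should shrink toward $0$, consistent with the claimed almost sure convergence; of course this illustrates the statement but proves nothing.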

I've been thinking about adapting the proof of the strong law, since the strong law is just the case $c_{n}=1$ for all $n$ (allowing $\mu\neq 0$).

So first I tried to directly define $Y_{n}:=c_{n}X_{n}$ for each $n$ and argue about the $Y_{n}$. The good thing here is that $E(Y_{n})=0$, so we don't change the limit in the almost sure convergence; however, since the $c_{n}$ are different, even though the $Y_{n}$ are still independent, they are no longer identically distributed.

How could I make all those different $c_{n}$ into one thing (so that the summands are i.i.d again)? Or perhaps I am heading in the wrong direction?

Thank you in advance for any discussion, hint, or solution!

Edit 1:

Since $c_{n}$ is bounded, there is an $M$ with $|c_{n}|\leq M$ for all $n$. Thus, if we replace each summand by $Y_{n}:=MX_{n}$ and write $Z_{n}:=Y_{1}+\cdots+Y_{n}$, the $Y_{n}$ are still i.i.d, and then I can definitely show that $$\dfrac{Z_{n}}{n}\longrightarrow 0\ \text{a.s.}$$

Now, note that $S_{n}:=c_{1}X_{1}+\cdots+c_{n}X_{n}\leq Z_{n}$, so this problem can be reduced to the following: if $Z_{n}/n\longrightarrow 0$ a.s. and $Z_{n}/n\geq S_{n}/n$, does $S_{n}/n\longrightarrow 0$ a.s.?

I don't really know if this is true… If it is, how could I prove it?

Edit 2:

Okay, I figured it out. The point is that even though the $c_{k}X_{k}$ are not i.i.d, you can still exploit the i.i.d property when bounding $P(|c_{k}X_{k}|>n)$, since you can divide through by $|c_{k}|$. Since the sequence is bounded, everything works out.

For details, please see my answer below.

Edit 3:

Since I noticed that some users upvoted and favorited this post while I was writing the proof in my answer, I am making this edit so that the system alerts you that there is an update and you can see my answer. Thank you for your votes and favorites 🙂 Enjoy my proof!

Best Answer

Below is what I wanted to prove:

Let $(X_{k})$ be integrable i.i.d random variables such that $E[X_{k}]=0$. Show that if $c_{n}$ is a bounded sequence of non-random constants, then $$n^{-1}\sum_{k=1}^{n}c_{k}X_{k}\longrightarrow 0\ \text{a.s.}$$

To prove this, we need two lemmas and Kolmogorov's One Series Theorem, which are stated below:

Lemma 1: (Kronecker's Lemma) Let $\{x_{n}\}$ and $\{a_{n}\}$ be two sequences of real numbers such that $a_{n}>0$ and $a_{n}\nearrow\infty$. Show that if $\sum_{n=1}^{\infty}x_{n}/a_{n}$ converges, then $$a_{n}^{-1}\sum_{m=1}^{n}x_{m}\longrightarrow 0.$$
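Kronecker's Lemma is easy to test numerically. Here is a small sketch with the illustrative choices (my own, purely for demonstration) $a_{n}=n$ and $x_{n}=(-1)^{n+1}$, so that $\sum x_{n}/a_{n}$ is the alternating harmonic series, which converges:

```python
def kronecker_average(n):
    """Return a_n^{-1} * sum_{m<=n} x_m with a_m = m, x_m = (-1)^(m+1)."""
    total = sum((-1) ** (m + 1) for m in range(1, n + 1))
    return total / n

for n in (10, 1_000, 100_000):
    print(n, kronecker_average(n))
```

The averages visibly tend to $0$, as the lemma predicts (for this choice the partial sums alternate between $1$ and $0$, so the average is at most $1/n$).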


Lemma 2: Show that $\sum_{k=1}^{\infty}k^{-2}Var(X_{k}\mathbb{1}_{(|X_{k}|\leq k)})\leq 4E|X_{1}|.$


Kolmogorov's One Series Theorem: Suppose $X_{1},X_{2},\cdots$ are independent with $EX_{n}=0$. If $$\sum_{n=1}^{\infty}Var(X_{n})<\infty,$$ then $\sum_{n=1}^{\infty}X_{n}$ converges a.s.
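The One Series Theorem can likewise be illustrated numerically. A minimal sketch, assuming the illustrative choice $X_{n}=\varepsilon_{n}/n$ with $\varepsilon_{n}=\pm 1$ fair coin flips (so $EX_{n}=0$ and $\sum Var(X_{n})=\sum n^{-2}<\infty$):

```python
import random

def partial_sum(n, seed=1):
    """Partial sum of X_k = eps_k / k with eps_k = +/-1 fair signs."""
    rng = random.Random(seed)
    return sum(rng.choice((-1, 1)) / k for k in range(1, n + 1))

# With a fixed seed the two calls share their first 20,000 draws,
# so the difference between them is exactly a tail of the series.
print(partial_sum(20_000), partial_sum(200_000))
```

The two partial sums should be close, reflecting that the series has (almost surely) settled down to its limit.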


I will first use these three results to prove my statement, and then prove the first two lemmas. The proof of the One Series Theorem can be found in many standard references.


Proof of Statement:

Set $S_{n}:=c_{1}X_{1}+\cdots+c_{n}X_{n}$, $Y_{k}:=(c_{k}X_{k})\mathbb{1}_{(|c_{k}X_{k}|\leq k)}$, and $T_{n}:=Y_{1}+\cdots+Y_{n}.$

It then suffices to show that $T_{n}/n\longrightarrow 0$ a.s. Indeed, since $c_{k}$ is a bounded sequence, there exists $N\geq 1$ such that $|c_{k}|\leq N$ for all $k$; thus if $|c_{k}X_{k}|>k$, then $|X_{k}|>k/|c_{k}|\geq k/N$, and by monotonicity of the tails, $$\sum_{k=1}^{\infty}P(|c_{k}X_{k}|>k)\leq\sum_{k=1}^{\infty}P(|X_{1}|>k/N)\leq\int_{0}^{\infty}P(|X_{1}|>t/N)dt=N\,E|X_{1}|<\infty.$$ By the Borel–Cantelli lemma, this implies $$P(c_{k}X_{k}\neq Y_{k}\ \text{i.o.})=0,$$ so for almost every $\omega$ there is a finite $R(\omega)$ with $$|S_{n}(\omega)-T_{n}(\omega)|\leq R(\omega)<\infty\ \text{for all}\ n,$$ and hence $(S_{n}-T_{n})/n\longrightarrow 0$ a.s.
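As a sanity check (not part of the proof), the tail-sum-versus-integral comparison in this step can be verified numerically for a concrete assumed distribution, $|X_{1}|\sim\mathrm{Exp}(1)$, where $P(|X_{1}|>t)=e^{-t}$ and $E|X_{1}|=1$, so the integral $\int_{0}^{\infty}P(|X_{1}|>t/N)\,dt$ equals $N$:

```python
import math

def tail_sum(N, terms=10_000):
    """Sum over k >= 1 of P(|X_1| > k/N) for |X_1| ~ Exp(1)."""
    return sum(math.exp(-k / N) for k in range(1, terms + 1))

for N in (1, 5, 20):
    # The sum of a decreasing function over k >= 1 is dominated by
    # the integral of e^{-t/N} over [0, inf), which equals N.
    print(N, tail_sum(N), tail_sum(N) <= N)
```

In each case the truncated sum stays below $N$, as the comparison of a decreasing function's sum with its integral predicts.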

Secondly, glancing at the proof of Lemma 2, we see that its inequality can be modified to $$\sum_{k=1}^{\infty}Var(Y_{k})/k^{2}\leq 4N\,E|X_{1}|<\infty,$$ which is finite since $|c_{k}|\leq N$ and $X_{1}$ is integrable: one simply runs the proof of Lemma 2 with $|X_{1}|$ replaced by $N|X_{1}|$, using $P(|c_{k}X_{k}|>y)\leq P(N|X_{1}|>y)$.

Now, set $Z_{k}:=Y_{k}-EY_{k}$, so that $E(Z_{k})=0$ and thus $Var(Z_{k})=Var(Y_{k})$. By the bound just obtained, $$\sum_{k=1}^{\infty}Var(Z_{k})/k^{2}=\sum_{k=1}^{\infty}Var(Y_{k})/k^{2}\leq 4N\,E|X_{1}|<\infty,$$ and thus by Kolmogorov's One Series Theorem, $\sum_{k=1}^{\infty}Z_{k}/k$ converges a.s. By Lemma 1 (with $a_{n}=n$ and $x_{n}=Z_{n}$), we then have $$n^{-1}\sum_{k=1}^{n}(Y_{k}-EY_{k})\longrightarrow 0\ \text{a.s.},$$ and thus $$\dfrac{T_{n}}{n}-n^{-1}\sum_{k=1}^{n}EY_{k}\longrightarrow 0\ \text{a.s.}$$

Now, since $EX_{1}=0$, we have $EY_{k}=c_{k}E[X_{1}\mathbb{1}_{(|X_{1}|\leq k/|c_{k}|)}]=-c_{k}E[X_{1}\mathbb{1}_{(|X_{1}|>k/|c_{k}|)}]$ (and $EY_{k}=0$ trivially if $c_{k}=0$). Since $k/|c_{k}|\geq k/N\longrightarrow\infty$, the dominated convergence theorem gives $EY_{k}\longrightarrow 0\ \text{as}\ k\longrightarrow\infty$, so by Cesàro averaging $$n^{-1}\sum_{k=1}^{n}EY_{k}\longrightarrow 0,$$ and hence $$T_{n}/n\longrightarrow 0\ \text{a.s.},$$ which together with the first step gives $S_{n}/n\longrightarrow 0$ a.s. $\blacksquare$


Proof of Lemma 1:

By the setting of this problem, $a_{n}>0$ for $n\geq 1$, so let us set $a_{0}:=0$. For $m\geq 1$, define $b_{m}:=\sum_{k=1}^{m}x_{k}/a_{k}$, and define $b_{0}:=0$. Then $x_{m}=a_{m}(b_{m}-b_{m-1})$. Therefore, we have (don't forget $a_{0}=b_{0}=0$): \begin{align*} a_{n}^{-1}\sum_{m=1}^{n}x_{m}&=a_{n}^{-1}\sum_{m=1}^{n}a_{m}(b_{m}-b_{m-1})\\ &=a_{n}^{-1}\Big(\sum_{m=1}^{n}a_{m}b_{m}-\sum_{m=1}^{n}a_{m}b_{m-1}\Big)\\ &=a_{n}^{-1}\Big(a_{n}b_{n}+\sum_{m=2}^{n}a_{m-1}b_{m-1}-\sum_{m=1}^{n}a_{m}b_{m-1}\Big)\\ &=b_{n}-\sum_{m=1}^{n}\dfrac{(a_{m}-a_{m-1})}{a_{n}}b_{m-1}. \end{align*}

By hypothesis, $b_{n}\longrightarrow b_{\infty}<\infty$ as $n\longrightarrow\infty$. It then remains for us to show that $$\sum_{m=1}^{n}\dfrac{(a_{m}-a_{m-1})}{a_{n}}b_{m-1}\longrightarrow b_{\infty}.$$

Let $\epsilon>0$, set $B:=\sup|b_{n}|$, and choose an $M$ such that for all $m\geq M$, we have $|b_{m}-b_{\infty}|<\epsilon/2$. Also, pick an $N$ such that $a_{M}/a_{n}<\epsilon/4B$ for all $n\geq N$.

Now, for $n\geq N$, recalling $a_{m}- a_{m-1}\geq 0$, and noting $\sum_{m=1}^{n}\dfrac{(a_{m}-a_{m-1})}{a_{n}}=1,$ we have: \begin{align*} \Big|\sum_{m=1}^{n}\dfrac{(a_{m}-a_{m-1})}{a_{n}}b_{m-1}-b_{\infty}\Big|&\leq\sum_{m=1}^{n}\dfrac{(a_{m}-a_{m-1})}{a_{n}}|b_{m-1}-b_{\infty}|\\ &\leq \dfrac{a_{M}}{a_{n}}\cdot 2B+\dfrac{a_{n}-a_{M}}{a_{n}}\cdot\dfrac{\epsilon}{2}\\ &<\epsilon. \end{align*}

The result follows immediately, since $\epsilon>0$ was arbitrary.


Proof of Lemma 2:

Set $Y_{k}:=X_{k}\mathbb{1}_{(|X_{k}|\leq k)}$.

Firstly, we observe that $$Var(Y_{k})\leq E[Y_{k}^{2}]=\int_{0}^{\infty}2yP(|Y_{k}|>y)dy\leq\int_{0}^{k}2yP(|X_{1}|>y)dy,$$ then since everything is $\geq 0$ and the sum is just an integral with respect to counting measure on $\{1,2,\cdots\}$, we can use Fubini's Theorem to yield: \begin{align*} \sum_{k=1}^{\infty}\dfrac{E(Y_{k}^{2})}{k^{2}}&\leq\sum_{k=1}^{\infty}k^{-2}\int_{0}^{\infty}\mathbb{1}_{(y<k)}2yP(|X_{1}|>y)dy\\ &=\int_{0}^{\infty}\Big(\sum_{k=1}^{\infty}k^{-2}\mathbb{1}_{(y<k)}\Big)2yP(|X_{1}|>y)dy. \end{align*}

Now, since $E|X_{1}|=\int_{0}^{\infty}P(|X_{1}|>y)dy$, the result follows immediately from the lemma below.

Lemma 2.1: If $y\geq 0$, then $2y\sum_{k>y}k^{-2}\leq 4$.

Proof of Lemma 2.1:

Note that if $m\geq 2$, we then have $$\sum_{k\geq m}k^{-2}\leq\int_{m-1}^{\infty}x^{-2}dx=(m-1)^{-1}.$$

Now, when $y\geq 1$, the sum starts at $k=[y]+1\geq 2$. Also, note that $y/[y]\leq 2$ for $y\geq 1$, since the worst case is $y$ just below $2$, where $[y]=1$. Thus, we have $$2y\sum_{k>y}k^{-2}\leq 2y/[y]\leq 4.$$

Finally, for $0\leq y<1$, we have $$2y\sum_{k>y}k^{-2}\leq 2\Big(1+\sum_{k=2}^{\infty}k^{-2}\Big)\leq 4.$$
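Lemma 2.1 is also easy to check numerically: the tail $\sum_{k>y}k^{-2}$ equals $\zeta(2)-\sum_{k\leq y}k^{-2}$ with $\zeta(2)=\pi^{2}/6$, so one can scan a grid of $y$ values (the grid and range below are my own choices):

```python
import math

def bound_value(y):
    """Compute 2y * sum_{k > y} k^{-2} via zeta(2) minus a partial sum."""
    tail = math.pi ** 2 / 6 - sum(1 / k ** 2 for k in range(1, math.floor(y) + 1))
    return 2 * y * tail

# Scan y over [0, 100) on a grid of step 0.01 and report the maximum.
worst = max(bound_value(y / 100) for y in range(0, 10_000))
print(worst)
```

The maximum on this grid sits well below $4$ (it occurs just below $y=1$, where the value approaches $2\zeta(2)\approx 3.29$), matching the lemma.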


The proof is now complete. Please let me know if there is any problem with it. Enjoy :)
