A monotone sequence of random variables converges almost surely

convergence-divergence, probability-theory

Let $0\leq X_1\leq X_2\leq\dots$ be random variables such that $\mathrm{E}(X_n)= n$ for all $n$. Assume that $\mathrm{Var}(X_n)\leq Cn^r$ for all $n$, where $C>0$ and $0<r<2$ are constants. Show that $X_n/n\rightarrow 1$ a.s.

My attempt: Fix $\epsilon>0$. Since $\mathrm{E}(X_n/n)=1$, Chebyshev's inequality gives
$$
\mathbb{P}(|X_n/n-1|\geq \epsilon)\leq \frac{\mathrm{Var}(X_n/n)}{\epsilon^2}=\frac{\mathrm{Var}(X_n)}{\epsilon^2n^2}\leq \frac{C}{\epsilon^2n^{2-r}}.
$$

It follows that $X_n/n\rightarrow 1$ in probability. How do we take advantage of the monotonicity of $\{X_n\}_{n=1}^\infty$ and prove pointwise convergence?
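
(Aside, not part of the question: a minimal simulation sketch of the bound above, assuming a concrete example of the hypotheses, namely $X_n = E_1+\dots+E_n$ with $E_i$ i.i.d. exponential with mean $1$, so that $\mathrm{E}(X_n)=n$ and $\mathrm{Var}(X_n)=n$, i.e. $C=1$, $r=1$.)

```python
# Monte Carlo check of the Chebyshev bound P(|X_n/n - 1| >= eps) <= C/(eps^2 n^{2-r})
# for the assumed example X_n = E_1 + ... + E_n, E_i i.i.d. Exp(1)  (C = 1, r = 1).
import numpy as np

rng = np.random.default_rng(0)
eps, trials = 0.1, 50_000

for n in [10, 100, 1_000, 10_000]:
    # A sum of n i.i.d. Exp(1) variables has a Gamma(n, 1) distribution,
    # so X_n can be sampled directly.
    X_n = rng.gamma(shape=n, scale=1.0, size=trials)
    empirical = np.mean(np.abs(X_n / n - 1) >= eps)
    bound = 1.0 / (eps**2 * n)          # C/(eps^2 n^{2-r}) with C = 1, r = 1
    print(f"n = {n:6d}   empirical tail = {empirical:.4f}   Chebyshev bound = {bound:.4f}")
```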

Best Answer

You have shown that

$$\mathbb{P} \left( \left| \frac{X_{n}}{n} -1 \right| \geq \epsilon \right) \leq \frac{C}{\epsilon^2 n^{(2-r)}} \tag{0}$$

for each $n \in \mathbb{N}$, and this yields $X_n/n \to 1$ in probability. Consequently, there exists a subsequence, say $(X_{n_k}/n_k)_{k \geq 1}$, which converges almost surely to $1$. One could hope to use a sandwich argument (based on the monotonicity of the $X_n$) to deduce that $X_n/n \to 1$ almost surely. The problem is that this approach requires some control on the growth of $n_k$, which we do not have. To get around this, we apply $(0)$ directly along a subsequence of our choice.

For fixed $k \in \mathbb{N}$ and every $n \in \mathbb{N}$, Chebyshev's inequality (exactly as in $(0)$, with $n^k$ in place of $n$) gives $$\mathbb{P} \left( \left| \frac{X_{n^k}}{n^k} -1 \right| \geq \epsilon \right) \leq \frac{\mathrm{Var}(X_{n^k}/n^k)}{\epsilon^2} = \frac{\mathrm{Var}(X_{n^k})}{\epsilon^2 n^{2k}} \leq \frac{C}{\epsilon^2 n^{k(2-r)}}.$$

For given $r \in (0,2)$, we can choose $k=k(r) \in \mathbb{N}$ sufficiently large that $k(2-r)>1$ (for instance, $k=2$ works when $r=1$), so that

$$\sum_{n \geq 1} \mathbb{P} \left( \left| \frac{X_{n^k}}{n^k} -1 \right| \geq \epsilon \right) \leq \frac{C}{\epsilon^2 } \sum_{n \geq 1}\frac{1}{n^{k(2-r)}} < \infty.$$
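
(Purely as a numerical aside: with $C$, $r$, $\epsilon$ set to illustrative values that are not fixed by the problem, the partial sums of the right-hand side indeed stabilise once $k(2-r)>1$.)

```python
# Numerical sanity check of the choice of k: once k(2 - r) > 1, the series
# sum_n C/(eps^2 n^{k(2-r)}) converges.  C, r, eps below are illustrative
# assumptions, not quantities given in the problem.
import math

C, r, eps = 1.0, 1.5, 0.1
k = math.floor(1 / (2 - r)) + 1          # smallest integer k with k(2 - r) > 1
partial_sum = sum(C / (eps**2 * n**(k * (2 - r))) for n in range(1, 100_001))
print(f"k = {k}, k(2 - r) = {k * (2 - r):.2f}, partial sum up to n = 10^5: {partial_sum:.2f}")
```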

By the Borel–Cantelli lemma (applied with $\epsilon = 1/j$ for each $j \in \mathbb{N}$ and a countable union of null sets), this implies $X_{n^k}(\omega)/n^k \to 1$ as $n \to \infty$ for almost all $\omega$. Fix any such $\omega$ and $\epsilon>0$. Then there exists a number $N \in \mathbb{N}$ such that

$$1-\epsilon \leq \frac{X_{n^k}(\omega)}{n^k} \leq 1+\epsilon \quad \text{and} \quad 1 \leq \frac{(n+1)^k}{n^k} \leq 1+\epsilon \quad \text{for all $n \geq N$.} \tag{1}$$ Now pick any $m \geq N^k$ and choose $n \geq N$ such that $m \in [n^k,(n+1)^k)$, namely $n = \lfloor m^{1/k} \rfloor$. Since the sequence $(X_i)_{i \geq 1}$ is monotone, we have

$$\frac{X_{n^k}(\omega)}{n^k} \leq \frac{X_m(\omega)}{n^k} = \frac{X_m(\omega)}{m} \frac{m}{n^k} \leq \frac{X_m(\omega)}{m} \frac{(n+1)^k}{n^k},$$

i.e.

$$\frac{n^k}{(n+1)^k} \frac{X_{n^k}(\omega)}{n^k} \leq \frac{X_m(\omega)}{m}. \tag{2}$$

On the other hand, we also have

$$\frac{X_m(\omega)}{m} \leq \frac{X_{(n+1)^k}(\omega)}{m} = \frac{X_{(n+1)^k}(\omega)}{(n+1)^k} \frac{(n+1)^k}{m} \leq \frac{X_{(n+1)^k}(\omega)}{(n+1)^k} \frac{(n+1)^k}{n^k}.$$

Combining this with $(2)$, we get

$$\frac{n^k}{(n+1)^k} \frac{X_{n^k}(\omega)}{n^k} \leq \frac{X_m(\omega)}{m} \leq \frac{X_{(n+1)^k}(\omega)}{(n+1)^k} \frac{(n+1)^k}{n^k}.$$

Hence, by $(1)$,

$$\frac{1-\epsilon}{1+\epsilon} \leq \frac{X_m(\omega)}{m} \leq (1+\epsilon)^2.$$

As $\epsilon>0$ is arbitrary, this proves $X_m(\omega)/m \to 1$ as $m \to \infty$. Hence, $X_m/m\to 1$ almost surely.
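
To see the sandwich argument at work numerically, here is a sketch simulating one path of the exponential example from the aside after the question ($X_m = E_1+\dots+E_m$ with $E_i$ i.i.d. exponential with mean $1$, so $r=1$ and $k=2$ is large enough); it compares $X_m/m$ over each block $[n^2,(n+1)^2)$ with its value at the left endpoint.

```python
# One simulated path of X_m/m for the assumed exponential example, illustrating
# the almost-sure convergence and the block-by-block sandwich with k = 2.
import numpy as np

rng = np.random.default_rng(1)
M = 100_000
path = np.cumsum(rng.exponential(scale=1.0, size=M))   # X_1, ..., X_M for one omega
ratios = path / np.arange(1, M + 1)                    # X_m / m

k = 2                                                  # k(2 - r) = 2 > 1 for r = 1
for n in [3, 10, 30, 100, 300]:
    lo, hi = n**k, min((n + 1)**k, M + 1)              # block of indices m in [n^k, (n+1)^k)
    block = ratios[lo - 1 : hi - 1]
    print(f"n = {n:3d}   X_(n^k)/n^k = {ratios[lo - 1]:.4f}   "
          f"max over block |X_m/m - 1| = {np.abs(block - 1).max():.4f}")
```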