[Math] Convergence of sample mean of r.v.s in probability

convergence-divergence, probability, random-variables

Let $X_1,X_2,\dots$ be a sequence of random variables which converges in probability to zero, i.e. $X_n\overset{p.}\rightarrow0$. I am trying to find a counterexample in order to show that $\frac{\sum\limits_{i=1}^nX_i}{n}$ does not necessarily converge in probability to zero.

It can be shown that if, for every sample point $\omega\in\Omega$, I choose
$$
X_n(\omega)=
\begin{cases}
n, & U(\omega)\leq 1/n\\
0, & \text{otherwise}
\end{cases}
$$
where $U$ is a uniform random variable on $[0,1]$, then $X_n\overset{a.s.}\rightarrow0$ and hence $X_n\overset{p.}\rightarrow0$. I am also thinking that in this case
$$\frac{\sum\limits_{i=1}^nX_i}{n}=\frac{n^2}{n}=n,$$
which clearly diverges, but I have doubts about this. Any ideas or other examples?
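One way to probe this doubt numerically is a minimal simulation sketch (assuming NumPy; the names `n_max` and `running_mean` and the seed are illustrative). A single shared draw of $U$ plays the role of one fixed $\omega$; note that $X_i(\omega)=i$ exactly when $i\leq 1/U(\omega)$, so only finitely many terms are nonzero:

```python
import numpy as np

rng = np.random.default_rng(0)
U = rng.uniform()                     # one shared uniform draw: a single fixed omega

n_max = 100_000
i = np.arange(1, n_max + 1)
X = np.where(U <= 1.0 / i, i, 0)      # X_i(omega) = i if U(omega) <= 1/i, else 0
running_mean = np.cumsum(X) / i       # (X_1 + ... + X_n) / n for each n

for n in (10, 100, 1_000, 10_000, 100_000):
    print(f"n={n:>6}  mean={running_mean[n - 1]:.6f}")
# Only the finitely many indices i <= 1/U contribute to the sum, so the
# running mean behaves like C/n for large n rather than diverging.
```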

Best Answer

Let $(X_n)$ be independent random variables such that $P(X_n=1)=\frac{1}{n}$ and $P(X_n=0)=1-\frac{1}{n}$. Denote $Y_n = 2^n X_n$. It is easy to see that $Y_n \to 0$ in probability.

Take $\epsilon =1$ and $k_n =\min\{ i : 2^i \geqslant n\}$. Note that a single term $2^i X_i$ with $i \geqslant k_n$ and $X_i = 1$ already makes the sum at least $2^{k_n} \geqslant n$, so
\begin{align*}
P\Big(\sum_{i=1}^n Y_i \geqslant n\Big) &\geqslant P\Big(\sum_{i=k_n}^n Y_i \geqslant n\Big) = P\Big(\sum_{i=k_n}^n 2^i X_i \geqslant n\Big) = P\Big(\bigcup_{i=k_n}^n (X_i = 1)\Big)\\
& =1 - P\Big(\bigcap_{i=k_n}^n (X_i = 0)\Big) = 1- \prod_{i=k_n}^n P(X_i=0)=1-\prod_{i=k_n}^n \Big(1-\frac{1}{i}\Big)\\
& \geqslant 1- \exp\Big(-\sum_{i=k_n}^n \frac{1}{i}\Big),
\end{align*}
using $1-x \leqslant e^{-x}$ in the last step. Since $k_n \sim \log_2 n$, we have $\sum_{i=k_n}^n \frac{1}{i} \sim \log\Big(\frac{n}{\log_2 n}\Big)\to \infty$ as $n\to \infty$. This implies that the last bound tends to $1$, i.e. $P\big(\frac{1}{n}\sum_{i=1}^n Y_i \geqslant 1\big) \to 1$. Hence, $\frac{1}{n}\sum_{i=1}^n Y_i$ does not converge to $0$ in probability.
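To illustrate this bound numerically, here is a minimal Monte Carlo sketch (assuming NumPy; the function name `tail_prob` and the trial count are illustrative). It estimates $P\big(\frac{1}{n}\sum_{i=1}^n Y_i \geqslant 1\big)$ directly and compares it with the product bound, which telescopes: $1-\prod_{i=k_n}^n\big(1-\frac{1}{i}\big) = 1-\frac{k_n-1}{n}$.

```python
import numpy as np

rng = np.random.default_rng(0)

def tail_prob(n, trials=5_000):
    """Monte Carlo estimate of P( (1/n) * sum_{i=1}^n Y_i >= 1 ),
    where Y_i = 2^i * X_i and X_i ~ Bernoulli(1/i), independent."""
    i = np.arange(1, n + 1)
    X = rng.random((trials, n)) < 1.0 / i     # each row: one draw of X_1..X_n
    sums = ((2.0 ** i) * X).sum(axis=1)       # sum of 2^i X_i in each trial
    return np.mean(sums >= n)

for n in (10, 100, 1000):
    k = int(np.ceil(np.log2(n)))              # k_n = min{ i : 2^i >= n }
    bound = 1.0 - (k - 1) / n                 # telescoped lower bound
    print(f"n={n:>5}  estimate={tail_prob(n):.4f}  lower bound={bound:.4f}")
```

The estimates sit at or above the lower bound and approach $1$ as $n$ grows, matching the argument above.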
