Solved – Almost sure convergence and limiting variance goes to zero

convergence, variance

Suppose an estimator converges with probability one and, at the same time, its variance goes to zero in the limit. How does this differ from an estimator that converges with probability one but whose variance does not go to zero? Does the former achieve sure convergence? I am wondering what difference it makes.

Best Answer

Convergence almost surely and convergence in distribution are not the same thing.

Take a simple example: let $Y_i$, $i = 1, 2, \ldots$, be i.i.d. random variables with mean zero and unit variance, so that the law of the iterated logarithm holds. Then

$$ \limsup_{n \to \infty} \frac{\frac{1}{\sqrt{n}}\sum_{i=1}^n Y_i}{\sqrt{\log\log n}} = \sqrt{2} \quad \text{a.s.} $$

In particular, $\frac{1}{\sqrt{n}}\sum_{i=1}^n Y_i$ diverges almost surely, yet it converges in distribution to $N(0,1)$ by the central limit theorem. A sequence of random variables can fail to converge along (almost) every fixed sample path while still converging in distribution, because convergence in distribution concerns only the marginal laws, not the behavior of individual paths.
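A quick simulation can make this concrete. The sketch below (my own illustration, not part of the original answer) uses Rademacher steps, which have mean zero and unit variance: along a single path, $T_n = \frac{1}{\sqrt{n}}\sum_{i=1}^n Y_i$ keeps fluctuating and does not settle to a limit, while across many independent paths the value $T_N$ at a fixed large $N$ looks like a standard normal draw. The sample sizes and seed are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# One sample path: Rademacher (+/-1) steps, mean 0, variance 1.
n = 100_000
y = rng.choice([-1.0, 1.0], size=n)
s = np.cumsum(y)
t = s / np.sqrt(np.arange(1, n + 1))  # T_n = S_n / sqrt(n)

# Along this single path, T_n keeps oscillating (the LIL says
# limsup T_n / sqrt(log log n) = sqrt(2) a.s.), so it does not converge.
path_range = t[999:].max() - t[999:].min()

# Across many independent paths, T_N at a fixed large N is approximately
# N(0, 1) by the CLT: convergence in distribution despite a.s. divergence.
reps = 2000
T_N = rng.choice([-1.0, 1.0], size=(reps, 10_000)).sum(axis=1) / np.sqrt(10_000)

print(f"range of T_n along one path (n >= 1000): {path_range:.2f}")
print(f"mean of T_N across paths: {T_N.mean():.3f}, std: {T_N.std():.3f}")
```

The printed cross-path mean and standard deviation should be close to 0 and 1, matching the limiting $N(0,1)$ law, while the within-path range shows the sequence never settles down.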