[Math] Convergence of product of random variables with distribution $U(0, e)$

law-of-large-numbers, probability-theory, probability-limit-theorems, random-variables

Let $X_1, X_2, \ldots$ be a sequence of independent random variables with uniform distribution on $[0, e]$. Let $R_n:=\prod_{k=1}^n X_k$. By Kolmogorov's zero–one law $(R_n)$ converges with probability $0$ or $1$. I want to determine this probability.

Observe that $\log(R_n)=\sum_{k=1}^n \log(X_k)$ with $\mathbb{E}[\log(X_k)]=0$, so by the strong law of large numbers $\log(R_n)/n\to 0$ almost surely. Moreover, by Kolmogorov's three-series theorem, $\log(R_n)$ almost surely does not converge. Unfortunately this does not imply that $R_n$ almost surely fails to converge: it only rules out a nonzero limit, since $R_n$ could still tend to $0$, which corresponds to $\log(R_n)\to-\infty$ and is compatible with $\log(R_n)$ not converging in $\mathbb{R}$. Any ideas?
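For completeness, the claim $\mathbb{E}[\log(X_k)]=0$ follows from a direct computation with the uniform density $1/e$ on $[0,e]$:

```latex
\mathbb{E}[\log(X_k)]
  = \int_0^e \log(x)\,\frac{\mathrm{d}x}{e}
  = \frac{1}{e}\bigl[x\log(x) - x\bigr]_0^e
  = \frac{1}{e}\,(e \cdot 1 - e)
  = 0 .
```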

Best Answer

The sequence is almost surely non-convergent.

Suppose, for contradiction, that the sequence converges almost surely. By the Hewitt–Savage zero–one law the limiting random variable must be almost surely constant.

If the limit $c$ is not zero, the continuous mapping theorem gives that $\log(R_n)$ converges almost surely to the constant $\log(c)$, and hence $\frac{\log(R_n)}{\sqrt{n}}\to 0$ almost surely. This contradicts the central limit theorem, by which $\frac{\log(R_n)}{\sqrt{n}}$ converges weakly to a nondegenerate Gaussian distribution.
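To see why the CLT applies with a nondegenerate limit, note that $\log X_1$ has variance exactly $1$: writing $X_1 = eU$ with $U$ uniform on $(0,1)$,

```latex
\log X_1 \;=\; 1 + \log U \;\overset{d}{=}\; 1 - E,
\qquad E \sim \operatorname{Exp}(1),
```

so $\operatorname{Var}(\log X_1)=\operatorname{Var}(E)=1$, and in fact $\frac{\log(R_n)}{\sqrt{n}}$ converges weakly to a standard Gaussian.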

If the limit is zero, we again use that $\frac{\log(R_n)}{\sqrt{n}}$ converges weakly to a centered nondegenerate Gaussian random variable. This implies that $P(R_n > 1) = P(\log(R_n) > 0)$ converges to $\frac{1}{2}$, contradicting the almost sure convergence of $R_n$ to $0$, which would force $P(R_n > 1)\to 0$.
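A quick Monte Carlo sanity check of $P(R_n > 1)\to\frac12$ (a sketch only; the helper name, seed, and sample sizes are mine, and the estimate carries the usual sampling error):

```python
import math
import random

def estimate_p_above_one(n, trials, seed=0):
    """Monte Carlo estimate of P(R_n > 1) = P(log R_n > 0) for
    R_n = X_1 * ... * X_n with X_k i.i.d. Uniform(0, e).
    (Hypothetical helper for illustration only.)"""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        log_r = 0.0
        for _ in range(n):
            # 1 - random() lies in (0, 1], so log never sees 0.
            x = math.e * (1.0 - rng.random())
            log_r += math.log(x)
        if log_r > 0.0:
            hits += 1
    return hits / trials

# With n = 200 the estimate is already close to 1/2.
print(estimate_p_above_one(200, 5000))
```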

Edit: This can actually be simplified significantly. Since $\frac{\log(R_n)}{\sqrt{n}}$ converges weakly to a centered nondegenerate Gaussian random variable, the probability $P(n^{-1/2} \log(R_n) > 1) = P(R_n > \exp(\sqrt{n}))$ converges to a strictly positive number. But this means that $(R_n)$ is not tight and therefore does not even have a weakly convergent subsequence.
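Since $\operatorname{Var}(\log X_1)=1$, the strictly positive limit here is $P(Z > 1)\approx 0.159$ for $Z$ standard normal. A seeded Monte Carlo sketch of this (helper name and parameters are mine):

```python
import math
import random

def estimate_escape_prob(n, trials, seed=1):
    """Monte Carlo estimate of P(R_n > exp(sqrt(n))), i.e.
    P(log R_n > sqrt(n)), for X_k i.i.d. Uniform(0, e).
    (Hypothetical helper for illustration only.)"""
    rng = random.Random(seed)
    threshold = math.sqrt(n)
    hits = 0
    for _ in range(trials):
        # 1 - random() lies in (0, 1], so log never sees 0.
        log_r = sum(math.log(math.e * (1.0 - rng.random()))
                    for _ in range(n))
        if log_r > threshold:
            hits += 1
    return hits / trials

# For n = 400 the estimate should sit near P(Z > 1) ~ 0.159.
print(estimate_escape_prob(400, 5000))
```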