Show that $Y_n := (\prod_{i=1}^{n} X_i)^{1/n}$ converges with probability 1

convergence-divergence, probability-theory

I'm dealing with a problem in probability and statistics and hope some of you can help me!

We have a sequence of independent, identically distributed random variables $(X_n)_{n \in \mathbb{N}}$ taking values in $[0,1]$. I have to show:

a) $Y_n := (\prod_{i=1}^{n} X_i)^{1/n}$ converges with probability $1$.

b) Calculate the exact limit of $Y_n$.

I've already done some calculations, but I'm not sure whether everything is correct.


Some preliminary considerations:
To get rid of the product I took the logarithm: $\ln(Y_n) = \frac{1}{n} \sum_{i=1}^{n} \ln(X_i)$

After taking the logarithm, the random variables $\ln(X_i)$ are still independent and identically distributed.

a) I found a theorem in my lecture notes which states that $\frac{S_n}{n}$ (where $S_n$ is the $n$-th partial sum of the sequence) converges to a finite limit with probability $1$ if the random variables are integrable.

It seems to me that this theorem might fit, but my concern is that $\ln(X_i)$ need not be integrable, since $X_i = 0$ is allowed (the $X_i$ take values in $[0,1]$).

b) This part somehow "smells" to me like Kolmogorov's strong law of large numbers, which states that for a sequence of independent and identically distributed random variables with finite expectation it holds:
$$\lim_{n\rightarrow \infty} \frac{1}{n} \sum_{k=1}^{n}X_k = \mathbb{E}(X_1) \quad \text{a.s.}$$

So the limit would be $\lim_{n} \ln(Y_n) = \mathbb{E}(\ln(X_1))$ almost surely.

But again I don't see why the expectation of $\ln(X_i)$ should be finite, since $X_i = 0$ is possible.

So due to these concerns about $X_i = 0$ I'm not sure whether I'm on the right track, or whether the problem needs to be solved differently.

I would be very grateful if some of you could help me!

Thanks in advance!

pcalc

Best Answer

If $\mathbb{E}(-\log(X_1))<\infty$ then your reasoning works fine and we find that

$$Y_n \to \exp(\mathbb{E}\log(X_1)) \quad \text{almost surely}. \tag{1}$$
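For instance, if one additionally assumes $X_i \sim \mathrm{Unif}(0,1)$ (just an illustration, not part of the problem), then

$$\mathbb{E}\log(X_1) = \int_0^1 \log x \, dx = \bigl[x \log x - x\bigr]_0^1 = -1,$$

so $(1)$ gives $Y_n \to e^{-1}$ almost surely.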

Now consider the case $\mathbb{E}(-\log(X_1))=\infty$. Define a sequence of truncated random variables by

$$Z_n^{(k)} := \min\{k, -\log(X_n)\}= \begin{cases} - \log(X_n), & 0 \leq -\log(X_n) \leq k, \\ k, & \text{otherwise}. \end{cases}$$

The sequence $(Z_n^{(k)})_{n \in \mathbb{N}}$ is independent and identically distributed. Since $\mathbb{E}|Z_n^{(k)}| \leq k < \infty$, the strong law of large numbers gives

$$\frac{1}{n} \sum_{j=1}^n Z_j^{(k)} \xrightarrow[]{n \to \infty} \mathbb{E}(Z_1^{(k)}) \tag{2}$$

almost surely. Since $Z_j^{(k)} \leq - \log(X_j)$ for each $j \in \mathbb{N}$ this implies

$$\liminf_{n \to \infty}\frac{1}{n} \sum_{j=1}^n -\log(X_j) \geq \mathbb{E}(Z_1^{(k)})$$

for all $k \in \mathbb{N}$. Since the monotone convergence theorem gives $\sup_k \mathbb{E}(Z_1^{(k)}) = \mathbb{E}(-\log(X_1))=\infty$ we get

$$\liminf_{n \to \infty} \frac{1}{n} \sum_{j=1}^n -\log(X_j) = \infty$$ i.e.

$$\limsup_{n \to \infty} \frac{1}{n} \sum_{j=1}^n \log(X_j) = -\infty$$

almost surely. Hence, by the continuity of the exponential function,

$$Y_n = \exp\left( \frac{1}{n} \sum_{j=1}^n \log(X_j) \right) \xrightarrow[]{n \to \infty} 0$$

almost surely.
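For a concrete instance of this divergent case (again only an illustration, not part of the problem), suppose $-\log(X_1)$ has the Pareto density $z^{-2}$ on $[1,\infty)$, i.e. $X_1 = e^{-Z}$ with $Z \sim \mathrm{Pareto}(1)$. Then $\mathbb{E}(-\log(X_1)) = \int_1^\infty z^{-1} \, dz = \infty$, while for every $k \geq 1$

$$\mathbb{E}(Z_1^{(k)}) = \mathbb{E}\bigl[\min\{k, Z\}\bigr] = \int_1^k z \cdot z^{-2} \, dz + k \, \mathbb{P}(Z > k) = \log k + 1 \xrightarrow[]{k \to \infty} \infty,$$

which is exactly the behaviour used above, and indeed $Y_n \to 0$ almost surely.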


In summary, we get

$$Y_n \to \exp(\mathbb{E}\log(X_1)) \quad \text{a.s.}$$

where $\mathbb{E}\log(X_1)$ may be $-\infty$, with the convention $\exp(-\infty) := 0$.
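If it helps to see this dichotomy numerically, here is a small simulation sketch using NumPy; the two distributions are just the illustrative examples from above, not part of the problem.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Case E(-log X_1) < infinity: X_i ~ Uniform(0,1), so E[log X_1] = -1 and Y_n -> exp(-1).
x = rng.uniform(size=n)
y_uniform = np.exp(np.mean(np.log(x)))   # geometric mean via log-sum to avoid underflow

# Case E(-log X_1) = infinity: X_i = exp(-Z) with Z ~ Pareto(1), so Y_n -> 0.
z = rng.pareto(1.0, size=n) + 1.0        # classical Pareto(1) on [1, infinity), infinite mean
y_pareto = np.exp(-np.mean(z))

print(y_uniform, np.exp(-1))             # both approximately 0.368
print(y_pareto)                          # very close to 0
```

Computing the geometric mean through the average of the logarithms (rather than multiplying the $X_i$ directly) avoids numerical underflow of the product.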


Remark: We have actually proved the following converse of the strong law of large numbers:

Let $(U_j)_{j \in \mathbb{N}}$ be a sequence of independent, identically distributed, non-negative random variables. If $\mathbb{E}(U_1)=\infty$ then $$\liminf_{n \to \infty} \frac{1}{n} \sum_{j=1}^n U_j = \mathbb{E}(U_1)=\infty \quad \text{a.s.}$$
