I might just be misunderstanding the notation here, but I'm supposed to show that for a sequence $(X_n)$ of i.i.d. random variables, uniformly distributed on $[0,1]$, we have that:
$$
\limsup_{n \rightarrow \infty} \frac{1}{\ln(n)}\ln\left(\frac{1}{X_n}\right) = 1
$$ almost surely.
For me this means that I need to show:
$$
\mathbb{P}\left(\left\{\omega \in \Omega : \limsup_{n \rightarrow \infty} \frac{1}{\ln(n)}\ln\left(\frac{1}{X_n(\omega)}\right) = 1\right\}\right) = 1
$$
where the $\limsup_{n \rightarrow \infty}$ is taken pointwise. I'm supposed to show this using the Borel-Cantelli lemmas, but those lemmas only allow me to make statements about probabilities of the following form:
$$ \mathbb{P}\left(\limsup_{n \rightarrow \infty} \left\{\omega \in \Omega : \frac{1}{\ln(n)}\ln\left(\frac{1}{X_n(\omega)}\right) = 1\right\}\right) = 1
$$
But these two sets aren't equal, are they? Am I misreading the exercise, so that I actually do need to show the second statement, or am I misunderstanding something else?
Best Answer
In general, let $(Y_n)$ be a sequence of independent random variables. The random variable $\limsup_{n\to\infty} Y_n$ is measurable with respect to the tail sigma-algebra, so by Kolmogorov's zero-one law it is almost surely constant. In order to show that $\limsup_{n\to\infty} Y_n$ is equal to the constant $L$, you must show two things: for all $\newcommand{\e}{\varepsilon}\e>0$,
$\mathbb P(Y_n>L+\e\;\text{ i.o.})=0$,
$\mathbb P(Y_n>L-\e\;\text{ i.o.})=1$.
Applying this to $Y_n:= \ln(1/X_n)/\ln(n)=-\ln X_n/\ln n$, you need to show that $$ \mathbb P\left(\frac{-\ln X_n}{\ln n}> 1\pm \e\;\text{ i.o.}\right) $$ is either $0$ or $1$, depending on the sign. It suffices to consider $\e\in(0,1)$, so that $n^{-(1\pm\e)}\le 1$. For $n\ge 2$, $$ \mathbb P\left(\frac{-\ln X_n}{\ln n}>1\pm \e\right)=\mathbb P\left(X_n < n^{-(1\pm \e)}\right)=n^{-(1\pm \e)}. $$ Since the series $\sum_n n^{-(1\pm \e)}$ converges for $+$ and diverges for $-$, the first Borel-Cantelli lemma gives $\mathbb P\left(\frac{-\ln X_n}{\ln n}> 1+ \e\;\text{ i.o.}\right)=0$, and, because the $X_n$ are independent, the second Borel-Cantelli lemma gives $\mathbb P\left(\frac{-\ln X_n}{\ln n}> 1- \e\;\text{ i.o.}\right)=1$.
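As a quick sanity check (not part of the proof), here is a small Monte Carlo sketch of this dichotomy; the sample size `N` and the threshold `eps` are arbitrary choices of mine:

```python
import math
import random

random.seed(0)

N = 200_000
eps = 0.5

count_plus = 0    # exceedances of 1 + eps: sum n^{-(1+eps)} converges, so few
count_minus = 0   # exceedances of 1 - eps: sum n^{-(1-eps)} diverges, so many
tail_max = 0.0    # max of Y_n over the second half, a crude proxy for the limsup

for n in range(2, N + 1):                # start at n = 2 so that ln(n) > 0
    x = 1.0 - random.random()            # uniform on (0, 1], avoids log(0)
    y = -math.log(x) / math.log(n)       # Y_n = -ln(X_n) / ln(n)
    if y > 1 + eps:
        count_plus += 1
    if y > 1 - eps:
        count_minus += 1
    if n > N // 2:
        tail_max = max(tail_max, y)

print(count_plus, count_minus, tail_max)
```

In a typical run the exceedances of $1-\e$ keep occurring (hundreds of them), the exceedances of $1+\e$ stop early (a handful), and the tail maximum hovers near $1$, consistent with $\limsup_n Y_n = 1$ a.s.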
I see now what you were really asking about: you were confused about how the Borel-Cantelli lemmas deal with the lim sup of events, while you want to know how to apply them to the lim sup of random variables. Let me try to make this clear:
For any sequence of random variables $(Y_n)$ and any constant $c$,
$$ \left\{\limsup_{n\to\infty} Y_n \le c\right\} = \bigcap_{k \ge 1}\left(\limsup_{n\to\infty}\{Y_n > c + 1/k\}\right)^c. $$
This identity connects the $\limsup$ of random variables to the complements of the $\limsup$ of certain events.
Similarly, you can prove:
$$ \left\{\limsup_{n\to\infty} Y_n \ge c\right\} = \bigcap_{k \ge 1}\limsup_{n\to\infty}\{Y_n > c - 1/k\}. $$
This is the entire story of how the $\limsup$ of random variables relates to the $\limsup$ of events. To connect this to the first part of my answer, recall that "i.o." means "infinitely often," which is exactly a $\limsup$ of events. That is, $$ \{A_n\text{ i.o.}\}=\limsup_{n\to\infty} A_n=\bigcap_{n\ge 1}\bigcup_{m\ge n} A_m. $$
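Putting the pieces together for the problem at hand, here is a sketch of the final step, with $Y_n = -\ln X_n/\ln n$ and the constant $c = 1$ (the events indexed by $k$ have probability $0$ resp. $1$ by the Borel-Cantelli computation above):

```latex
% For every k: P(Y_n > 1 + 1/k i.o.) = 0 (first Borel-Cantelli lemma),
% and P(Y_n > 1 - 1/k i.o.) = 1 (second Borel-Cantelli lemma, independence).
% A countable intersection of probability-one events has probability one, so:
\begin{align*}
\mathbb{P}\Big(\limsup_{n\to\infty} Y_n \le 1\Big)
  &= \mathbb{P}\Big(\bigcap_{k\ge 1}\{Y_n > 1 + 1/k \text{ i.o.}\}^{c}\Big) = 1,\\
\mathbb{P}\Big(\limsup_{n\to\infty} Y_n \ge 1\Big)
  &= \mathbb{P}\Big(\bigcap_{k\ge 1}\{Y_n > 1 - 1/k \text{ i.o.}\}\Big) = 1.
\end{align*}
% Intersecting these two almost-sure events gives
% P(limsup Y_n = 1) = 1, which is the claim.
```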