We define the independent random variables $(X_n)_{n\in \mathbb N}$, with density functions $$f_{X_n}(x)=\frac 2 {\lambda_n^2}\,x\cdot\mathcal X_{(0,\lambda_n)}(x).$$
First off, I had to show that the events $A:=\{X_n\leq \frac 1 3\lambda_n\}$ and $B:=\{X_n\geq \frac 2 3\lambda_n\}$ occur infinitely often, and here I had no problems. Then I showed that the sequence converges in law to a variable $X$ when $\lambda_n \rightarrow\lambda\lt\infty$, and finally I must determine whether the sequence converges almost surely. The solutions in my text say that, if it did converge almost surely, we would have both $X\geq \frac 2 3\lambda$ and $X\leq \frac 1 3\lambda$, which is impossible, so there is no almost sure convergence. However, the $X_n$ do not converge to a constant random variable, so I don't see why the sequence cannot take infinitely many values both lower than $\frac 1 3\lambda$ and larger than $\frac 2 3 \lambda$; I mean, even if I take $n$ outcomes of the variable $X$, as $n$ goes to infinity I should get infinitely many values lower than $\frac 1 3\lambda$ and infinitely many larger than $\frac 2 3 \lambda$. Can you give me a clarification? Thanks a lot.
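As a sanity check on the first part of the exercise, here is a small simulation sketch (my own illustration, not part of the exercise; for simplicity it assumes $\lambda_n = \lambda = 1$). The CDF on $(0,\lambda_n)$ is $F(x)=x^2/\lambda_n^2$, so we can sample by inverse transform, $X_n = \lambda_n\sqrt U$ with $U$ uniform, and estimate $P(A)=1/9$ and $P(B)=5/9$. Since these probabilities do not shrink and the $X_n$ are independent, the second Borel–Cantelli lemma gives that both events occur infinitely often almost surely.

```python
import math
import random

def sample_X(lam):
    # Inverse-CDF sampling: F(x) = x^2 / lam^2 on (0, lam), so X = lam * sqrt(U).
    return lam * math.sqrt(random.random())

random.seed(0)
lam = 1.0          # assumption for this sketch: lambda_n = lambda = 1
N = 100_000

below = sum(sample_X(lam) <= lam / 3 for _ in range(N))   # event A
above = sum(sample_X(lam) >= 2 * lam / 3 for _ in range(N))  # event B

# P(A) = (lam/3)^2 / lam^2 = 1/9,  P(B) = 1 - (2/3)^2 = 5/9.
print(below / N)   # close to 1/9
print(above / N)   # close to 5/9
```

Both empirical frequencies stay bounded away from $0$, which is exactly what feeds the Borel–Cantelli argument.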
Exercise on almost sure convergence of random variables
Best Answer
Suppose you have a non-random sequence $(x_n)$ such that $x_n \le \lambda_n / 3$ for infinitely many $n$, and $x_n \ge 2\lambda_n / 3$ for infinitely many $n$. Since $\lambda_n \to \lambda$, this forces $\liminf_n x_n \le \lambda/3 < 2\lambda/3 \le \limsup_n x_n$ (assuming $\lambda > 0$), so the sequence $(x_n)$ cannot converge.
For the same reason, almost every realization of the sequence $(X_n)$ fails to converge, so $(X_n)$ does not converge almost surely. (Remember that "$(X_n)$ converges almost surely" means "$\lim_{n \to \infty} X_n$ exists almost surely.") Almost sure convergence is much stronger than convergence in distribution.
Response to comment:
I think your confusion comes from not fully grasping the notions of convergence in distribution and convergence almost surely. You should carefully check the definitions.
Specifically, when checking convergence of $(X_n)$ to $X$ in distribution, it suffices to check that the CDFs converge [at points of continuity of the limit CDF]. It does not matter how the $X_n$ depend on each other; you just consider the sequence of distributions separately. However, when checking for convergence almost surely, you must consider each realization of the sequence $(X_n)$ and check whether it is convergent as a non-random sequence. Here, the dependence between the random variables in the sequence matters very much.
In the example in your comment, you have not specified the sequence of random variables converging to $X \sim \text{Bernoulli}(1/2)$, so there is no way to say that "$0$ and $1$ will occur infinitely many times." There are silly examples where $(X_n)$ converges almost surely to $X$, but $0$ and $1$ do not both appear infinitely many times along a single realization.
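One sketch of such an example (my own construction): take $X \sim \text{Bernoulli}(1/2)$ and set $X_n = X$ for every $n$. Then $X_n \to X$ surely, yet each realized path is constant, so a single path never shows both the value $0$ and the value $1$:

```python
import random

random.seed(1)
X = random.randint(0, 1)          # X ~ Bernoulli(1/2)
path = [X for _ in range(1000)]   # the sequence (X_n): every term equals X

# The path trivially converges (it is constant), but only one of the two
# values 0 and 1 ever appears along it.
print(len(set(path)))  # 1
```

The marginal distribution of each $X_n$ is still Bernoulli$(1/2)$; what changed is the joint dependence, which is exactly what almost sure convergence is sensitive to.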