[Math] Almost sure convergence, uniform distribution

probability-theory, probability-limit-theorems

I have a question about the conclusion of a problem. For a sequence $(X_k)_{k\in \mathbb{N}}$ of independent random variables, each uniformly distributed on $[-1,1]$, and $Y_n = \frac{1}{n}\sum\limits_{k=1}^n \sqrt{k}\, X_k$, show that $Y_n \to 0$ almost surely.

I tried to use Kolmogorov's criterion for the strong law of large numbers, but the series $\sum\limits_{k=1}^\infty \frac{D^2(\sqrt{k}\,X_k)}{k^2} = \sum\limits_{k=1}^\infty \frac{k\,D^2(X_k)}{k^2} = \sum\limits_{k=1}^\infty \frac{1}{3k}$ diverges, so the criterion does not apply. Are there other results similar to Kolmogorov's theorem, or is the conclusion of the problem perhaps wrong?
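As a quick numerical sketch of my own (not part of the problem), the partial sums of that series indeed grow like $\frac{1}{3}\log n$, i.e. it is a scaled harmonic series:

```python
import math

def kolmogorov_series_partial_sum(n):
    """Partial sum of sum_{k=1}^n D^2(sqrt(k) X_k) / k^2 = sum_{k=1}^n 1/(3k),
    using D^2(X_k) = 1/3 for X_k uniform on [-1, 1]."""
    return sum(1.0 / (3.0 * k) for k in range(1, n + 1))

for n in (10**2, 10**4, 10**6):
    s = kolmogorov_series_partial_sum(n)
    # The partial sums track (log n + gamma)/3 with gamma the Euler constant,
    # so the series diverges (logarithmically).
    print(n, s, (math.log(n) + 0.5772156649) / 3)
```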

In the next part of the problem I must compute $\lim\limits_{n\to\infty} \left[\frac{n^n}{\sqrt{n!}}\sin{\frac{\sqrt{1}}{n}}\sin{\frac{\sqrt{2}}{n}}\cdots\sin{\frac{\sqrt{n}}{n}}\right]$. I used characteristic functions to solve it, but I want to know whether I need a.s. convergence, or whether convergence in probability (i.e. the WLLN) is enough to pass to convergence of characteristic functions.
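As a sanity check (my own numerical sketch, not part of the problem), the expression can be evaluated directly by working in logarithms: it equals $\prod_{k=1}^n \frac{\sin(\sqrt{k}/n)}{\sqrt{k}/n}$, and since $\log\sin x-\log x\approx -x^2/6$ for small $x$, one expects the limit $e^{-1/12}$, which the numbers are consistent with:

```python
import math

def sin_product(n):
    """Compute (n^n / sqrt(n!)) * prod_{k=1}^n sin(sqrt(k)/n), working in
    logs to avoid overflow of n^n and n!.  The expression rearranges to
    prod_{k=1}^n sin(x_k)/x_k with x_k = sqrt(k)/n."""
    log_val = 0.0
    for k in range(1, n + 1):
        x = math.sqrt(k) / n
        log_val += math.log(math.sin(x)) - math.log(x)
    return math.exp(log_val)

for n in (10, 1000, 100000):
    print(n, sin_product(n))  # approaches exp(-1/12) ~ 0.9200
```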

Best Answer

The conclusion you are asked to prove is not true, for the following reason. Let $X_{nk}=\frac{\sqrt{k}\,X_k}{n}$, $1\le k\le n$. Then $\mathbb{E}X_{nk}=0$, \begin{gather} \sum_{k=1}^n\mathbb{D}(X_{nk})=\sum_{k=1}^n\frac{k}{3n^2}\to \frac16, \qquad \text{as } n\to\infty,\\ \max_{1\le k\le n}|X_{nk}|\le \frac1{\sqrt{n}}\to 0, \qquad\text{as } n\to\infty. \end{gather} Hence, by the Lévy–Lindeberg central limit theorem, $$ Y_n=\sum_{k=1}^nX_{nk} \xrightarrow{\;d\;} N(0,1/6).$$ Since $Y_n$ converges in distribution to a nondegenerate normal law, ''$Y_n\to 0$ a.s.'' cannot hold. However, you can prove that $$ \frac{Y_n}{\log\log n}\to 0,\qquad \text{a.s.}$$
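The CLT claim above is easy to see numerically. The following Monte Carlo sketch (my own illustration; the sample sizes are arbitrary) simulates $Y_n$ for a moderately large $n$ and checks that its empirical mean is near $0$ while its empirical variance stays near $1/6$ rather than shrinking to $0$:

```python
import math
import random

def sample_Y(n, rng):
    """One realization of Y_n = (1/n) * sum_{k=1}^n sqrt(k) X_k,
    with the X_k i.i.d. uniform on [-1, 1]."""
    return sum(math.sqrt(k) * rng.uniform(-1.0, 1.0) for k in range(1, n + 1)) / n

rng = random.Random(0)          # fixed seed for reproducibility
n, num_samples = 1000, 5000
ys = [sample_Y(n, rng) for _ in range(num_samples)]

mean = sum(ys) / num_samples
var = sum((y - mean) ** 2 for y in ys) / num_samples
# mean is near 0, but the variance stays near 1/6 ~ 0.1667,
# which rules out Y_n -> 0 almost surely.
print(mean, var)
```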