Once you've shown that $n^{-1} S_n$ has the same distribution as $X_1$, it is essentially immediate that $n^{-\gamma} S_n$ does not converge in distribution for $\gamma < 1$. If $\gamma > 1$, the characteristic function argument shows that $n^{- \gamma} S_n \stackrel{\text{d}}{\to} 0$. So $n^{-\gamma} S_n$ converges in distribution precisely when $\gamma \geq 1$.
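As a quick numerical sanity check of the characteristic-function argument (a sketch; `scaled_sum_cf` is just an illustrative helper, not from the original):

```python
import math

def cauchy_cf(t):
    # Characteristic function of a standard Cauchy variable: exp(-|t|).
    return math.exp(-abs(t))

def scaled_sum_cf(t, n, gamma):
    # CF of n^{-gamma} S_n for i.i.d. standard Cauchy summands:
    # phi_{S_n}(t) = phi(t)^n, so phi_{n^{-gamma} S_n}(t) = phi(t / n^gamma)^n
    #             = exp(-n^(1 - gamma) * |t|).
    return cauchy_cf(t / n ** gamma) ** n

# gamma = 1: the CF of n^{-1} S_n equals that of X_1 for every n (stability).
# gamma > 1: the CF tends to 1 pointwise, i.e. n^{-gamma} S_n -> 0 in distribution.
# gamma < 1: the CF tends to the indicator of {t = 0}, so no limit distribution.
```

For $\gamma = 1$ the exponent $n^{1-\gamma}|t| = |t|$ is independent of $n$, which is exactly the stability property used above.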
For the second part, since $\text{E}(|X_1| / k) = \infty$ for every $k \in \mathbb{N}$,
\begin{align}
\sum_{n=1}^{\infty} P(|X_n| / k > n) &= \sum_{n=1}^{\infty} P(|X_n| / n > k) \\
&= \infty ,
\end{align}
which by the second Borel–Cantelli lemma and independence implies $P(|X_n| / n > k \,\, \text{i.o.}) = 1$ for every $k$. Therefore $\limsup_{n \to \infty} |X_n| / n \stackrel{\text{a.s.}}{=} \infty$, which in turn implies $\limsup_{n \to \infty} |S_n| / n \stackrel{\text{a.s.}}{=} \infty$ (since $|X_n| \leq |S_n| + |S_{n-1}|$, boundedness of $|S_n|/n$ would force boundedness of $|X_n|/n$), so $n^{-1} S_n$ does not converge almost surely.
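The divergence of the Borel–Cantelli series can be seen numerically from the exact Cauchy tail (a small sketch; helper names are mine):

```python
import math

def cauchy_tail(x):
    # P(|X_1| > x) for a standard Cauchy variable: 1 - (2/pi) * arctan(x).
    return 1.0 - (2.0 / math.pi) * math.atan(x)

def partial_sum(k, N):
    # Partial sums of sum_n P(|X_n|/n > k) = sum_n P(|X_1| > k n).
    return sum(cauchy_tail(k * n) for n in range(1, N + 1))

# cauchy_tail(k*n) ~ 2 / (pi * k * n), so the partial sums grow like
# (2 / (pi * k)) * log N and diverge for every fixed k.
```

For $k=1$, each tenfold increase in $N$ adds roughly $(2/\pi)\log 10 \approx 1.47$ to the partial sum, confirming the logarithmic (hence divergent) growth.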
I came up with some solutions, although they might be somewhat more complicated than necessary!
- Let $\epsilon>0$. As mentioned, by the standard CLT, $S_n/\sqrt{n} \stackrel{D}{\to}N(0,1)$. Using this fact, choose $b > 0$ and $N_0$ so that, for all $n\ge N_0$, $P(S_n/\sqrt{n} > b ) < \epsilon$. Now, the probability of interest can be decomposed into two terms:
$P(S_n = k^2 \text{ for some } k)=P(S_n = k^2 \text{ for some } k, \; S_n \le b\sqrt{n})+P(S_n = k^2 \text{ for some } k, \; S_n>b\sqrt{n}).$
For $n>N_0$, the second term on the right-hand side is less than $\epsilon$.
Regarding the first term, note that the number of possible values $k^2$ with $0 \le k^2 \le b\sqrt{n}$ is at most $\sqrt{b}\,n^{1/4} + 1$. The most likely single value of $S_n$ is zero (for even $n$; for odd $n$ the mode is $\pm 1$, with the same bound), and $P(S_n=0) ={n \choose n/2}(1/2)^n.$ Using Stirling's formula to approximate the factorials in ${n \choose n/2}$ gives ${n \choose n/2} \le C2^n/\sqrt{n}$, which in turn gives $P(S_n=0) \le C/\sqrt{n}$, where $C$ is an absolute constant. Since this is the most likely value,
$P(S_n = k^2 \text{ for some } k, \; S_n \le b\sqrt{n}) \le \text{[number of terms]} \times \text{[largest possible probability]} \le \frac{C'\sqrt{b}}{n^{1/4}}< \epsilon$
for $n$ sufficiently large, say $n \ge N_1$, where $C'$ is another absolute constant. Thus for $n \ge \max \{N_0,N_1\}$, $P(S_n = k^2 \text{ for some } k) < 2\epsilon$, completing the proof.
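The bound can be checked against the exact walk probabilities (a sketch; the helper functions are mine, computing the binomial formula from the proof):

```python
from math import comb, isqrt

def prob_Sn_equals(n, m):
    # Simple symmetric +/-1 walk: P(S_n = m) = C(n, (n+m)/2) / 2^n
    # when |m| <= n and n + m is even, and 0 otherwise.
    if abs(m) > n or (n + m) % 2 != 0:
        return 0.0
    return comb(n, (n + m) // 2) / 2 ** n

def prob_Sn_is_square(n):
    # P(S_n = k^2 for some integer k >= 0); only squares up to n are reachable.
    return sum(prob_Sn_equals(n, k * k) for k in range(isqrt(n) + 1))
```

Evaluating `prob_Sn_is_square` at increasing even $n$ shows the probability decaying toward zero, consistent with the $n^{-1/4}$ rate obtained above.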
- We want to evaluate
$\lim_{n\to \infty} \frac{\log P(S_n/n > t)}{n} .$
As $-n \le S_n \le n$, the probability is zero for $t>1$ (so the expression is undefined there), and the limit is zero for $t \le 0$, so the interesting case is $t\in(0,1)$. Notice that $S_n = 2B_n - n$, where $B_n$, the number of positive steps, is a Binomial random variable with parameters $n$ and $1/2$.
Two ingredients I use here are:
a) Hoeffding's inequality: $P( B_n > (t+1/2)n) \le e^{-2t^2n}$
b) Tail bounds for the normal distribution: if $Z\sim N(0,1)$ and $t > 0$, $\left(\frac{1}{\sqrt{2 \pi}\,t}-\frac{1}{\sqrt{2 \pi}\,t^3}\right)e^{-t^2/2} \le P(Z>t) \le \frac{1}{\sqrt{2 \pi}\,t}e^{-t^2/2}$
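Both ingredients are easy to verify numerically against exact values (a sketch; `binom_upper_tail` and `normal_sf` are my helper names):

```python
from math import comb, erfc, exp, pi, sqrt

def normal_sf(t):
    # P(Z > t) for Z ~ N(0,1), via the complementary error function.
    return 0.5 * erfc(t / sqrt(2.0))

def binom_upper_tail(n, m):
    # P(B_n > m) for B_n ~ Binomial(n, 1/2), computed exactly.
    return sum(comb(n, j) for j in range(m + 1, n + 1)) / 2 ** n

# (a) Hoeffding at t = 0.1, n = 200: P(B_n > (t + 1/2) n) <= exp(-2 t^2 n).
t, n = 0.1, 200
hoeffding_holds = binom_upper_tail(n, int((t + 0.5) * n)) <= exp(-2 * t * t * n)

# (b) Gaussian tail sandwich at u = 2.
u = 2.0
lower = (1 / (sqrt(2 * pi) * u) - 1 / (sqrt(2 * pi) * u ** 3)) * exp(-u * u / 2)
upper = 1 / (sqrt(2 * pi) * u) * exp(-u * u / 2)
```

At $u=2$ the sandwich gives $0.0203 \le P(Z>2) \approx 0.0228 \le 0.0270$, so the bounds are already reasonably tight there.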
Now, by multiplying and dividing by $P(Z > t\sqrt{n})$ inside the logarithm,
$\frac{\log P(S_n/n > t)}{n}= \frac{1}{n}\log \left(\frac{P(S_n/n > t)}{P(Z > \sqrt{n}t)}\right) + \frac{\log P(Z> t\sqrt{n})}{n}.$
Note that $P(S_n/n > t)= P(B_n > (t/2 + 1/2)n) \le \exp(-t^2n/2)$ by Hoeffding's inequality. With this and the tail bound for the normal distribution,
$\frac{P(S_n/n > t)}{P(Z > t\sqrt{n})} \le \frac{\exp(-t^2n/2)}{\left(\frac{1}{\sqrt{2 \pi}\,t\sqrt{n}}-\frac{1}{\sqrt{2 \pi}\,(t\sqrt{n})^3}\right)\exp(-t^2n/2) } = \frac{1}{\frac{1}{\sqrt{2 \pi}\,t\sqrt{n}}-\frac{1}{\sqrt{2 \pi}\,(t\sqrt{n})^3} }.$
Elementary arguments show that
$\frac{1}{n}\log\left(\frac{1}{\frac{1}{\sqrt{2 \pi}\,t\sqrt{n}}-\frac{1}{\sqrt{2 \pi}\,(t\sqrt{n})^3} }\right) \to 0$ as $n\to \infty$, and $\lim_{n\to \infty} \frac{\log P(Z> t\sqrt{n})}{n} = -t^2/2$, which can be arrived at using L'Hôpital's rule, or the same tail bounds for the normal distribution mentioned above. One caveat: the displayed inequality only bounds the ratio from above, so this argument yields the one-sided bound $\limsup_{n\to \infty} \frac{\log P(S_n/n > t)}{n} \le -t^2/2$ (which is just Hoeffding's inequality again). The exact limit follows from Cramér's theorem for $\pm 1$ steps (or from Stirling's formula applied to the dominant binomial term): $\lim_{n\to \infty} \frac{\log P(S_n/n > t)}{n} = -I(t)$ for $t\in (0,1)$, where $I(t) = \frac{1+t}{2}\log(1+t) + \frac{1-t}{2}\log(1-t) \ge t^2/2$, with $I(t) = t^2/2 + O(t^4)$ as $t \to 0$, so the Gaussian rate is recovered only in the small-$t$ limit.
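As a numerical cross-check from the exact binomial tail (a sketch; `log_tail_rate` and `cramer_rate` are my helper names): the empirical rate respects the Hoeffding bound $-t^2/2$ and is close to the Cramér rate $I(t)$, which exceeds $t^2/2$ for fixed $t \in (0,1)$.

```python
from math import comb, log

def log_tail_rate(n, t):
    # (1/n) * log P(S_n/n > t) for the +/-1 walk, using
    # S_n/n > t  <=>  B_n > (1 + t) n / 2 with B_n ~ Binomial(n, 1/2).
    m = int((1 + t) * n / 2)
    tail = sum(comb(n, j) for j in range(m + 1, n + 1))
    return (log(tail) - n * log(2)) / n

def cramer_rate(t):
    # Large-deviation rate for +/-1 steps (Cramer's theorem):
    # I(t) = ((1+t)/2) log(1+t) + ((1-t)/2) log(1-t).
    return (1 + t) / 2 * log(1 + t) + (1 - t) / 2 * log(1 - t)
```

At $t = 0.5$, $I(t) \approx 0.1308$ while $t^2/2 = 0.125$, and already at $n = 2000$ the exact finite-$n$ rate sits within about $0.003$ of $-I(t)$.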
Best Answer
Define $Z_n:=\frac{S_{2n}}{2n}-\frac{S_n}n$. Then $$A_n:=\frac{S_{2n}}{2n}+Z_n=\frac{S_{2n}-S_n}{n}$$ is independent of $B_n:=\frac{S_n}n$. Suppose, for contradiction, that $B_n\to Y$ in probability. Then $Z_n\to 0$ in probability, so $A_n-B_n=2Z_n\to 0$ in probability. On the other hand, $A_n$ and $B_n$ are independent and each converges in distribution to $Y$, so $A_n-B_n$ converges in distribution to the difference of two independent copies of $Y$; denoting by $\varphi$ the characteristic function of $Y$, we should therefore have $\varphi(t)\varphi(-t)=1$ for each real number $t$. Since $Y$ is necessarily a Cauchy random variable, $\varphi(t)\varphi(-t)=e^{-2|t|}\neq 1$ for $t\neq 0$, and we get a contradiction.
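To make the contradiction concrete, here is a small check (a sketch) of the product $\varphi(t)\varphi(-t)$ for the standard Cauchy characteristic function $\varphi(t)=e^{-|t|}$:

```python
import math

def cauchy_cf(t):
    # Characteristic function of a standard Cauchy variable: exp(-|t|).
    return math.exp(-abs(t))

def cf_product(t):
    # If S_n/n converged in probability, the argument above would force
    # phi(t) * phi(-t) = 1 for all t; for a Cauchy limit the product is
    # exp(-2|t|), which is strictly less than 1 for every t != 0.
    return cauchy_cf(t) * cauchy_cf(-t)
```

The product equals $1$ only at $t=0$, so the required identity $\varphi(t)\varphi(-t)\equiv 1$ fails for the Cauchy limit.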
There is a necessary and sufficient condition for the weak law of large numbers for independent sequences: if $(X_n,n\geqslant 1)$ is an i.i.d. sequence and $\varphi$ is the characteristic function of $X_1$, then $S_n/n$ converges in probability to a constant if and only if $\varphi'(0)$ exists.
Here we cannot use the law of large numbers, because we know in advance that the limit would not be a constant but a Cauchy random variable.