Prove sum of i.i.d. random variables becomes arbitrarily large

probability

I have an unknown continuous distribution with density function $f_X$ and the following properties:

  • Zero mean
  • Symmetric around the mean
  • For all $a,b : 0 \leq a < b$ we have $f_X(a) > f_X(b)$

I'm looking at sums of IID draws from this distribution $Y_n = \sum_{i=1}^nX_i$ and would like to show that such sums get arbitrarily large with probability 1. Formally, I guess it would be that for all $t > 0$
$$\lim_{n\to\infty}P\left(\bigvee_{k=1}^n|Y_k| > t\right) = 1$$
My first thought is to lower bound the disjunction by
$$P\left(\bigvee_{k=1}^n|Y_k| > t\right) \geq P\left(|Y_n| > t\right)$$
hoping that would be enough. Intuitively this feels correct, since the variance grows without bound: if the variance of $f_X$ is $\sigma^2$, then the variance of $Y_n$ is $\sigma_{Y_n}^2 = n\sigma^2$. But all the probability inequalities I know of upper-bound tail probabilities; they don't lower-bound them. E.g., Chebyshev's inequality gives an upper bound on the tail probability $P(|Y_n| > t)$. I know the second moment method exists, but it doesn't work in this case.
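As a sanity check (not a proof), a quick simulation suggests the probability does tend to 1. Here standard normal steps stand in for $f_X$ (an arbitrary choice satisfying the three properties above), and the threshold `t`, the horizons, and the trial count are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Monte Carlo estimate of P(max_{k<=n} |Y_k| > t) for growing n,
# using standard normal steps as one distribution satisfying the
# hypotheses (zero mean, symmetric, density decreasing on [0, inf)).
t = 10.0        # arbitrary threshold for illustration
n_trials = 500  # number of independent random walks per horizon

for n in [10, 100, 1000, 10000]:
    steps = rng.standard_normal((n_trials, n))  # each row: X_1..X_n
    walks = np.cumsum(steps, axis=1)            # each row: Y_1..Y_n
    hit = np.abs(walks).max(axis=1) > t         # did |Y_k| ever exceed t?
    print(f"n={n:6d}  estimated P = {hit.mean():.3f}")
```

The estimated probability climbs toward 1 as the horizon $n$ grows, which is consistent with the claim, though of course it proves nothing.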

How might I approach this problem?

Best Answer

Symmetry and a continuous density are not needed for the conclusion. It is known (and appears as an exercise in Durrett) that if $Y_n=\sum_{k\leq n}X_k$ is a random walk (i.e., a sum of i.i.d. $(X_n)_{n \in \mathbb{N}}$) such that $E[X_1]=0$ and $P(X_1=0)<1$ (i.e., nondegenerate), then
$$-\infty = \liminf_{n\to \infty}Y_n < \limsup_{n\to \infty}Y_n = \infty \quad \text{a.s.}$$
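This is enough for the limit in the question: the events $A_n=\bigvee_{k=1}^n\{|Y_k|>t\}$ increase in $n$, and since $\limsup_{n\to\infty}Y_n=\infty$ a.s., every $t>0$ satisfies $P\left(\bigcup_{n\geq 1}A_n\right)=1$; continuity from below then gives
$$\lim_{n\to\infty}P(A_n)=1.$$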
