Random walks and finiteness of a sum

probability-theory, random-walk, stochastic-processes

Let $S_n = X_1 + \dots + X_n$, $S_0 = 0$, denote a random walk, where the i.i.d. increments satisfy $EX = 0$ and $EX^2 \in (0,\infty)$. I want to show that for any $x \geq 0$ $$ \sum_{n = 0}^\infty P(0 \leq S_n \leq x, \min_{0 \leq k \leq n} S_k \geq 0) < \infty.$$
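As a numerical sanity check (not part of the question), for the simple symmetric $\pm 1$ walk, which satisfies the hypotheses, the probabilities $P(0 \leq S_n \leq x, \min_{0 \leq k \leq n} S_k \geq 0)$ can be computed exactly by dynamic programming, and the partial sums are seen to level off. The choices $x = 2$ and horizon $N = 2000$ below are arbitrary.

```python
# Exact computation of P(0 <= S_n <= x, min_{k<=n} S_k >= 0) for the
# simple symmetric +/-1 random walk, by dynamic programming.
# p[s] = P(S_n = s and S_k >= 0 for all k <= n); paths that would
# step below 0 are discarded (they can never satisfy the min condition again).

def partial_sums(x=2, N=2000):
    p = {0: 1.0}   # S_0 = 0 with probability 1
    total = 1.0    # n = 0 term: S_0 = 0 lies in [0, x]
    sums = [total]
    for n in range(1, N + 1):
        q = {}
        for s, prob in p.items():
            q[s + 1] = q.get(s + 1, 0.0) + 0.5 * prob
            if s >= 1:  # a down-step from 0 would leave [0, oo) and is discarded
                q[s - 1] = q.get(s - 1, 0.0) + 0.5 * prob
        p = q
        total += sum(prob for s, prob in p.items() if s <= x)
        sums.append(total)
    return sums

sums = partial_sums()
# The summands decay like n^{-3/2} for this walk, so the partial sums converge.
```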

It is not enough to bound the summand by $P(\min_{0 \leq k \leq n} S_k \geq 0)$ alone: it is well known that this probability behaves like $n^{-1/2}$, and $\sum_n n^{-1/2}$ diverges. Any ideas?

Note that in our case $\liminf_n S_n = -\infty$ and $\limsup_n S_n = +\infty$.

Best Answer

Let $A_n$ be the event "$0 \leq S_n \leq x$ and $\min_{0 \leq k \leq n} S_k \geq 0$" appearing in the question. By monotone convergence, the sum in question can be rewritten as $$ \sum_{n=0}^\infty P(A_n) = \lim_{t \to \infty} E[\#(n \leq t : A_n)]. $$

Choose $\epsilon > 0$ such that $P(X \leq -\epsilon) = \delta > 0$ (possible since $EX = 0$ and $X$ is not degenerate). Set $c = \lfloor x/\epsilon \rfloor + 1$, so that $c\epsilon > x$. If $S_n \leq x$ and the next $c$ increments all satisfy $X_i \leq -\epsilon$, then $S_{n+c} \leq x - c\epsilon < 0$; since these increments are independent of $S_n$, $$ P(S_{n+c} < 0 \mid S_n \leq x) \geq P(X_{n+1} \leq -\epsilon, \dots, X_{n+c} \leq -\epsilon) = \delta^c > 0. $$ Once the walk goes below $0$, the condition $\min_{0 \leq k \leq m} S_k \geq 0$ fails for every later $m$. It follows that conditional on $A_n$, there is probability at least $\delta^c$ that $A_m$ never occurs for $m \geq n+c$.
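To illustrate with the $\pm 1$ walk (my choice of example, not part of the answer): take $\epsilon = 1$, so $\delta = 1/2$, and $x = 2$, so $c = 3$ and $\delta^c = 1/8$. From the worst starting point $S_n = x = 2$, dipping below $0$ within $3$ steps forces three consecutive down-steps, so there the conditional probability equals $\delta^c$ exactly, and the bound is attained with equality. A seeded Monte Carlo check:

```python
import random

random.seed(0)
eps, delta, x = 1, 0.5, 2
c = x // eps + 1          # c = 3, chosen so that c * eps > x
bound = delta ** c        # delta^c = 1/8

# Estimate P(walk dips below 0 within c steps), starting from the
# worst case s = x; here the only way below 0 is c straight down-steps,
# so the true probability is exactly delta^c.
trials, hits = 20000, 0
for _ in range(trials):
    s = x
    for _ in range(c):
        s += random.choice([-1, 1])
        if s < 0:
            hits += 1
            break
estimate = hits / trials
```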

Now fix some $N \in \mathbb N$ (to avoid worrying about convergence issues), and let $k \in \mathbb N$ vary. We claim that $$ P(\#(n \leq N : A_n) \geq k \cdot c) \leq (1 - \delta^c)^k. $$ Proof: Induct on $k$, the base case $k = 0$ being trivial. Suppose $A_n$ has occurred at least $k \cdot c$ times, and let $n_0$ be the time of the $(kc)$-th occurrence. For $A_n$ to occur at least $c$ more times, it must occur at some time $m \geq n_0 + c$, since there are at most $c - 1$ candidate times strictly between $n_0$ and $n_0 + c$. By the previous paragraph this has probability at most $1 - \delta^c$, independently of the walk up to time $n_0$ (the dip event depends only on $X_{n_0+1}, \dots, X_{n_0+c}$).

Writing $E[\#(n \leq N : A_n)] = \sum_{j \geq 1} P(\#(n \leq N : A_n) \geq j)$, grouping the indices $j$ into blocks of length $c$, and applying the claim gives $$ E[\#(n \leq N : A_n)] \leq c \sum_{k=0}^\infty (1 - \delta^c)^k = \frac{c}{\delta^c}. $$ This bound does not depend on $N$, so it also bounds $\lim_{t \to \infty} E[\#(n \leq t : A_n)]$, which proves that the sum in question is finite.
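As a final sanity check (my own addition, not part of the answer), for the $\pm 1$ walk with $x = 2$ we have $\delta = 1/2$ and $c = 3$, so the claim predicts $P(\#(n \leq N : A_n) \geq 3k) \leq (7/8)^k$, and summing the tail bound gives $E[\#(n \leq N : A_n)] \leq c/\delta^c = 24$. A seeded simulation comparing empirical tails and mean against these bounds:

```python
import random

random.seed(1)
x, delta, c = 2, 0.5, 3
trials, N = 20000, 300
counts = []
for _ in range(trials):
    s, lo, cnt = 0, 0, 0          # position, running minimum, count of A_n
    for n in range(N + 1):
        if n > 0:
            s += random.choice([-1, 1])
            lo = min(lo, s)
        if lo >= 0 and 0 <= s <= x:   # the event A_n
            cnt += 1
    counts.append(cnt)

mean_count = sum(counts) / trials
# Empirical tails P(count >= k*c) vs the geometric bound (1 - delta^c)^k.
tails = {k: sum(cc >= k * c for cc in counts) / trials for k in range(1, 7)}
bounds = {k: (1 - delta ** c) ** k for k in range(1, 7)}
```

The bound $c/\delta^c$ is quite loose here; the empirical mean is a small single-digit number, well below $24$.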
