Stopping time for a sped-up random walk

markov-chains, random, random-walk, stochastic-processes, stopping-times

Let $X_0$ be a random variable taking values in $\mathbb{N}$, with finite expectation and finite exponential moments.

Consider a lazy random walk $(S_n)_{n \in \mathbb{N}}$ on $\mathbb{Z}$ which starts from $S_0 = X_0$ and jumps to the left according to a jump distribution that depends on the current position of the walk:
$$
P(S_{n+1} = S_n - k \mid S_n) = \mathrm{Binom}(S_n, p, k),
$$

where $\mathrm{Binom}(S_n, p, k)$ denotes the probability that a random variable with Binomial distribution and parameters $S_n$ and $p$ equals $k$, and $p \in (0, 1)$ is some fixed parameter.

Let $\tau$ be the first time that the walk is in $(-\infty, 0].$
How can one prove that there exist constants $c_1, c_2 \in (0, \infty)$ such that, for all large enough $m$,
$$
P\Big(\sum_{n=0}^{\tau} S_n > m \Big) \leq c_1 \exp(-c_2 m)?
$$
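For intuition, the walk and the quantity in question can be simulated directly. Below is a minimal Python sketch; the geometric law used for $X_0$ is just an illustrative choice of a distribution with finite exponential moments, not part of the question.

```python
import random

def total_area(x0, p, rng):
    """Simulate the walk from S_0 = x0, where S_{n+1} = S_n - Binom(S_n, p),
    and return sum_{n=0}^{tau} S_n (tau = first hitting time of 0)."""
    s, total = x0, 0
    while s > 0:
        total += s
        s -= sum(rng.random() < p for _ in range(s))  # Binomial(s, p) decrement
    return total  # S_tau = 0 contributes nothing to the sum

def sample_x0(rng, q=0.5):
    """Geometric starting point on {1, 2, ...}: an illustrative X_0 with
    finite exponential moments (hypothetical choice for the demo)."""
    k = 1
    while rng.random() >= q:
        k += 1
    return k

rng = random.Random(0)
p = 0.3
samples = [total_area(sample_x0(rng), p, rng) for _ in range(20000)]

def tail(m):
    """Empirical estimate of P(sum_{n=0}^{tau} S_n > m)."""
    return sum(s > m for s in samples) / len(samples)

# the empirical tail should drop off quickly as m grows
print(tail(10), tail(30), tail(60))
```

Plotting $\log$ of the empirical tail against $m$ should produce a roughly linear decay, consistent with the conjectured exponential bound.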

Suggestion:
I believe the probability should decay exponentially in $m$: the starting position $X_0$ has finite expectation and exponential moments, and I expect the sum $\sum\limits_{n=0}^{\tau} S_n$ to be almost surely finite with exponentially light tails. However, I cannot turn these intuitions into a proof.

Best Answer

Interpret the walk as a population process: $S_n$ counts particles, and at each step every particle dies independently with probability $p$, so that the decrement $S_n - S_{n+1}$ is $\mathrm{Binom}(S_n, p)$ and, equivalently, $S_{n+1} \mid S_n \sim \mathrm{Binom}(S_n, 1-p)$. In particular the walk is non-increasing and cannot jump below $0$, so $\tau$ is the first hitting time of $0$.

Each particle performs one Bernoulli$(p)$ trial per step it is alive and dies at its first success. Hence $\sum\limits_{n=0}^{\tau} S_n$ is the total number of trials performed, i.e. the waiting time for the $X_0$-th success in an i.i.d. sequence of Bernoulli$(p)$ trials, which (given $X_0$) is negative binomial. This is where the additivity $\mathrm{Binom}(a, p) + \mathrm{Binom}(b, p) = \mathrm{Binom}(a+b, p)$ (for independent summands) enters: the deaths accumulated over the steps form a single binomial count of the total number of trials.

https://en.wikipedia.org/wiki/Binomial_distribution#Sums_of_binomials

The sum exceeds $m$ exactly when the first $m$ trials contain fewer than $X_0$ successes, so with $B_m \sim \mathrm{Binom}(m, p)$,
$$
P\Big(\sum\limits_{n=0}^{\tau} S_n > m \,\Big|\, X_0 \Big) = P(B_m \leq X_0 - 1) = I_{1-p}(m - X_0 + 1, X_0),
$$
where $I$ is the regularized incomplete beta function expressing the binomial CDF.

https://en.wikipedia.org/wiki/Binomial_distribution#Cumulative_distribution_function

https://en.wikipedia.org/wiki/Beta_function#Incomplete_beta_function

The last step shouldn't be hard: fix $\delta \in (0, p)$ and split,
$$
P\Big(\sum\limits_{n=0}^{\tau} S_n > m\Big) \leq P(X_0 > \delta m) + P(B_m < \delta m) \leq c_1 \exp(-c_2 m),
$$
where the first term decays exponentially by Markov's inequality applied to $e^{t X_0}$ (using the finite exponential moments of $X_0$), and the second by a Chernoff bound, since $\delta m$ lies below the mean $pm$ of $B_m$.
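The negative-binomial/binomial duality used above can be sanity-checked numerically from the standard pmf/cdf formulas (pure standard library; the parameter values are arbitrary):

```python
from math import comb

def negbinom_tail(x, p, m):
    """P(waiting time for the x-th success > m) in i.i.d. Bernoulli(p) trials,
    via the negative-binomial pmf P(Sigma = s) = C(s-1, x-1) p^x (1-p)^(s-x)."""
    return 1.0 - sum(comb(s - 1, x - 1) * p**x * (1 - p)**(s - x)
                     for s in range(x, m + 1))

def binom_cdf(m, p, k):
    """P(Binom(m, p) <= k), computed directly from the pmf."""
    return sum(comb(m, j) * p**j * (1 - p)**(m - j) for j in range(k + 1))

# Duality: {Sigma > m} = {fewer than x successes among the first m trials}
p = 0.3
for x in (1, 2, 5):
    for m in (5, 10, 40):
        assert abs(negbinom_tail(x, p, m) - binom_cdf(m, p, x - 1)) < 1e-9

print("duality verified")
```

The same check could be done against the regularized incomplete beta function $I_{1-p}(m-x+1, x)$, e.g. with `scipy.special.betainc`, if SciPy is available.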
