Probability – Asymptotic Properties of Weighted Random Walks and Infinite Convolutions

Tags: limits-and-convergence, pr.probability, random-walks, real-analysis, stochastic-processes

Let $(X_n)_{n\in\mathbb{N}}$ be a sequence of i.i.d. real random variables, and let $0<c<1$. I'm interested in the asymptotic properties of
$$
\sum_{k=1}^n c^k X_k.
$$

I can prove that this converges a.s. as $n\to\infty$ iff $\mathbb{E}(\max(0,\log|X_1|))<\infty$.
To be more specific:

  • Are there known conditions for $\limsup\limits_{n\to\infty}\sum_{k=1}^n c^k X_k=\infty$ a.s.?
  • As far as I know, the asymptotic behaviour of $c^n\sum_{k=1}^nX_k$ is fairly well understood (cf. "A note on Feller's strong law of large numbers" by Chow and Zhang (1986)). Is there a known connection (or difference) between the asymptotic behaviour of $c^n\sum_{k=1}^nX_k$ and that of $\sum_{k=1}^n c^k X_k$? E.g. is it possible that $\lim\limits_{n\to\infty}c^n\sum_{k=1}^nX_k=\infty$ a.s., while $\lim\limits_{n\to\infty}\sum_{k=1}^n c^k X_k<\infty$ a.s.? I can only prove that if $\lim\limits_{n\to\infty}c^n\sum_{k=1}^nX_k=\infty$ a.s., then $\sum_{k=1}^n c^k X_k$ diverges a.s.
  • I'm having trouble constructing examples where $\lim\limits_{n\to\infty}\sum_{k=1}^n c^k X_k=\infty$ a.s. Does anybody know a way to construct such examples?

I would be very grateful for any advice or reference!

Best Answer

Let $(X_n)_{n\in\mathbb{N}}$ be i.i.d. real random variables and let $0<c<1$.
Then the following are equivalent:

(a) There exists $r>0$ such that $P(|X_k|>e^{rk} \; \, \text{infinitely often} )=0$.

(b) $\mathbb{E}(\max(0,\log|X_1|))<\infty$.

(c) For all $r>0$, we have $P(|X_k|>e^{rk} \; \, \text{infinitely often} )=0$.

(d) The series $ \sum_{k=1}^n c^k X_k $ converges a.s.

(e) $\lim _n c^n S_n =0 \;$ a.s., where $S_n:=\sum_{k=1}^n X_k$.

Proof: For a random variable $Y \ge 0$, we have $$\sum_{k=1}^\infty {\bf 1}_{\{Y \ge k\}} \le Y \le \sum_{k=0}^\infty {\bf 1}_{\{Y \ge k\}} \,,$$ so taking expectations gives the well-known inequalities $$\sum_{k=1}^\infty P(Y \ge k) \le \mathbb{E}(Y) \le \sum_{k=0}^\infty P(Y \ge k) \,.$$ Thus by the Borel-Cantelli lemma, if $Y_k \ge 0$ are i.i.d. and $r>0$, then $$\mathbb{E}(Y_1) <\infty \Longleftrightarrow \mathbb{E}( Y_1/r) <\infty \Longleftrightarrow $$ $$ \sum_k P(Y_k \ge {rk}) <\infty \Longleftrightarrow P(Y_k \ge {rk} \; \, \text{infinitely often} )=0 \, . \tag{*}$$
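The sandwich above can be checked numerically. Here is a tiny exact verification (the three-point distribution is my own illustrative choice, not part of the answer; $Y$ is non-integer-valued so that both inequalities are strict):

```python
from fractions import Fraction

# Y uniform on {1/2, 3/2, 5/2}; exact arithmetic via Fraction.
support = [Fraction(1, 2), Fraction(3, 2), Fraction(5, 2)]
probs = [Fraction(1, 3)] * 3

mean = sum(y * p for y, p in zip(support, probs))   # E[Y] = 3/2

def tail(k):
    """P(Y >= k), computed exactly from the distribution."""
    return sum((p for y, p in zip(support, probs) if y >= k), Fraction(0))

lower = sum(tail(k) for k in range(1, 4))   # tails vanish for k >= 3
upper = tail(0) + lower                     # adds P(Y >= 0) = 1

print(lower, "<=", mean, "<=", upper)       # 1 <= 3/2 <= 2
assert lower <= mean <= upper
```

For integer-valued $Y$ the lower bound is in fact an equality, $\mathbb{E}(Y)=\sum_{k\ge 1}P(Y\ge k)$, which is why a non-integer support was used here.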

Applying this to $Y_k=\max(0,\log|X_k|)$ shows that $ \;(a) \Longrightarrow(b) \Longrightarrow (c)$. Obviously $(c) \Longrightarrow (a)$.

Clearly $(d) \Longrightarrow (a)$: the terms of a convergent series tend to $0$, so a.s. $|X_k| \le c^{-k}$ for all large $k$. Given (c), pick $r \in (0, \log(1/c))$; then a.s. $c^k|X_k| \le (ce^r)^k$ for all large $k$, so the series converges absolutely, giving $(c) \Longrightarrow (d)$. Moreover $|S_n| = O(e^{rn})$ a.s., so $c^n|S_n| = O((ce^r)^n) \to 0$, giving $(c) \Longrightarrow (e)$.

Finally, suppose (e) holds. Then $c^n X_n =(c^n S_n) - c(c^{n-1}S_{n-1}) \to 0$ a.s., so a.s. $|X_n| \le c^{-n}$ for all large $n$, which is (a) with $r=\log(1/c)$.

QED
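As an aside (not part of the original answer): the equivalence (b) $\Leftrightarrow$ (d) also yields the examples asked for in the last bullet of the question. Take $X_k \ge 0$ i.i.d. with $\mathbb{E}(\max(0,\log X_1))=\infty$, e.g. $\log X_1$ Pareto with $P(\log X_1 > t)=1/t$ for $t\ge 1$; then the series diverges a.s., and since its partial sums are nondecreasing, $\lim_n \sum_{k=1}^n c^k X_k = \infty$ a.s. A quick NumPy sketch (distribution choices are illustrative) compares the log-magnitudes of the terms $c^k X_k$ in a light-tailed and a heavy-tailed case; the series can only converge if these tend to $-\infty$:

```python
import numpy as np

rng = np.random.default_rng(0)
c = 0.5
n = 300
k = np.arange(1, n + 1)

# Light tails: X ~ N(0,1), so E[log^+|X_1|] < oo and sum c^k X_k
# converges a.s.; log|c^k X_k| drifts to -infinity roughly linearly.
x_light = rng.standard_normal(n)
light_log_terms = k * np.log(c) + np.log(np.abs(x_light))

# Heavy tails: log X = Z with P(Z > t) = 1/t for t >= 1 (Pareto(1)),
# so E[log^+ X_1] = E[Z] = oo.  Working in log-space avoids overflow:
# log(c^k X_k) = Z_k - k * log(1/c).
z = 1.0 / rng.uniform(size=n)            # Pareto(1) samples, Z >= 1
heavy_log_terms = z + k * np.log(c)

# A series can only converge if its terms tend to 0, i.e. if the
# log-terms tend to -infinity.
print("light: largest log-term for k > 100:", light_log_terms[100:].max())
print("heavy: terms c^k X_k exceeding 1:", int((heavy_log_terms > 0).sum()))
```

In the light-tailed run the late log-terms sit far below $0$, while in the heavy-tailed run individual terms $c^k X_k$ keep exceeding $1$, so the partial sums cannot settle down.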
