Self Normalized Sum of Products of i.i.d. Random Variables

limit-theorems, pr.probability, probability-distributions

Let $p\in (0,1)$ and $X_1, X_2, \dots, X_n \sim \text{Bern}(p)$ be $n$ i.i.d. Bernoulli random variables, where the probability that $X_i$ is $1$ equals $p$.

Fix $a,b>0$ different from $1$ that satisfy $a^p b^{1-p} = 1$, and define $C_i = X_i(a-b)+b$. In other words, $C_i$ is $a$ when $X_i$ is $1$, and $b$ when $X_i$ is $0$.

I am interested in the behavior of the random variables
$$Z_n = \frac{\sum_{i = 1}^n\left(\prod_{j = 1}^iC_j\right)X_i}{\sum_{i = 1}^n\prod_{j = 1}^i C_j} $$
as $n\to \infty$. Does $Z_n$ converge a.s.? Does $Z_n$ converge in distribution?

Note that the product of the $C_j$'s is sometimes very small and sometimes very large: by the central limit theorem, $\frac{1}{\sqrt n} \sum_{i=1}^n \log C_i$ converges in distribution to a normal $\mathcal N (0, \sigma^2)$, and the centering at $0$ follows from the condition $a^p b^{1-p} = 1$.
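Explicitly, writing $\log C_i = \log b + X_i \log(a/b)$,
$$\mathbb E[\log C_i] = p\log a + (1-p)\log b = \log\!\left(a^p b^{1-p}\right) = 0, \qquad \sigma^2 = \operatorname{Var}(\log C_i) = p(1-p)\left(\log \frac{a}{b}\right)^2,$$
so the partial sums $\sum_{i=1}^n \log C_i$ form a mean-zero random walk.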

I ran some numerical experiments in Mathematica that suggest that $Z_n$ does converge to a constant, but this constant (presumably the limit of the means of $Z_n$) is a non-trivial function of $a$, $b$, and $p$. Indeed, the mean of $Z_n$ is difficult to compute, and does not seem to simplify nicely.
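For concreteness, here is a minimal simulation sketch of the kind of experiment I mean, written in Python/NumPy rather than Mathematica; the parameter values are illustrative only (not the ones from my actual experiments), and $b$ is chosen to enforce $a^p b^{1-p} = 1$.

```python
import numpy as np

rng = np.random.default_rng(0)
p, a = 0.5, 2.0
b = a ** (-p / (1 - p))   # forces a^p * b^(1-p) = 1

def sample_Z(n, trials=2000):
    """Draw `trials` independent copies of Z_n."""
    X = rng.random((trials, n)) < p           # Bernoulli(p) indicators
    C = np.where(X, a, b)                     # C_i = a if X_i = 1, else b
    logP = np.cumsum(np.log(C), axis=1)       # log of the partial products
    logP -= logP.max(axis=1, keepdims=True)   # rescale each row; Z_n is unchanged
    P = np.exp(logP)
    return (P * X).sum(axis=1) / P.sum(axis=1)

for n in (1_000, 4_000, 16_000):
    Z = sample_Z(n)
    print(f"n = {n:6d}  mean = {Z.mean():.4f}  std = {Z.std():.4f}")
```

If $Z_n$ really does converge to a constant, the sample standard deviation should shrink (slowly) as $n$ grows, while the sample mean stabilizes.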

I am not a probabilist, so any resource that deals with this kind of random variable would be helpful.

Best Answer

James, using the fact that $X_i$ only takes on two values, write $C_iX_i = xC_i + y$ for constants $x, y$ (computed explicitly at the end). Then your numerator is
$$\sum_{i=1}^n\Big(\prod_{j=1}^i C_j\Big)X_i = x\sum_{i=1}^n\prod_{j=1}^i C_j + y\sum_{i=1}^n\prod_{j=1}^{i-1} C_j = (x+y)\sum_{i=1}^n\prod_{j=1}^i C_j + y\Big(1-\prod_{j=1}^n C_j\Big),$$
so that
$$Z_n = (x+y) + \frac{y\left(1-\prod_{j=1}^n C_j\right)}{\sum_{i=1}^n\prod_{j=1}^i C_j}.$$
It therefore suffices to show that
$$\frac{\prod_{j=1}^n C_j}{\sum_{i=1}^n\prod_{j=1}^i C_j}\to 0 \qquad\text{and}\qquad \frac{1}{\sum_{i=1}^n\prod_{j=1}^i C_j}\to 0,$$
where I am going to show the convergence in probability.

Lemma: if $S_n$ is a mean-zero random walk, then $\dfrac{e^{S_n}}{\sum_{i=1}^n e^{S_i}} \to 0$ in probability.

Proof: writing $\xi_i$ for the increments of the walk,
$$\frac{e^{S_n}}{\sum_{i=1}^n e^{S_i}} = \frac{1}{1 + e^{-\xi_n} + e^{-\xi_n-\xi_{n-1}} + \cdots + e^{-\xi_n-\xi_{n-1}-\cdots-\xi_2}}.$$
The exponents in the denominator are the partial sums of the time-reversed (and negated) walk, which is again a mean-zero random walk; its running maximum pretty obviously tends to infinity in probability (e.g., break it up into returns to $0$), so the denominator blows up and the ratio goes to $0$ in probability. The same observation, applied to the walk $S_i$ itself, shows that $\sum_{i=1}^n e^{S_i}\to\infty$, which takes care of the second ratio above.

Now apply the lemma with $S_i = \sum_{j=1}^i \log C_j$, which is a mean-zero random walk precisely because $a^p b^{1-p} = 1$. So the conclusion is that your expression converges in probability to $x+y$. To do this explicitly, $X_i = (C_i-b)/(a-b)$ and $X_iC_i = aX_i = a(C_i-b)/(a-b)$, so $x = \frac{a}{a-b}$, $y = -\frac{ab}{a-b}$, and the limit I am claiming is
$$x+y = \frac{a(1-b)}{a-b}.$$
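As a quick numerical sanity check of the claimed limit, here is a short NumPy sketch (the parameter values are arbitrary, with $b$ again chosen so that $a^p b^{1-p} = 1$):

```python
import numpy as np

rng = np.random.default_rng(1)
p, a = 0.3, 2.0
b = a ** (-p / (1 - p))                    # enforce a^p * b^(1-p) = 1
n, trials = 20_000, 1_000

X = rng.random((trials, n)) < p            # Bernoulli(p) indicators
C = np.where(X, a, b)                      # C_i = a if X_i = 1, else b
logP = np.cumsum(np.log(C), axis=1)
logP -= logP.max(axis=1, keepdims=True)    # row-wise rescaling leaves Z_n unchanged
P = np.exp(logP)
Z = (P * X).sum(axis=1) / P.sum(axis=1)

print("empirical mean of Z_n      :", Z.mean())
print("claimed limit a(1-b)/(a-b) :", a * (1 - b) / (a - b))
```

If the argument above is right, the two printed numbers should be close for $n$ this large; the correction term $y\big(1-\prod_{j} C_j\big)/\sum_{i}\prod_{j\le i} C_j$ need not be tiny at moderate $n$, so do not expect many digits of agreement.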
