[Math] Sum-Product of Random Variables

probability, probability-distributions, probability-theory

Let $X_i$ be an iid sequence of random variables with support in $(0,1)$. I'm looking for references (even just a name) for the following infinite sum random variable:

$$S:=X_1+X_1X_2+X_1X_2X_3+X_1X_2X_3X_4+\cdots.$$

This came up in a waiting-time problem. I can easily calculate the expected value and variance of the above sum, but I'm interested in whether other people have studied this in the literature, specifically whether there are asymptotics for the limiting distribution. I presume there are also issues when $P(X>1-\epsilon)$ falls off too slowly as $\epsilon$ decreases.
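For concreteness, here is a quick Monte Carlo sanity check of the moments. This sketch is not from the original post: it assumes $X_i\sim\mathrm{Uniform}(0,1)$ purely for illustration, truncates the series at a fixed number of terms, and compares the sample mean with $E[S]=E[X]/(1-E[X])=1$ and the sample variance with the value $1/2$ that this particular choice gives.

```python
import numpy as np

# Monte Carlo sketch (illustrative): sample S = X1 + X1*X2 + X1*X2*X3 + ...
# assuming X_i ~ Uniform(0,1); the infinite series is truncated at n_terms.
rng = np.random.default_rng(0)

n_samples = 100_000
n_terms = 60                               # products of uniforms decay fast, so this is plenty

X = rng.uniform(size=(n_samples, n_terms))
partial_products = np.cumprod(X, axis=1)   # row k: X1, X1*X2, X1*X2*X3, ...
S = partial_products.sum(axis=1)           # truncated version of the infinite sum

mu = 0.5                                   # E[X] for Uniform(0,1)
print("sample mean:", S.mean(), "vs mu/(1-mu) =", mu / (1 - mu))
print("sample var :", S.var(), "vs 1/2 for this choice of X")
```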

Best Answer

Let $\Sigma_1=\sum_{i=1}^{\infty}\prod_{j=1}^{i}X_j$ and notice that $\Sigma_1=X_1\left(1+\sum_{i=2}^{\infty}\prod_{j=2}^{i}X_j\right)=X_1(1+\Sigma_2)$, where $\Sigma_2=\sum_{i=2}^{\infty}\prod_{j=2}^{i}X_j$.

Also note that $\Sigma_1$ and $\Sigma_2$ have the same distribution (this can be seen by relabeling the indices).

Let us also assume that each $\Sigma_i$ has a pdf $f_{\Sigma}(\sigma)$.

Define $Y=1+\Sigma_2$, so that $f_Y(y)=f_{\Sigma}(y-1)$.

Since $\Sigma_2$ does not involve $X_1$, the two are independent (and hence so are $X_1$ and $Y$).
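As an aside (not part of the original answer, just a standard consequence of this identity): writing $\mu_k=E[X^k]$ and using $\Sigma_1=X_1(1+\Sigma_2)$ with $X_1$ independent of $\Sigma_2$ and $\Sigma_2$ distributed like $\Sigma_1$, the moments mentioned in the question drop out immediately:

$$E[\Sigma]=\mu_1\,(1+E[\Sigma])\ \Rightarrow\ E[\Sigma]=\frac{\mu_1}{1-\mu_1},\qquad E[\Sigma^2]=\mu_2\,E[(1+\Sigma)^2]\ \Rightarrow\ E[\Sigma^2]=\frac{\mu_2\,(1+2E[\Sigma])}{1-\mu_2}.$$

Both are finite because the support of $X$ lies in $(0,1)$, so $\mu_1<1$ and $\mu_2<1$.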

Finally, since $\Sigma_1=X_1Y$ is a product of independent random variables, we can write down:

$f_{\Sigma}(\sigma)=\int_{\mathscr{Y}}{f_X(\sigma/y)\,f_Y(y)\,\frac{dy}{y}}=\int_{x=0}^{1}{f_X(x)\,f_Y(\sigma/x)\,\frac{dx}{x}}=\int_{x=0}^{1}{f_X(x)\,f_{\Sigma}(\sigma/x-1)\,\frac{dx}{x}}$

For an arbitrary $f_X(x)$ this integral equation cannot, in general, be solved in closed form; even the existence of $f_{\Sigma}$ is not guaranteed.

One trivial solution for the $(X,\Sigma)$ pair is $f_X(t)=f_\Sigma(t)=\delta(t)$.

One trivial $X$ for which $\Sigma$ has no (non-heavy-tailed) pdf is $f_X(t)=\delta(t-1)$: every term equals $1$ and the sum diverges.
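For a concrete non-trivial choice, one can at least attack the integral equation numerically by fixed-point iteration. The sketch below is not from the original answer: it assumes $X\sim\mathrm{Uniform}(0,1)$ (so $f_X\equiv 1$ on $(0,1)$), truncates the support of $\Sigma$ at an arbitrary cutoff, and uses crude Riemann sums and interpolation; all grid sizes and the cutoff are illustrative choices.

```python
import numpy as np

# Numerical sketch: iterate  f(sigma) = \int_0^1 f_X(x) f(sigma/x - 1) dx / x
# assuming X ~ Uniform(0,1), i.e. f_X = 1 on (0,1).  The support of Sigma is
# truncated at sigma_max; grids and cutoffs are illustrative.

sigma_max, n_sigma = 15.0, 3000
sigmas = np.linspace(0.0, sigma_max, n_sigma)
dsig = sigmas[1] - sigmas[0]

xs = np.linspace(1e-4, 1.0, 2000)           # keep away from x = 0, where 1/x blows up
dx = xs[1] - xs[0]

f = np.exp(-sigmas)                         # arbitrary starting guess
f /= f.sum() * dsig                         # normalize to a probability density

for _ in range(50):
    args = sigmas[:, None] / xs[None, :] - 1.0                      # sigma/x - 1
    vals = np.interp(args.ravel(), sigmas, f, left=0.0, right=0.0)  # f(sigma/x - 1), 0 off-grid
    vals = vals.reshape(args.shape)
    f = (vals / xs[None, :]).sum(axis=1) * dx                       # Riemann sum over x
    f /= f.sum() * dsig                                             # renormalize

# For Uniform(0,1), E[Sigma] = mu/(1-mu) = 1, so the fitted mean should be close to 1.
print("mean under fitted density:", (sigmas * f).sum() * dsig)
```

Each pass maps the current density to the density of $X(1+\Sigma)$, so under mild conditions the iterates settle toward the fixed point, and the printed mean should land near $\mu_1/(1-\mu_1)=1$ for this choice of $X$.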

So if you can give us a specific $f_X(x)$, we can talk further about the solution; otherwise, the integral equation above is as far as the general answer goes.
