Probability – Approximation of a Random Sum of Random Variables by a Triangular Array

binomial-distribution, levy-processes, limits-and-convergence, pr.probability, probability-distributions

We know that a Poisson distribution can be approximated by a binomial distribution. More precisely, let $(X_{jn})_{1\leq j \leq n}$ be an i.i.d. triangular array such that
$$P[X_{jn}= 1 ] = p_n = 1- P[X_{jn}=0]$$
and:

  1. $p_n \to 0$ as $n \to \infty$;
  2. $np_n \to \lambda$ as $n \to \infty$

So we have the following convergence in distribution:
$$S_n = \sum_{j=1}^n X_{jn} \overset{d}{\to} N= \sum_{j=1}^N 1, \quad N \sim \hbox{Poisson}(\lambda)$$
Thus, if we want to approximate $N\sim \hbox{Poisson}(\lambda)$ by a binomial distribution, we can set $X_{jn} \sim \hbox{Bernoulli}(\lambda/n)$, so that $S_n \sim \hbox{Binomial}(n, \lambda/n)$.

Now, let $(\xi_j)_{j=1}^\infty$ be an i.i.d. sequence of random variables independent of $N$, and consider $Y = \sum_{j=1}^N \xi_j$. Something tells me that I can find a sum $S_n$ that converges in distribution to $Y$:
$$S_n \overset{d}{\to} Y = \sum_{j=1}^N \xi_j, \quad N \sim \hbox{Poisson}(\lambda)$$
Since $Y$ is infinitely divisible, there is a triangular array $(X_{jn})$ such that $S_n$ converges in distribution to $Y$. But I think I should adjust some weights in the summation: $S_n= \sum_{j=1}^n w_j X_{jn}$, where $X_{jn} \sim \hbox{Bernoulli}(\lambda/n)$.

Is there any constructive way to express this sum $S_n$?

Best Answer

$\newcommand\la\lambda$Let $$S_n:=\sum_{j=1}^n\xi_j X_{j,n},$$ where the $\xi_j$'s are iid random variables (r.v.'s) and, for each $n$, the $X_{j,n}$'s are iid r.v.'s independent of the $\xi_j$'s and such that each $X_{j,n}$ has the Bernoulli distribution with parameter $p_n$. Suppose that $n\to\infty$ and $np_n\to\la$ for some real $\la>0$.

Then \begin{equation*} S_n\to Y:=\sum_{j=1}^N\xi_j \tag{1}\label{1} \end{equation*} in distribution, where $N$ is a Poisson r.v. with parameter $\la$ independent of the $\xi_j$'s.
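One can see \eqref{1} at work in a quick Monte Carlo experiment. In the sketch below the jump distribution is taken to be $\xi_j \sim \hbox{Exponential}(1)$ purely for illustration, and the Poisson variable is sampled with Knuth's multiplication method; with $\la = 2$ both sample means should be close to $\la\,E\xi_1 = 2$:

```python
import math
import random

random.seed(0)
lam = 2.0
n = 200          # triangular-array size, so p_n = lam / n
trials = 10000

def sample_Sn():
    # S_n = sum_j xi_j * X_{j,n}, with X_{j,n} ~ Bernoulli(lam/n)
    # and xi_j ~ Exponential(1) (an illustrative choice of jump law)
    p = lam / n
    return sum(random.expovariate(1.0)
               for _ in range(n) if random.random() < p)

def sample_Y():
    # Y = sum_{j=1}^N xi_j with N ~ Poisson(lam), via Knuth's method
    L, k, prod = math.exp(-lam), 0, 1.0
    while True:
        prod *= random.random()
        if prod <= L:
            break
        k += 1
    return sum(random.expovariate(1.0) for _ in range(k))

mean_Sn = sum(sample_Sn() for _ in range(trials)) / trials
mean_Y = sum(sample_Y() for _ in range(trials)) / trials
print(mean_Sn, mean_Y)   # both should be near lam * E[xi_1] = 2
```

Matching higher moments (or the full empirical distributions) tightens the check, but already the means agree well at this sample size.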

This follows easily by the method of characteristic functions: If $f(t):=Ee^{it\xi_1}$ for real $t$, then \begin{equation*} Ee^{itS_n}=(Ee^{it\xi_1 X_{1,n}})^n =(1-p_n+p_n f(t))^n\to e^{\la(f(t)-1)} \tag{2}\label{2} \end{equation*} and \begin{equation*} Ee^{itY}=\sum_{n=0}^\infty P(N=n)f(t)^n =\sum_{n=0}^\infty \frac{\la^n}{n!}\,e^{-\la}f(t)^n =e^{\la(f(t)-1)}, \tag{3}\label{3} \end{equation*} so that $Ee^{itS_n}\to Ee^{itY}$.
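The limit in \eqref{2} can also be verified numerically. This sketch takes $\xi_1 \sim N(0,1)$ (an arbitrary choice, convenient because its characteristic function $f(t) = e^{-t^2/2}$ is real) and compares $(1-p_n+p_n f(t))^n$ with $e^{\la(f(t)-1)}$ for increasing $n$:

```python
import math

lam, t = 2.0, 1.3
f = math.exp(-t**2 / 2)            # cf of xi_1 ~ N(0,1) at t (illustrative)
limit = math.exp(lam * (f - 1))    # the limit e^{lam (f(t) - 1)}

def cf_Sn(n):
    # E e^{i t S_n} = (1 - p_n + p_n f(t))^n with p_n = lam / n
    p = lam / n
    return (1 - p + p * f) ** n

for n in (10, 100, 10000):
    print(n, cf_Sn(n), limit)
```

The gap closes at rate $O(1/n)$, as expected from the expansion $(1 - c/n)^n = e^{-c}(1 + O(1/n))$.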

The result you quoted at the beginning of your post is the special case of \eqref{1} with $\xi_j=1$ for all $j$.


Details on \eqref{2}: \begin{equation*} \begin{aligned} Ee^{it\xi_1 X_{1,n}}&=Ee^{it\xi_1 X_{1,n}}\,1(X_{1,n}=0)+Ee^{it\xi_1 X_{1,n}}\,1(X_{1,n}=1) \\ &=E1(X_{1,n}=0)+Ee^{it\xi_1}\,1(X_{1,n}=1) \\ &=1-p_n+Ee^{it\xi_1}\,E1(X_{1,n}=1) \\ &=1-p_n+f(t) p_n. \end{aligned} \end{equation*} Here we used the equality $Ee^{it\xi_1}\,1(X_{1,n}=1)=Ee^{it\xi_1}\,E1(X_{1,n}=1)$, which holds because $\xi_1$ and $X_{1,n}$ are independent.

Details on \eqref{3}: \begin{equation*} \begin{aligned} Ee^{itY}&=\sum_{n=0}^\infty E1(N=n)e^{itY} \\ &=\sum_{n=0}^\infty E1(N=n)\exp\Big(it\sum_{j=1}^n\xi_j\Big) \\ &=\sum_{n=0}^\infty E1(N=n)\,E\exp\Big(it\sum_{j=1}^n\xi_j\Big) \\ &=\sum_{n=0}^\infty P(N=n)f(t)^n. \end{aligned} \end{equation*} Here we used the equality $E1(N=n)\,\exp\big(it\sum_{j=1}^n\xi_j\big)=E1(N=n)\,E\exp\big(it\sum_{j=1}^n\xi_j\big)$, which holds because the $\xi_j$'s and $N$ are independent.