Counterexample on two identical random variables with Poisson sum

Tags: poisson-distribution, probability, probability-theory, random-variables, statistics

Raikov's theorem states that if $X_{1}$ and $X_{2}$ are two independent random variables such that $X_{1}+X_{2}$ has a Poisson distribution, then each of the two variables also has a Poisson distribution.

Is the same true when $X_{1}$ and $X_{2}$ are not necessarily independent, but identically distributed and have a symmetric joint probability distribution, i.e., $\mathbb{P}(X_{1} = i, X_{2} = j) = \mathbb{P}(X_{1} = j, X_{2} = i)$ for all $i,j$?

Best Answer

The fact that $X_1$ and $X_2$ have a symmetric joint distribution implies that they are identically distributed because $$ P(X_1 = i) = \sum_{j \ge 0} P(X_1 = i, X_2 = j) = \sum_{j \ge 0} P(X_2 = i, X_1 = j) = P(X_2 = i). $$ If you see the joint distribution of $X_1$ and $X_2$ as a non-negative function on $\mathbb N^2$, then the distribution of $X_1$ is determined by adding up entire rows and the distribution of $X_2$ by adding up entire columns. The symmetry implies equality of those two distributions.
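This row-sum/column-sum picture is easy to check numerically. The following sketch (with an arbitrary symmetric matrix on a truncated support, chosen only so the entries sum to $1$) verifies that symmetry of the joint pmf forces the two marginals to coincide:

```python
import numpy as np

# A small symmetric joint pmf on {0,...,3}^2; the values are illustrative
# and the support is truncated purely for the demo.
P = np.array([
    [0.10, 0.05, 0.02, 0.01],
    [0.05, 0.20, 0.03, 0.02],
    [0.02, 0.03, 0.15, 0.04],
    [0.01, 0.02, 0.04, 0.21],
])
assert np.allclose(P, P.T)       # symmetry: P(X1=i, X2=j) = P(X1=j, X2=i)
assert np.isclose(P.sum(), 1.0)  # a valid probability distribution

marginal_X1 = P.sum(axis=1)      # row sums give the law of X1
marginal_X2 = P.sum(axis=0)      # column sums give the law of X2
assert np.allclose(marginal_X1, marginal_X2)  # identically distributed
```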

So if you see it that way, you basically have a non-negative function on $\mathbb N^2$ whose sum along each diagonal $i+j = k$, for fixed $k$, is prescribed (equal to $e^{-\lambda} \lambda^k / k!$), while on each diagonal the only restriction is symmetry. One way to achieve this without $X_1$ or $X_2$ following a Poisson distribution is to put all the mass on the two axes:
$$ \mathbb P(X_1 = i, X_2 = j) = \begin{cases} e^{-\lambda} & \text{ if } i = j = 0 \\ \dfrac{e^{-\lambda} \lambda^j}{2 \cdot j!} & \text{ if } i = 0,\ j > 0 \\ \dfrac{e^{-\lambda} \lambda^i}{2 \cdot i!} & \text{ if } j = 0,\ i > 0 \\ 0 & \text{ if } i > 0 \text{ and } j > 0. \end{cases} $$

This gives the following distribution for $X_1+X_2$: for every $k > 0$,
$$ \mathbb P(X_1+X_2 = k) = \underset{i+j=k}{\sum_{i,j \ge 0}} \mathbb P(X_1=i,\,X_2=j) \\ = \mathbb P(X_1=0,\, X_2=k) + \mathbb P(X_1=k,\, X_2=0) \\ = 2 \left( \frac{e^{-\lambda} \lambda^k}{2 \cdot k!} \right) \\ = \frac{e^{-\lambda} \lambda^k}{k!}, $$
and for $k = 0$, clearly $\mathbb P(X_1+X_2 = 0) = \mathbb P(X_1=0,\,X_2=0) = e^{-\lambda}$. This means $X_1+X_2 \sim \mathrm{Poi}(\lambda)$, and the joint distribution of $X_1$ and $X_2$ is symmetric by construction, so in particular they are identically distributed by the above argument. But for every $i > 0$,
$$ \mathbb P(X_1 = i) = \sum_{j \ge 0} \mathbb P(X_1=i,\, X_2 = j) = \mathbb P(X_1=i,\, X_2=0) = \frac{e^{-\lambda} \lambda^i}{2 \cdot i!}, $$
and for $i = 0$,
$$ \mathbb P(X_1 = 0) = \sum_{j \ge 0} \mathbb P(X_1=0,\, X_2 = j) \\ = e^{-\lambda} + \frac 12 \left( \sum_{j \ge 1} \frac{e^{-\lambda} \lambda^j}{j!}\right) \\ = e^{-\lambda} + \frac 12 (1 - e^{-\lambda} ) \\ = \frac{1+e^{-\lambda}}2. $$
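The whole counterexample can be checked numerically. The sketch below (truncating the support at an arbitrary level $N$, with an arbitrary rate $\lambda = 1.7$) confirms that the anti-diagonal sums reproduce the Poisson pmf while the marginal of $X_1$ lands at $(1+e^{-\lambda})/2$ rather than $e^{-\lambda}$:

```python
from math import exp, factorial

lam = 1.7   # arbitrary rate parameter
N = 60      # truncation level; the tail mass beyond N is negligible

def joint(i, j):
    """The counterexample's joint pmf: all mass sits on the two axes."""
    if i == 0 and j == 0:
        return exp(-lam)
    if i == 0 and j > 0:
        return exp(-lam) * lam**j / (2 * factorial(j))
    if j == 0 and i > 0:
        return exp(-lam) * lam**i / (2 * factorial(i))
    return 0.0

def poisson(k):
    return exp(-lam) * lam**k / factorial(k)

# X1 + X2 is Poisson(lam): sum each anti-diagonal i + j = k.
for k in range(N):
    diag = sum(joint(i, k - i) for i in range(k + 1))
    assert abs(diag - poisson(k)) < 1e-12

# ...but the marginal of X1 is not Poisson: P(X1 = 0) = (1 + e^{-lam})/2.
p0 = sum(joint(0, j) for j in range(N))
assert abs(p0 - (1 + exp(-lam)) / 2) < 1e-9
assert abs(p0 - poisson(0)) > 0.1   # far from the Poisson value
```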

Clearly $X_1$ is not Poisson. The same argument shows that the space of distributions of $X_1$ satisfying $X_1 + X_2 \sim \mathrm{Poi}(\lambda)$, where $X_2$ has the same distribution as $X_1$, is an infinite-dimensional manifold with boundary: on each diagonal $i+j=k$ you may choose the values $p_{ij}$ for $0 \le i \le \lfloor k/2 \rfloor$ freely, subject only to non-negativity and to the diagonal summing to $e^{-\lambda} \lambda^k/k!$, with the remaining entries determined by symmetry.
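To illustrate this freedom, here is a sketch (with hypothetical choices $\lambda = 1$, the $k=4$ diagonal, and a perturbation size $\varepsilon$) that moves mass within one diagonal while preserving symmetry and the diagonal sums, so $X_1+X_2$ stays Poisson but the marginal of $X_1$ changes:

```python
from math import exp, factorial

lam = 1.0
N = 40  # truncation level for the numerical check

def poisson(k):
    return exp(-lam) * lam**k / factorial(k)

def base(i, j):
    # the counterexample's joint pmf (mass on the axes only)
    if i == 0 and j == 0:
        return exp(-lam)
    if i == 0 or j == 0:
        k = i + j
        return exp(-lam) * lam**k / (2 * factorial(k))
    return 0.0

# Perturb the k = 4 diagonal: move mass eps from (0,4)/(4,0) to (1,3)/(3,1).
# Symmetry and every diagonal sum are preserved, so X1 + X2 stays Poisson,
# yet the marginal of X1 changes -- one free parameter among infinitely many.
eps = 0.25 * base(0, 4)

def perturbed(i, j):
    delta = {(0, 4): -eps, (4, 0): -eps, (1, 3): eps, (3, 1): eps}
    return base(i, j) + delta.get((i, j), 0.0)

for k in range(N):
    diag = sum(perturbed(i, k - i) for i in range(k + 1))
    assert abs(diag - poisson(k)) < 1e-12

m_base = [sum(base(i, j) for j in range(N)) for i in range(5)]
m_pert = [sum(perturbed(i, j) for j in range(N)) for i in range(5)]
assert m_base != m_pert    # the marginal moved, the sum did not
```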

Hope that helps,