Doubts about Proof of Durrett Theorem 3.7.4. Thinning of Poisson Process

Tags: poisson-distribution, poisson-process, probability-theory

I am having trouble understanding Durrett's logic in his proof of the thinning of the Poisson process.

Here is the statement of the theorem: the $N_j(t)$ are independent Poisson processes with rates $\lambda P(Y_i = j)$. Here $N(t)$ is assumed to be a Poisson process with rate $\lambda$ (e.g., the number of cars that have arrived at a store by time $t$), and $N_j(t)$ is defined as the number of indices $i \leq N(t)$ with $Y_i = j$, where $Y_i$ is an additional property associated with the $i$-th arrival (e.g., the number of passengers in the $i$-th arriving car).

In Durrett's proof, he first proved the simple case where this additional property $Y_i$ is binary. He defined $P(Y_i = 1) = p$ and $P(Y_i = 2) = 1 - p$, which is fine.

What I am having problems with are:

(1) He asserted that $N_1(t)$ and $N_2(t)$ are Poisson processes without proving it.

This is what I do not understand: isn't it part of the statement to be proved that $N_1(t)$ and $N_2(t)$ are Poisson processes? Why is this true?

(2) He proved that if $X_i = N_i(t + s) - N_i(s)$, then in order to have $X_1 = j$ and $X_2 = k$ there must be exactly $j + k$ arrivals between $s$ and $s + t$, so
$$P(X_1 = j, X_2 = k) = e^{-\lambda t} \frac{(\lambda t)^{j+k}}{(j+k)!}\cdot\frac{(j+k)!}{j!\,k!}\,p^j (1-p)^k = e^{-\lambda p t} \frac{(\lambda p t)^j}{j!}\cdot e^{-\lambda (1-p) t} \frac{(\lambda (1-p) t)^k}{k!}\,.$$
Then he asserted that $X_1 \sim \mathrm{Poisson}(\lambda p t)$ and $X_2 \sim \mathrm{Poisson}(\lambda (1-p) t)$.

My question is: why does the fact that $P(X_1 = j, X_2 = k)$ factors into a product of two Poisson probability mass functions let you conclude that $X_1$ and $X_2$ are independent, that $X_1 \sim \mathrm{Poisson}(\lambda p t)$, and that $X_2 \sim \mathrm{Poisson}(\lambda (1-p) t)$?
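As a sanity check on the claim itself (a quick simulation sketch of my own, not part of Durrett's text; all parameter values and names below are arbitrary choices of mine), the thinned counts do match these Poisson rates numerically:

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary illustrative parameters (my own choices).
lam, p, t = 2.0, 0.3, 1.5
n_trials = 200_000

# Number of arrivals in an interval of length t: N ~ Poisson(lam * t).
N = rng.poisson(lam * t, size=n_trials)

# Each arrival independently has Y_i = 1 with probability p, so given N
# the type-1 count is Binomial(N, p); type 2 gets the remaining arrivals.
X1 = rng.binomial(N, p)
X2 = N - X1

# The theorem predicts X1 ~ Poisson(lam*p*t), X2 ~ Poisson(lam*(1-p)*t),
# independent of each other (so in particular zero covariance).
print("E[X1]  :", X1.mean(), "(theory:", lam * p * t, ")")
print("Var[X1]:", X1.var(), "(theory:", lam * p * t, ")")  # Poisson: var = mean
print("E[X2]  :", X2.mean(), "(theory:", lam * (1 - p) * t, ")")
print("Cov    :", np.cov(X1, X2)[0, 1], "(theory: 0)")
```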

Thank you very much.

Here is the statement and the full proof of the theorem (in the original post these were images, with the questions highlighted in blue):

[Images of Durrett's statement and proof of Theorem 3.7.4, omitted here.]

Best Answer

(1) You are right: it is premature to state that $N_1(t)$ and $N_2(t)$ are Poisson processes before proving it. But he does prove it later.

(2) If $X,Y$ are random variables taking values in $0,1,2,3, \ldots$, and $P(X=j, Y=k)=p(j)q(k)$ for all $j,k$, where $p(\cdot)$ and $q(\cdot)$ are probability distributions, then $$P(X=j)=\sum_k p(j)q(k)=p(j)\sum_k q(k) =p(j)$$ and similarly $$P(Y=k)=q(k)\,.$$ In particular, $P(X=j, Y=k)=P(X=j)P(Y=k)$ for all $j,k$, which is exactly the definition of independence. Applying this with $p(\cdot)$ and $q(\cdot)$ the $\mathrm{Poisson}(\lambda p t)$ and $\mathrm{Poisson}(\lambda (1-p) t)$ mass functions gives the asserted marginal distributions of $X_1$ and $X_2$ and their independence.
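To see this factorization argument in numbers (a small sketch of mine with arbitrary rates, not part of the answer above; SciPy is used only for Poisson pmfs):

```python
import numpy as np
from scipy.stats import poisson

# Arbitrary rates for the two Poisson factors (my own choices).
mu1, mu2 = 0.9, 2.1
ks = np.arange(60)  # truncated support; the tail mass beyond 60 is negligible

# Joint pmf of the factored form P(X=j, Y=k) = p(j) q(k).
joint = np.outer(poisson.pmf(ks, mu1), poisson.pmf(ks, mu2))

# Summing out k recovers p(j), because sum_k q(k) = 1.
marginal_X = joint.sum(axis=1)
print(np.allclose(marginal_X, poisson.pmf(ks, mu1)))  # True

# The joint equals the product of its marginals: X and Y are independent.
marginal_Y = joint.sum(axis=0)
print(np.allclose(joint, np.outer(marginal_X, marginal_Y)))  # True
```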
