On probabilities involving Poisson processes

markov-process, poisson-distribution, poisson-process, probability, stochastic-processes

Question

Arrivals of Buses A and B at a particular bus stop form independent Poisson processes with rate parameters $\lambda_1$ and $\lambda_2$ respectively.

$(a)\quad$ What is the probability that exactly 4 buses (of type A and/or B) arrive in the time interval $[0, t]$?

$(b)\quad$ What is the probability that exactly 3 Bus Bs arrive while I am waiting for a Bus A?

$(c)\quad$ If half of the buses break down before they reach my stop, then what is the probability that not a single bus passes me in the time interval $[0, t]$?

Hints

$(i)\quad$ Recall that $$\int^{\infty}_0 x^{\alpha - 1} e^{-\beta x}\ \mathrm{d}x = \frac {\Gamma(\alpha)} {\beta^{\alpha}}.$$

$(ii)\quad$ In $(c)$, you may assume that thinning has occurred.

My working

$(a)$

Let $W$ be the random variable denoting the number of buses arriving in the interval $[0, t]$.

$$\implies W \sim \mathrm{Poisson}((\lambda_1 + \lambda_2)t)$$

$$\implies \mathbb{P}(W = 4) = \frac {[(\lambda_1 + \lambda_2)t]^4 e^{-(\lambda_1 + \lambda_2)t}} {4!}$$

$(b)$

Let $X$ and $Y$ be the random variables denoting the waiting time for Buses A and B respectively.

$$\implies X \sim \mathrm{Exponential}(\lambda_1)\ \mathrm{and}\ Y \sim \mathrm{Exponential}(\lambda_2)$$

$$\implies \mathbb{P}(Y < X) = \frac {\lambda_2} {\lambda_1 + \lambda_2}$$

Thus, the required probability is $$\left(\frac {\lambda_2} {\lambda_1 + \lambda_2}\right)^3.$$

$(c)$

I have learnt the concept of thinning, but am not sure how to relate it to $(c)$.


As I have only just covered Poisson processes, I would like to know whether my answers to parts $(a)$ and $(b)$ are correct. Any intuitive explanation of the solution to $(c)$ would be greatly appreciated 🙂 Moreover, where does Hint $(i)$ come into play?

Best Answer

a) is correct.
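If you want a quick numerical sanity check of a), here is a minimal Monte Carlo sketch in Python; the rates $\lambda_1 = 0.7$, $\lambda_2 = 1.3$ and horizon $t = 2$ are arbitrary illustrative values, not part of the problem. It relies on the superposition property: the total count over $[0,t]$ is Poisson with mean $(\lambda_1+\lambda_2)t$.

```python
import numpy as np

rng = np.random.default_rng(0)
lam1, lam2, t = 0.7, 1.3, 2.0   # arbitrary illustrative values
n_trials = 200_000

# Counts of the two independent processes on [0, t]; their sum is the
# total number of buses W, which should be Poisson((lam1 + lam2) * t).
counts = rng.poisson(lam1 * t, n_trials) + rng.poisson(lam2 * t, n_trials)
empirical = np.mean(counts == 4)

mu = (lam1 + lam2) * t
theoretical = mu**4 * np.exp(-mu) / 24   # mu^4 e^{-mu} / 4!

print(f"empirical   P(W = 4) ~ {empirical:.4f}")
print(f"theoretical P(W = 4) = {theoretical:.4f}")
```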

c) is solved in the comments: each bus survives independently with probability $\frac12$, so by thinning you just get a Poisson process with a different parameter, namely $(\lambda_1+\lambda_2)/2$. Hence the probability that no bus passes in $[0,t]$ is $e^{-(\lambda_1+\lambda_2)t/2}$.
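A minimal simulation of this thinning argument, with the same illustrative parameter values as above:

```python
import numpy as np

rng = np.random.default_rng(1)
lam1, lam2, t = 0.7, 1.3, 2.0   # arbitrary illustrative values
n_trials = 200_000

# Total arrivals of the combined rate-(lam1 + lam2) process on [0, t];
# each bus then independently survives (reaches the stop) with prob. 1/2.
arrivals = rng.poisson((lam1 + lam2) * t, n_trials)
survivors = rng.binomial(arrivals, 0.5)

empirical = np.mean(survivors == 0)
theoretical = np.exp(-(lam1 + lam2) * t / 2)   # thinned process is Poisson

print(f"empirical   P(no bus in [0,t]) ~ {empirical:.4f}")
print(f"theoretical exp(-(lam1+lam2)t/2) = {theoretical:.4f}")
```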

Now on to the hardest exercise, exercise b).

The heuristic idea is that we want exactly $3$ $B$-buses to appear and then an $A$-bus to appear. The probability that a $B$-bus appears before an $A$-bus is, as you calculated, $$\frac{\lambda_2}{\lambda_1+\lambda_2}.$$ Similarly, the probability that an $A$-bus appears before the $4$-th $B$-bus is $$\frac{\lambda_1}{\lambda_1+\lambda_2}.$$ Therefore, by independence of the two Poisson processes and the memorylessness of each, the probability that exactly $3$ $B$-buses arrive before the first $A$-bus is $$\left(\frac{\lambda_2}{\lambda_1+\lambda_2}\right)^3\left(\frac{\lambda_1}{\lambda_1+\lambda_2}\right).$$ This argument is heuristic only and so not a proof, but the comments suggest that it can be made rigorous using the strong Markov property (there is also a second answer to this problem with a different formalization of this heuristic). Below I make a different, very rigorous argument, using "brute force".
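Before the rigorous argument, here is a Monte Carlo check of this heuristic answer (illustrative rates again, chosen by me): draw the waiting time $X\sim\mathrm{Exponential}(\lambda_1)$ for the first $A$-bus and, given $X$, the number of $B$-buses in $]0,X]$, which is Poisson with mean $\lambda_2 X$ by independence.

```python
import numpy as np

rng = np.random.default_rng(2)
lam1, lam2 = 0.7, 1.3   # arbitrary illustrative values
n_trials = 200_000

# Waiting time for the first A-bus; numpy's exponential takes the mean 1/rate.
X = rng.exponential(1 / lam1, n_trials)
# Given X, the number of B-buses in ]0, X] is Poisson(lam2 * X).
n_b = rng.poisson(lam2 * X)

empirical = np.mean(n_b == 3)
p = lam2 / (lam1 + lam2)
theoretical = p**3 * lam1 / (lam1 + lam2)

print(f"empirical   ~ {empirical:.4f}")
print(f"theoretical = {theoretical:.4f}")
```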


  • Let $N^A_{]0,x]}$ and $N^B_{]0,x]}$ be the random variables that denote the number of buses of type $A$ resp. type $B$ that arrive in the interval $]0,x]$.
  • Let (using Iverson brackets), for each $x\in]0,\infty[$ and $\omega\in\Omega$, $$Y(\omega, x)\overset{\text{Def.}}=[N^B_{]0,x]}(\omega)=3].$$ The random variable $Y(\cdot, x)$ equals $1$ if and only if exactly $3$ type-$B$ buses arrive in the time interval $]0,x]$.
  • Let $X(\omega)\overset{\text{Def.}}=\inf\{x\ge 0: N^A_{]0,x]}(\omega)=1\}$. This is the random variable that denotes the waiting time for the first type-$A$ bus.

Then the probability that exactly $3$ type-$B$ buses arrive before the first type-$A$ bus is $$\mathsf P(\{\omega\in\Omega:N^B_{]0,X(\omega)]}(\omega)=3\}).$$

Theorem. The aforementioned probability equals the average probability that exactly $3$ $B$-buses arrive before a given time $x$, the average being weighted over the distribution of the arrival time of the first $A$-bus. More formally, $$\mathsf P(\{\omega\in\Omega:N^B_{]0,X(\omega)]}(\omega)=3\})=\int_{\mathbb R} \mathsf P(N^B_{]0,x]}=3)\,X_\#\mathsf P(\mathrm dx).$$

While this Theorem is intuitively completely unsurprising, the proof (that I found) is astonishingly hard. I expect that the official solution to this problem does not spend many thoughts on proving the Theorem.

The proof of the Theorem is at the bottom. Once the Theorem is established, the exercise becomes a straightforward calculation (for which one can use hint $(i)$).

Indeed, we compute, since $X$ has an exponential distribution with parameter $\lambda_1$ (exercise), $$\int_{\mathbb R}\mathsf P(N^B_{]0,x]}=3)\,X_\#\mathsf P(\mathrm dx)=\int_{0}^\infty \frac{\lambda_1(x\lambda_2)^3 e^{-x(\lambda_1+\lambda_2)}}{3!}\,\mathrm dx = \frac{\lambda_1\lambda_2^3}{3!}\cdot\frac{\Gamma(4)}{(\lambda_1+\lambda_2)^4}=\frac{\lambda_1\lambda_2^3}{(\lambda_1+\lambda_2)^4},$$ where the integral was evaluated with hint $(i)$, taking $\alpha = 4$ and $\beta = \lambda_1+\lambda_2$ (note that $\Gamma(4)=3!$).
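For the record, this integral can also be verified symbolically; a small sketch using sympy (the symbol names are mine):

```python
import sympy as sp

x = sp.symbols('x', positive=True)
lam1, lam2 = sp.symbols('lambda_1 lambda_2', positive=True)

# Integrand from the text: Exponential(lam1) density times P(N^B_{]0,x]} = 3).
integrand = lam1 * (x * lam2)**3 * sp.exp(-x * (lam1 + lam2)) / sp.factorial(3)
result = sp.integrate(integrand, (x, 0, sp.oo))

print(sp.simplify(result))   # lambda_1*lambda_2**3/(lambda_1 + lambda_2)**4
```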

This coincides precisely with the result obtained above.


Now to the proof of the Theorem.

First let me state a Lemma.

Lemma. The family of random variables $(Y(\cdot, x))_{x\in]0,\infty[}$ and the random variable $X$ are stochastically independent.

Proof of Lemma. Exercise.

Proof of Theorem. (Disclaimer: this proof may not be correct, since the disintegration argument is dubious. However, I am pretty sure that the argument can be saved.) The desired probability on the left-hand side is equal to (why?) $$\int_\Omega Y(\omega, X(\omega))\,\mathsf P(\mathrm d\omega).$$

Now we notice, by the change-of-variables theorem (the "Transformationssatz"; for the notation see Disintegration of pushforward of $(Y\circ (\operatorname{id}\times X))_\#\mathsf P$), $$\int_\Omega Y(\omega, X(\omega))\,\mathsf P(\mathrm d\omega)=\int_{\mathbb R} y\, (Y\circ(\operatorname{id}\times X))_\#\mathsf P(\mathrm dy) .$$

By Disintegration of pushforward of $(Y\circ (\operatorname{id}\times X))_\#\mathsf P$, we have the disintegration (see also the disintegration theorem) $$(Y\circ (\operatorname{id}\times X))_\#\mathsf P(\mathrm dy)=(\pi_2)_\#\big(X_\#\mathsf P(\mathrm dx)\otimes Y(\cdot, x)_\#\mathsf P(\mathrm dy)\big).$$ Therefore, $$\int_{\mathbb R} y\, (Y\circ(\operatorname{id}\times X))_\#\mathsf P(\mathrm dy)=\int_{\mathbb R}\left(\int_{\mathbb R}y\, Y(\cdot, x)_\#\mathsf P(\mathrm dy) \right)\,X_\#\mathsf P(\mathrm dx).$$ But, using the change-of-variables theorem again, we see that $$\int_{\mathbb R}y\, Y(\cdot, x)_\#\mathsf P(\mathrm dy)$$ is just the expected value of the random variable $Y(\cdot, x)$. But this is just (why?) $\mathsf P(\{\omega\in\Omega: N^B_{]0,x]}(\omega)=3\})$. $\square$
By Disintegration of pushforward of $(Y\circ (\operatorname{id}\times X))_\#\mathsf P$, we have the disintegration (see also disintegration Theorem) $$(Y\circ (\operatorname{id}\times X))_\#\mathsf P(\mathrm dy)=(\pi_2)_\#\big(X_\#\mathsf P(\mathrm dx)\otimes Y(\cdot, x)_\#\mathsf P(\mathrm dy)\big).$$ Therefore, $$\int_{\mathbb R} y\, (Y\circ(\operatorname{id}\times X))_\#\mathsf P(\mathrm dy)=\int_{\mathbb R}\left(\int_{\mathbb R}y\, Y(\cdot, x)_\#\mathsf P(\mathrm dy) \right)\,X_\#\mathsf P(\mathrm dx).$$ But, using the Transformationssatz again, we see that $$\int_{\mathbb R}y\, Y(\cdot, x)_\#\mathsf P(\mathrm dy)$$ is just the expected value of the random variable $Y(\cdot, x)$. But this is just (why?) $\mathsf P(\{\omega\in\Omega: N^B_{]0,x]}(\omega)=3\})$. $\square$