Suppose that $X$ is a Poisson random variable with parameter $\lambda$; i.e., $$\Pr[X = x] = e^{-\lambda} \frac{\lambda^x}{x!}, \quad x = 0, 1, 2, \ldots.$$ This random variable models the number of events occurring in a given fixed time period when the average rate of events in such a time period is $\lambda$.
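As a quick numerical aside (not part of the original answer), the PMF above is easy to evaluate directly with the standard library; the sketch below assumes nothing beyond `math`:

```python
import math

def poisson_pmf(x: int, lam: float) -> float:
    """P(X = x) for X ~ Poisson(lam): e^{-lam} * lam^x / x!."""
    return math.exp(-lam) * lam**x / math.factorial(x)

# The PMF sums to 1 over the support (truncated here for illustration).
lam = 3.0
total = sum(poisson_pmf(x, lam) for x in range(100))
```
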
Now, suppose there is an associated event: whenever the original event occurs, the associated event also occurs with probability $p$, independently across events, and the random variable $Y$ counts the number of associated events in the same time period as $X$. Equivalently, the model may be that events come in two types: $X$ counts events of either type, $Y$ counts events of the first type, and each observed event is of the first type with probability $p$, independently of the others.
Then, the goal is to show that the unconditional distribution of $Y$ is also Poisson, but with rate $p \lambda$.
To this end, we note that the conditional distribution of $Y$ given $X$ is binomial with parameters $n = X$ and $p$: given that we have observed $X$ events, the number of $Y$-type events among them is the number of successes in $X$ independent Bernoulli trials, each with success probability $p$. That is, $$\Pr[Y = y \mid X] = \binom{X}{y} p^y (1-p)^{X-y}, \quad y = 0, 1, \ldots, X.$$ This is the key observation.
Now we compute the marginal distribution of $Y$ by the law of total probability, conditioning on $X$:
$$\begin{align*} \Pr[Y = y]
&= \sum_{x=0}^\infty \Pr[Y = y \mid X = x]\Pr[X = x] \\
&= \sum_{x=y}^\infty \Pr[Y = y \mid X = x]e^{-\lambda} \frac{\lambda^x}{x!} \\
&= e^{-\lambda} p^y \sum_{x=y}^\infty \binom{x}{y} (1-p)^{x-y} \frac{\lambda^x}{x!} \\
&= e^{-\lambda} (p\lambda)^y \sum_{m=0}^\infty \binom{y+m}{y} \frac{((1-p)\lambda)^m}{(y+m)!} \\
&= e^{-\lambda} \frac{(p\lambda)^y}{y!} \sum_{m=0}^\infty \frac{((1-p)\lambda)^m}{m!} \\
&= e^{-p\lambda} \frac{(p\lambda)^y}{y!} \sum_{m=0}^\infty e^{-(1-p)\lambda} \frac{((1-p)\lambda)^m}{m!} \\
&= e^{-p\lambda} \frac{(p\lambda)^y}{y!}.
\end{align*}$$
Note here that in the second line the terms with $x < y$ drop out, since $\Pr[Y = y \mid X = x] = 0$ for $x < y$; the fourth line substitutes $m = x - y$; the fifth line uses the identity $\binom{y+m}{y}\frac{1}{(y+m)!} = \frac{1}{y!\,m!}$; and the last sum is the PMF of a Poisson random variable with parameter $(1-p)\lambda$ summed over its entire support, and therefore equals $1$. The final expression we clearly recognize as the Poisson PMF with rate $p\lambda$, as claimed.
This proof demonstrates a phenomenon known as Poisson thinning.
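As a sanity check (an illustration, not part of the original proof), a small Monte Carlo simulation can confirm the thinning result: draw $X \sim \text{Poisson}(\lambda)$, thin it to $Y \mid X \sim \text{Binomial}(X, p)$, and verify that the sample mean and variance of $Y$ are both close to $p\lambda$, as they must be for a Poisson random variable. Only the standard library is assumed; the Poisson sampler uses Knuth's multiplicative method.

```python
import math
import random

rng = random.Random(0)
lam, p, n_trials = 4.0, 0.3, 100_000

def sample_poisson(lam: float, rng: random.Random) -> int:
    # Knuth's method: count uniforms whose running product stays above e^{-lam}.
    threshold = math.exp(-lam)
    k, prod = 0, 1.0
    while True:
        prod *= rng.random()
        if prod < threshold:
            return k
        k += 1

ys = []
for _ in range(n_trials):
    x = sample_poisson(lam, rng)                 # X ~ Poisson(lam)
    y = sum(rng.random() < p for _ in range(x))  # Y | X ~ Binomial(X, p)
    ys.append(y)

mean_y = sum(ys) / n_trials
var_y = sum((y - mean_y) ** 2 for y in ys) / n_trials
# Both should be close to p * lam = 1.2, since Y ~ Poisson(p * lam).
```

A Poisson distribution has equal mean and variance, so agreement of both statistics with $p\lambda$ is stronger evidence than checking the mean alone.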
The Poisson process can be interpreted as the counting process for the total number of events that have occurred, when the waiting times between consecutive events are i.i.d. exponential.
Hence, if the holding/waiting times between some random events are i.i.d. exponential with parameter $\lambda$, then the number of events observed by time $t$ follows a Poisson distribution with parameter $\lambda t$.
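This connection is also easy to check empirically (a hedged sketch, not from the original answer): accumulate i.i.d. exponential inter-arrival times until they exceed $t$, record how many events fit, and compare the sample mean of the counts to $\lambda t$. Only the standard library's `random.expovariate` is assumed.

```python
import random

rng = random.Random(1)
lam, t, n_trials = 2.0, 3.0, 100_000

def count_events(lam: float, t: float, rng: random.Random) -> int:
    # Count arrivals in [0, t] when inter-arrival times are i.i.d. Exp(lam).
    elapsed, n = 0.0, 0
    while True:
        elapsed += rng.expovariate(lam)
        if elapsed > t:
            return n
        n += 1

counts = [count_events(lam, t, rng) for _ in range(n_trials)]
mean_count = sum(counts) / n_trials
# Should be close to lam * t = 6.0, the Poisson(lam * t) mean.
```
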
Best Answer
There are different types of continuous extensions of the Poisson distribution.
I favor the one presented in this paper:
"Continuous counterparts of Poisson and binomial distributions and their properties" by Andrii Ilienko.
They are based on integral representations of the Poisson and binomial distribution functions in terms of the complete and incomplete (Euler) $\Gamma$ and $\operatorname{B}$ functions.