I assume that the following scenarios are counted:
- two customers in 1pm to 2pm slot, none in the other slot
- no customers arrive in 1pm to 2pm slot, two arrive in 3pm to 4pm slot
- one customer in 1pm to 2pm slot, one customer in the other slot
These scenarios are mutually exclusive, so the probability is:
$$\begin{align}
\Pr(\text{exactly two})&=\frac{\Gamma^2 e^{-\Gamma}}{2!}\frac{\Gamma^0 e^{-\Gamma}}{0!}+\frac{\Gamma^0 e^{-\Gamma}}{0!}\frac{\Gamma^2 e^{-\Gamma}}{2!}+\frac{\Gamma^1 e^{-\Gamma}}{1!}\frac{\Gamma^1 e^{-\Gamma}}{1!} \\[2ex]
&=(\tfrac{1}{2}+\tfrac{1}{2}+1)\Gamma^2 e^{-2\Gamma} \\[2ex]
&=2\Gamma^2 e^{-2\Gamma} \\[2ex]
&=2\times7^2 e^{-14}
\end{align}$$
because there are $\Gamma=7$ expected arrivals per hour.
If my interpretation of the question is correct, your answer is off by a factor of 2.
Since Poissonian events are independent of each other and do not depend on when the timing intervals start, the split periods can be treated as a single continuous two-hour period, as André Nicolas does.
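The equivalence of the two views is easy to check numerically. The sketch below (plain Python, using the $\Gamma = 7$ from the answer) computes the slot-by-slot probability from the three scenarios and the probability of exactly two arrivals in a single combined $\mathsf{Pois}(2\Gamma)$ window:

```python
import math

gamma = 7.0  # expected arrivals per hour, as in the answer

# Slot-by-slot: the three disjoint scenarios (2+0, 0+2, 1+1)
# collapse algebraically to 2 * gamma^2 * exp(-2*gamma)
p_split = 2 * gamma**2 * math.exp(-2 * gamma)

# Combined view: one Poisson(2*gamma) window, exactly two arrivals
p_combined = (2 * gamma)**2 * math.exp(-2 * gamma) / math.factorial(2)

# Both give 2 * 49 * e^{-14}, roughly 8.15e-05
```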
Suppose that $X$ is a Poisson random variable with parameter $\lambda$; i.e., $$\Pr[X = x] = e^{-\lambda} \frac{\lambda^x}{x!}, \quad x = 0, 1, 2, \ldots.$$ This random variable models the number of events occurring in a given fixed time period when the average rate of events in such a time period is $\lambda$.
Now, suppose we have an associated event, in the sense that whenever the original event occurs, the associated event has a probability $p$ of occurring at the same time; and the random variable $Y$ counts the random number of associated events in the same given time period as $X$. Or, perhaps the model is that there are events of two types, and that $X$ counts the number of events of either type, whereas $Y$ counts the number of events of the first type, where, given that an event is observed, the probability it is of the first type is $p$.
Then, the goal is to show that the unconditional distribution of $Y$ is also Poisson, but with rate $p \lambda$.
To this end, we note that the conditional distribution of $Y$ given $X$ is binomial with parameters $n = X$ and $p$, because given that we have observed $X$ events, the number of $Y$-type events among these is equivalent to a sum of IID Bernoulli trials on each of the observed events, with probability of success $p$; that is to say, $$\Pr[Y = y \mid X] = \binom{X}{y} p^y (1-p)^{X-y}, \quad y = 0, 1, \ldots, X.$$ This is the key observation.
Now we apply the law of total probability, conditioning on $X$:
$$\begin{align*} \Pr[Y = y]
&= \sum_{x=0}^\infty \Pr[Y = y \mid X = x]\Pr[X = x] \\
&= \sum_{x=y}^\infty \Pr[Y = y \mid X = x]e^{-\lambda} \frac{\lambda^x}{x!} \\
&= e^{-\lambda} p^y \sum_{x=y}^\infty \binom{x}{y} (1-p)^{x-y} \frac{\lambda^x}{x!} \\
&= e^{-\lambda} (p\lambda)^y \sum_{m=0}^\infty \binom{y+m}{y} \frac{((1-p)\lambda)^m}{(y+m)!} \\
&= e^{-\lambda} \frac{(p\lambda)^y}{y!} \sum_{m=0}^\infty \frac{((1-p)\lambda)^m}{m!} \\
&= e^{-p\lambda} \frac{(p\lambda)^y}{y!} \sum_{m=0}^\infty e^{-(1-p)\lambda} \frac{((1-p)\lambda)^m}{m!} \\
&= e^{-p\lambda} \frac{(p\lambda)^y}{y!}.
\end{align*}$$
Note here that the last sum in the derivation is the PMF of a Poisson random variable with parameter $(1-p)\lambda$ summed over its entire support, and therefore equals $1$. The final result we clearly recognize as a Poisson PMF with rate $p\lambda$, as claimed.
This proof demonstrates a phenomenon known as Poisson thinning.
Best Answer
Let $X \sim \mathsf{Binom}(20, .25)$ be the number of defective toys seen in a random sample of 20. You seek $P(X > 2) = 1 - P(X \le 2),$ which is easy to compute using the formula for the binomial PMF. [You'd need to find $P(X = 0)+P(X=1)+P(X=2)$ and subtract from $1.$]
In R statistical software (where `pbinom` is the binomial CDF), the answer is obtained as `1 - pbinom(2, 20, .25)`.

This is not a situation for a good Poisson approximation. The approximating Poisson rate is $\lambda = np = 20 \times 0.25 = 5$ expected defects in 20. The figure below shows the binomial probabilities as bars. Probabilities for $\mathsf{Pois}(5)$ are shown as open circles.
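The gap between the exact binomial answer and the Poisson approximation can be computed directly. This Python sketch (standard library only) mirrors the R calculation:

```python
import math

n, prob = 20, 0.25

# Exact binomial tail: P(X > 2) = 1 - [P(X=0) + P(X=1) + P(X=2)]
p_le2 = sum(math.comb(n, k) * prob**k * (1 - prob)**(n - k)
            for k in range(3))
p_binom = 1 - p_le2  # about 0.9087

# Poisson approximation with lam = n * prob = 5
lam = n * prob
p_pois = 1 - sum(math.exp(-lam) * lam**k / math.factorial(k)
                 for k in range(3))  # about 0.8753
```

The two answers differ by more than 0.03, consistent with the claim that the Poisson approximation is poor here ($p = 0.25$ is too large for the usual rare-events regime).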