The Poisson with parameter $\lambda$ has mean $\lambda$. So when we are approximating a binomial with parameters $n$, $p$ (and therefore mean $np$) by a Poisson, the appropriate parameter $\lambda$ is the mean $np$ of the binomial. But the above can mainly be thought of as a mnemonic, a device to remember the right answer. So we go into more detail.
The following is an informal calculation that can be turned into a formal limit argument. If $X$ is the binomial, then
$$\Pr(X=k)=\binom{n}{k}p^k(1-p)^{n-k}.\tag{1}$$
Use the abbreviation $\lambda$ for $np$. Then $p=\frac{\lambda}{n}$. So we have
$1-p=1-\frac{\lambda}{n}$.
We can then rewrite (1) as
$$\Pr(X=k)=\frac{1}{k!}(n)(n-1)\cdots(n-k+1)\frac{\lambda^k}{n^k}\left(1-\frac{\lambda}{n}\right)^n \left(1-\frac{\lambda}{n}\right)^{-k}.\tag{2}$$
Note that we have done nothing so far, except for the introduction of $\lambda$ as an abbreviation for $np$.
Now imagine $n$ large, and $p$ small, so that $np=\lambda$ stays constant. Let $k$ be fixed. We look at the various terms in Formula (2).
For fixed $k$, if $n$ is large then $(n)(n-1)\cdots(n-k+1)\frac{1}{n^k}\approx 1$ and $\left(1-\frac{\lambda}{n}\right)^{-k}\approx 1$. Also, for $n$ large, $\left(1-\frac{\lambda}{n}\right)^n \approx e^{-\lambda}$.
It follows that the right-hand side of (2) is approximately
$$\frac{1}{k!} \lambda^ke^{-\lambda}.$$
This is precisely the probability that a Poisson with parameter $\lambda$ takes on the value $k$.
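Numerically, these approximations kick in quite quickly. A minimal R check (the values $n = 100$, $\lambda = 3$, $k = 2$ are an illustrative choice, anticipating the example below):

```r
n <- 100; lambda <- 3; k <- 2   # illustrative values

falling <- prod(n:(n - k + 1)) / n^k    # (n)(n-1)...(n-k+1)/n^k, should be near 1
tail_k  <- (1 - lambda / n)^(-k)        # (1 - lambda/n)^(-k), should be near 1
exp_n   <- (1 - lambda / n)^n           # (1 - lambda/n)^n, should be near exp(-lambda)

c(falling = falling, tail_k = tail_k, exp_n = exp_n, target = exp(-lambda))
```

Each factor should come out within several percent of its limiting value, and the agreement improves as $n$ grows.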
In your example, the mean number of rotten apples in a batch of $100$ is $3$, and $n=100$ is reasonably large, so one can expect the Poisson with parameter $3$ to provide a good fit, at least for reasonable $k$.
Remark: We gave a fair bit of detail justifying why in the case $n$ large, $p$ small, and $np=\lambda$ moderate, we can approximate binomial distribution probabilities by Poisson distribution probabilities. But the main answer to your question is that the appropriate $\lambda$ is always the mean of your binomial. If a superbushel holds $150$ apples, the appropriate $\lambda$ is $4.5$.
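As a concrete check, here is a small R sketch comparing the two distributions for the rotten-apple example (a mean of $3$ rotten apples per $100$ corresponds to $p = 0.03$):

```r
k <- 0:6
round(rbind(binomial = dbinom(k, size = 100, prob = 0.03),   # Binom(100, 0.03)
            poisson  = dpois(k, lambda = 3)), 4)             # Pois(3)
```

For these values of $k$ the two rows should differ only in the third decimal place or so.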
Explanation based on DeGroot, second edition, page 256. Consider the binomial distribution with fixed $p$
$$
P(X = k) = {n \choose k}p^k(1-p)^{n-k}
$$
Now define $\lambda = np$ and thus $p = \frac{\lambda}{n}$.
$$
\begin{align}
P(X = k) &= {n \choose k}p^k(1-p)^{n-k}\\
&=\frac{n(n-1)(n-2)\cdots(n-k+1)}{k!}\frac{\lambda^k}{n^k}\left(1-\frac{\lambda}{n}\right)^{n-k}\\
&=\frac{\lambda^k}{k!}\frac{n}{n}\cdot\frac{n-1}{n}\cdots\frac{n-k+1}{n}\left(1-\frac{\lambda}{n}\right)^n\left(1-\frac{\lambda}{n}\right)^{-k}
\end{align}
$$
Let $n \to \infty$ and $p \to 0$ so $np$ remains constant and equal to $\lambda$.
Now
$$
\lim_{n \to \infty}\frac{n}{n}\cdot\frac{n-1}{n}\cdots\frac{n-k+1}{n}\left(1-\frac{\lambda}{n}\right)^{-k} = 1
$$
since each fraction tends to $1$ ($n$ grows at the same rate in the numerator and the denominator), and the last factor tends to $1$ because $\frac{\lambda}{n} \to 0$ while the exponent $-k$ stays fixed. Furthermore
$$
\lim_{n \to \infty}\left(1-\frac{\lambda}{n}\right)^n = e^{-\lambda}
$$
so under our definitions
$$
\lim_{n \to \infty} {n \choose k}p^k(1-p)^{n-k} = \frac{\lambda^k}{k!}e^{-\lambda}
$$
In other words, as the probability of success becomes a rate applied to a continuum, as opposed to discrete selections, the binomial becomes the Poisson.
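A small R sketch of this limit (the values $\lambda = 20$ and $k = 15$ are an arbitrary illustrative choice): holding $np = \lambda$ fixed while $n$ grows, the binomial probability of exactly $k$ successes approaches the Poisson probability.

```r
lambda <- 20; k <- 15                   # illustrative choices
ns <- c(50, 200, 1000, 10000)           # growing n, with p = lambda/n shrinking

data.frame(n        = ns,
           binomial = dbinom(k, size = ns, prob = lambda / ns),
           poisson  = dpois(k, lambda))
```

The binomial column should converge toward the constant Poisson value as $n$ increases.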
Update with key point from comments
Think about a Poisson process. It really is, in a sense, looking at very, very small intervals of time and seeing if something happened. The "very, very small" comes from the requirement that we see at most one occurrence per interval. So what we have is essentially a sum of a huge number of Bernoulli trials, each with a tiny success probability. When we have a finite sum of Bernoulli trials with a fixed success probability, that is binomial. When the number of trials goes to infinity but the mean $np=\lambda$ stays finite, it is Poisson.
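A quick simulation along these lines (a sketch; the number of intervals $n = 10000$, the rate $\lambda = 3$, and the number of repetitions are arbitrary illustrative choices): chop time into many tiny intervals, flip a Bernoulli coin with success probability $\lambda/n$ in each, and compare the distribution of the total count with the Poisson pmf.

```r
set.seed(1)                      # reproducibility
lambda <- 3; n <- 10000          # many tiny intervals, each with success probability lambda/n
reps   <- 100000                 # number of simulated "time lines"

# Sum of n Bernoulli(lambda/n) trials, simulated reps times
counts <- rbinom(reps, size = n, prob = lambda / n)

# Empirical frequencies vs. the Poisson(lambda) pmf; columns correspond to counts 0, 1, ..., 8
emp <- table(factor(counts, levels = 0:8)) / reps
round(rbind(simulated = as.numeric(emp),
            poisson   = dpois(0:8, lambda)), 3)
```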
Best Answer
You are not 'deriving' the Poisson distribution from the binomial distribution. In some circumstances (especially when the binomial $p$ is small and the binomial $n$ is large) it is possible to find a Poisson distribution that approximates a binomial distribution. Such an approximation works best if the Poisson mean $\lambda$ matches the binomial mean $np.$
Using R, one can tabulate selected probabilities for $\mathsf{Binom}(n = 200, p = 0.1)$ alongside those for $\mathsf{Pois}(\lambda = 20)$ and compare them side by side.
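A sketch of such a comparison (the particular values of $k$ shown are an illustrative choice):

```r
n <- 200; p <- 0.1; lambda <- n * p      # lambda = 20, the binomial mean

k <- seq(10, 30, by = 5)                 # selected values of k
round(cbind(k        = k,
            binomial = dbinom(k, size = n, prob = p),
            poisson  = dpois(k, lambda)), 4)
```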
In a graphical comparison of the two distributions, differences in probabilities smaller than $0.003$ are difficult to distinguish by eye.
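A comparison of that kind can be drawn with base R graphics; a sketch (not the original figure):

```r
k <- 0:45
plot(k, dbinom(k, size = 200, prob = 0.1), type = "h", lwd = 2,
     xlab = "k", ylab = "probability",
     main = "Binom(200, 0.1) (bars) vs. Pois(20) (points)")
points(k, dpois(k, lambda = 20), pch = 19, col = "red", cex = 0.6)
```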
Note: Binomial probabilities $X\sim\mathsf{Binom}(n,p)$ have $P(X > n) = 0.$ Technically, Poisson probabilities never reach $0,$ no matter how far into the right tail you go, but for practical purposes they become too small to be of importance beyond some finite point. In particular, for $Y \sim\mathsf{Pois}(\lambda= 20),$ we have $P(Y > 40) \approx 0.$ Markov's inequality does not give a 'tight' bound, but it does guarantee $P(Y \ge k\lambda) \le \lambda/(k\lambda) = 1/k.$
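Both claims are easy to check numerically in R (a brief sketch):

```r
# Exact right-tail probability P(Y > 40) for Y ~ Pois(20): vanishingly small
ppois(40, lambda = 20, lower.tail = FALSE)

# Markov's bound with k = 2: P(Y >= 40) <= 1/2; the exact value is far smaller
ppois(39, lambda = 20, lower.tail = FALSE)   # P(Y >= 40) = P(Y > 39)
```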