The discussion in your book is not phrased correctly in some respects, but first let me address your question about conditioning on an event of probability $0$, something that the definition of conditional probability in the earlier chapter of your book explicitly forbids.
For jointly continuous random variables $X$ and $Y$ with joint pdf $f_{X,Y}(u,v)$, the conditional pdf of $Y$ given that $X = u$ is defined to be
$$f_{Y\mid X}(v\mid u) = \begin{cases}
\displaystyle \frac{f_{X,Y}(u,v)}{f_{X}(u)}, & \text{if }~f_{X}(u)>0,\\
0, & \text{otherwise,}\end{cases}$$
where $f_X(u)$ is the (marginal) pdf of $X$. The conditional complementary CDF is then
$$1-F_{Y\mid X}(t\mid u) = P\{Y > t\mid X = u\} =
\int_t^\infty f_{Y\mid X}(v\mid u) \,\mathrm dv.$$
Now, in your application, $P\{X_2 > t\mid X_1 = s\}$ can be calculated directly: we are told that the first arrival occurred at $s$, and we are asked for the conditional probability that no arrivals occur in $(s,s+t]$. But what happens in $(s,s+t]$ is independent of what happened in $(0,s]$, since the time intervals are disjoint (this is the independent-increments property of the Poisson process). That is, $P\{\text{no arrivals in}~(s,s+t]\mid X_1=s\}$ is the same regardless of whether there was an arrival at $s$ or the first arrival occurred before time $s$, and so
$$P\{X_2 > t\mid X_1 = s\} = P\{\text{no arrivals in}~(s,s+t]\}
= e^{-\lambda t},$$
and thus the conditional pdf $f_{X_2\mid X_1}(v\mid s)$ is the same as the unconditional pdf $f_{X_2}(v) = \lambda e^{-\lambda v}$, $v > 0$. Conditionally or unconditionally, the distribution of $X_2$ is exponential with parameter $\lambda$. Furthermore,
\begin{align}
f_{X_2}(v) = f_{X_2\mid X_1}(v\mid s)
= \frac{f_{X_1,X_2}(s,v)}{f_{X_1}(s)}
\implies f_{X_1,X_2}(s,v) = f_{X_1}(s)\,f_{X_2}(v),
\end{align}
showing that $X_1$ and $X_2$ are independent (exponential random variables
with parameter $\lambda$).
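As a sanity check (my addition, not part of the original answer), here is a minimal simulation sketch. It constructs a Poisson process on $[0,T]$ in the standard way, by scattering $N \sim \text{Poisson}(\lambda T)$ uniform points on the interval, and then checks empirically that $X_1$ and $X_2$ behave like independent $\text{Exponential}(\lambda)$ variables; the rate $\lambda = 2$ and horizon $T = 50$ are arbitrary illustrative choices.

```python
# Sketch: build a Poisson process on [0, T] by scattering N ~ Poisson(rate*T)
# uniform points, then check that the first arrival time X1 and the second
# interarrival time X2 look like independent Exponential(rate) variables.
import numpy as np

rng = np.random.default_rng(0)
rate, T, trials = 2.0, 50.0, 20_000   # arbitrary illustrative choices

x1, x2 = [], []
for _ in range(trials):
    n = rng.poisson(rate * T)
    if n < 2:
        continue                      # need two arrivals (almost never fails)
    arrivals = np.sort(rng.uniform(0.0, T, size=n))
    x1.append(arrivals[0])                # X1 = time of first arrival
    x2.append(arrivals[1] - arrivals[0])  # X2 = second interarrival time

x1, x2 = np.array(x1), np.array(x2)
print("mean of X2 (expect 1/rate = 0.5):", x2.mean())
print("P(X2 > 0.5) (expect e^{-1} = 0.368):", (x2 > 0.5).mean())
print("corr(X1, X2) (expect roughly 0):", np.corrcoef(x1, x2)[0, 1])
```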
The answers to our specific questions are hidden somewhere in the above.
The likelihood function of the Poisson parameter $\lambda$, given observations $x_1, x_2, \ldots, x_n$, is
$$ l(\lambda; x) = \prod_i e^{-\lambda}\frac{\lambda^{x_i}}{x_i!} = \frac{e^{-n\lambda}}{x_1!x_2!\cdots x_n!}\lambda^{x_1 + x_2 + \cdots + x_n}$$
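To see where the maximum sits in general (a standard step, spelled out here for completeness): setting the derivative of the log-likelihood to zero gives
$$\frac{\partial}{\partial \lambda} \log l(\lambda; x) = -n + \frac{x_1 + x_2 + \cdots + x_n}{\lambda} = 0 \implies \hat\lambda = \bar{x},$$
which is valid whenever $\sum_i x_i > 0$. The all-zero case below is the boundary case $\bar{x} = 0$, where this stationarity equation has no solution in $\lambda > 0$.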
If $x_1 = x_2 = \cdots = x_n = 0$, then this becomes
$$ l(\lambda; x) = e^{- n \lambda}, $$
which is strictly decreasing in $\lambda \ge 0$ and hence maximized at $\lambda = 0$.
So the MLE does exist in this case: it is $\hat\lambda = 0$.
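A quick numerical sanity check (my addition, with an arbitrary $n = 5$): evaluate the all-zero likelihood $e^{-n\lambda}$ on a grid and confirm the maximizer sits at the boundary $\lambda = 0$.

```python
# Check that l(lambda; x) = exp(-n*lambda) for all-zero Poisson data
# is maximized at lambda = 0; n = 5 is an arbitrary illustrative choice.
import numpy as np

n = 5
grid = np.linspace(0.0, 3.0, 301)   # candidate values of lambda
likelihood = np.exp(-n * grid)      # likelihood when all observations are 0
print("maximizing lambda:", grid[np.argmax(likelihood)])  # prints 0.0
```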
"IID" stands for "independent, identically distributed", but the "identically distributed" part isn't really that important here (outside of convenience), at least in the sense that one could ask a similar question with different $\lambda$'s.
However, the "independent" part is critical, because it allows you to apply the result that a sum of independent Poisson random variables has a Poisson distribution, with mean equal to the sum of the component means. (When you don't have independence, you don't generally have that a sum of Poissons is Poisson; an obvious example is the case where all the $X_i$'s are equal, since then the sum is $nX_1$, which takes only values in multiples of $n$ and so is not Poisson.)
So rather than throwing you off, it makes the problem much easier than if the variables were dependent.
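To make the sum result concrete, here is a small simulation sketch (my addition; $\lambda = 1.5$ and $n = 4$ are arbitrary illustrative choices) comparing the empirical distribution of a sum of four independent Poisson draws against the $\text{Poisson}(4\lambda)$ pmf:

```python
# Sketch: a sum of n independent Poisson(lam) variables should match
# Poisson(n*lam). lam = 1.5 and n = 4 are arbitrary illustrative choices.
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(0)
lam, n, trials = 1.5, 4, 200_000

sums = rng.poisson(lam, size=(trials, n)).sum(axis=1)
for k in range(10):
    print(k, (sums == k).mean(), poisson.pmf(k, n * lam))
```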
You can see that independence makes a difference by also considering the case where $X_1=X_2=X_3=X_4$.
[There's another neat trick that makes answering easier still. You're asked for $P(Y<2)$, but $Y=\overline{X}$, so $Y$ is not Poisson. The trick is to convert the event into an equivalent event that is easy to work with, given the abovementioned result.]
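Spelling the trick out (my elaboration; the original leaves it as a hint, and $\lambda = 1.5$ is an arbitrary value since none is fixed above): with four variables, $\{Y < 2\} = \{\sum_i X_i < 8\} = \{\sum_i X_i \le 7\}$, and $\sum_i X_i \sim \text{Poisson}(4\lambda)$ by the sum result, so a Poisson cdf finishes the job.

```python
# Sketch of the hinted trick: for Y the mean of four independent Poisson(lam)
# variables, P(Y < 2) = P(S <= 7) with S = X1+X2+X3+X4 ~ Poisson(4*lam).
# lam = 1.5 is an arbitrary illustrative value (the original fixes none).
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(0)
lam, trials = 1.5, 200_000

x = rng.poisson(lam, size=(trials, 4))
y = x.mean(axis=1)
print("simulated P(Y < 2):   ", (y < 2).mean())
print("poisson.cdf(7, 4*lam):", poisson.cdf(7, 4 * lam))

# Contrast with the dependent case X1 = X2 = X3 = X4: there Y = X1, so
# P(Y < 2) = P(X1 <= 1), a different quantity altogether.
print("dependent case P(X1 <= 1):", poisson.cdf(1, lam))
```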