So, does it make sense to measure inter-arrival times and build the distribution with those samples?
In a sense, yes. A Poisson process can be thought of as counting the number of events within a given "window" of time, so you may think of that window as starting the moment the previous event occurred.
The way I prefer to look at Poisson processes is by using the more general Markov model:
$$ p(0,t+dt) = p(0,t)\left[1-f(0,t)dt\right] $$
$$ p(r,t+dt) = p(r,t)\left[1-f(r,t)dt\right]+p(r-1,t)f(r-1,t)dt,\ r = 1,2,3,\ldots $$
where $r$ represents the number of events, $p$ represents the probability, and $t$ represents time. (This representation is pulled from *Statistical Analysis of Random Dispersion* by A. Rogers.)
In the above representation, $f(r,t)$ is the transition rate function. Taking the limit as $dt\to 0$ yields a system of differential equations, from which we can obtain the probability generating function of the process. If we set $f(r,t) = \lambda a$, where $\lambda a$ is constant, we recover a standard, completely random Poisson process. Setting $f(r,t)=c+br$ instead allows us to generate binomial or negative binomial distributions (depending on the sign of $b$) quite easily.
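As a quick sanity check on the $f(r,t)=c+br$ case (this sketch and its parameter values are my own, not from the answer): for a pure birth process with rate $c+br$ started at $0$, the mean solves $m'(t)=c+b\,m(t)$, $m(0)=0$, so $m(t)=\frac{c}{b}\left(e^{bt}-1\right)$, which a Monte Carlo simulation can verify.

```python
import math
import random

def birth_process_count(c, b, t, rng):
    """Simulate a pure birth process with transition rate f(r) = c + b*r,
    starting from r = 0; return the event count at time t."""
    r, clock = 0, 0.0
    while True:
        # Time to the next event is exponential with the current rate.
        clock += rng.expovariate(c + b * r)
        if clock > t:
            return r
        r += 1

rng = random.Random(42)
c, b, t, runs = 2.0, 0.5, 1.0, 20000
sample_mean = sum(birth_process_count(c, b, t, rng) for _ in range(runs)) / runs
# The mean of this linear birth process solves m'(t) = c + b*m(t), m(0) = 0:
expected = (c / b) * (math.exp(b * t) - 1.0)
```

With a positive $b$ the counts follow a negative binomial distribution, as the answer notes; the sample mean should land close to `expected`.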
All that is dandy, but what is relevant is that the probability $p$, for a stationary Poisson process (so $f(r,t) = f(r)$, i.e. the transition rate is not dependent on time), is only dependent on the time interval $dt$, and not on the absolute time $t$ at all. And in the case of a true Poisson distribution, the probability is only dependent on the constant intensity term $\lambda a$, and not on the number of prior events $r$ at all.
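The stationarity claim is easy to check numerically. Here is a small sketch (my own illustration, with made-up parameters) that builds a Poisson process from i.i.d. exponential inter-arrival times and compares event counts in two equal-width windows at different absolute times:

```python
import random

def poisson_event_times(lam, t_end, rng):
    """Event times of a rate-lam Poisson process on [0, t_end],
    built by summing i.i.d. exponential inter-arrival times."""
    times, clock = [], 0.0
    while True:
        clock += rng.expovariate(lam)
        if clock > t_end:
            return times
        times.append(clock)

rng = random.Random(0)
lam, width, runs = 3.0, 2.0, 4000
early = late = 0
for _ in range(runs):
    ts = poisson_event_times(lam, 10.0, rng)
    early += sum(1 for s in ts if 0.0 <= s < width)        # window [0, 2)
    late  += sum(1 for s in ts if 7.0 <= s < 7.0 + width)  # window [7, 9)
early_mean, late_mean = early / runs, late / runs
# Both sample means should be close to lam * width = 6,
# regardless of where the window sits on the time axis.
```

Both window means come out near $\lambda\cdot\text{width}$: the count depends only on the window's length, not on where it starts.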
Your reasoning and your answer for 1) are correct, but your answer for 2) is not, due to an incorrect value for $\mathbb P(Y=1)$. Perhaps a simpler way of solving 2) is to calculate the complementary probability:
$$
\mathbb P(Y=0)=\frac{5}{6}\cdot\frac{5}{6}=\frac{25}{36},\qquad \mathbb P(Y\geq 1)=1-\mathbb P(Y=0)=\frac{11}{36}.
$$
Now comparing with your method, we can see that $\mathbb P(Y=1)$ does not equal $\frac16$ as you wrote, but rather $\frac{10}{36}$. How to see it directly? Call the arrival times $T_1$ and $T_2$. Then either $T_1$ falls in the last 10 minutes and $T_2$ falls in the first 50 minutes, or vice versa. Thus
$$
\mathbb P(Y=1)=\frac{1}{6}\cdot \frac56 + \frac56\cdot\frac16=\frac{10}{36}.
$$
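The arithmetic above can be reproduced exactly with rational numbers (variable names here are my own shorthand, with $p=\frac16$ the chance of one arrival landing in the last 10 minutes):

```python
from fractions import Fraction

# Probability that a single arrival falls in the last 10 minutes of the hour.
p = Fraction(1, 6)
q = 1 - p  # falls in the first 50 minutes

# Two independent arrivals: distribution of Y, the number landing there.
P_Y0 = q * q              # (5/6)^2 = 25/36
P_Y1 = p * q + q * p      # T_1 late and T_2 early, or vice versa: 10/36
P_Y_ge_1 = 1 - P_Y0       # 11/36
```

This confirms $\mathbb P(Y=1)=\frac{10}{36}$ rather than $\frac16$, and $\mathbb P(Y\geq 1)=\frac{11}{36}$.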
Best Answer
Let $X(t)$ be the number of customers in the restaurant at time $t$, assuming $X(0)=0$. This is an $M/M/\infty$ queueing model: interarrival and service times are each i.i.d. exponential, and there is no limit to how many customers can be served at a time. The process $\{X(t):t\geqslant 0\}$ is a continuous-time Markov chain with generator $Q$ given by
$$ Q_{i,j} = \begin{cases} \lambda,& j=i+1\\ i\mu,& j=i-1\\ -(\lambda+i\mu),& j=i, \end{cases} $$
for nonnegative integers $i,j$ (and $Q_{i,j}=0$ otherwise).

The distribution of $X(t)$ is Poisson with mean
$$ m(t) = \int_0^t \lambda(s)(1-G(t-s))\ \mathsf ds, $$
where $\lambda$ is the arrival rate (here constant) and $G$ the distribution function of the service times. To see this, fix $t\geqslant 0$ and consider an arrival at time $s$ where $0\leqslant s\leqslant t$. The customer is still in the system at time $t$ if its service time exceeds $t-s$, which has probability $1-G(t-s)$. We compute
$$ m(t) = \int_0^t \lambda e^{-\mu(t-s)}\ \mathsf ds = \lambda e^{-\mu t}\int_0^t e^{\mu s}\ \mathsf ds = \frac\lambda\mu\left(1-e^{-\mu t}\right), $$
and since $X(t)$ has a Poisson distribution, its mean and variance are also $m(t)$.
In particular, when $\lambda = 300$, $\mu=\frac32$, and $t=2$, we have $$ m(t) = 300\cdot\frac23\left(1-e^{-\frac32\cdot 2} \right) = 200\left(1-e^{-3}\right)\approx 190.043. $$
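As a numerical check (my own sketch, reusing the answer's $\lambda=300$, $\mu=\frac32$, $t=2$), the closed form can be compared against a direct quadrature of the defining integral $m(t)=\int_0^t \lambda e^{-\mu(t-s)}\,\mathsf ds$:

```python
import math

def m_closed(lam, mu, t):
    """Closed form derived above: (lam/mu) * (1 - exp(-mu*t))."""
    return (lam / mu) * (1.0 - math.exp(-mu * t))

def m_numeric(lam, mu, t, n=20000):
    """Midpoint-rule approximation of m(t) = ∫_0^t lam * e^{-mu (t - s)} ds."""
    h = t / n
    return h * sum(lam * math.exp(-mu * (t - (i + 0.5) * h)) for i in range(n))

lam, mu, t = 300.0, 1.5, 2.0
# Both should be close to 200 * (1 - e^{-3}) ≈ 190.043.
```

The two agree to high precision, matching the value quoted above.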