I would approach the problem as follows. Assume that $t \ge 0$. First find the probability that the service time for A.J.'s car is $\le t$. Of course if this happens, then A.J. will beat M.J. If the first car is not serviced by time $t$, then by the memorylessness of the exponential, the probability that A.J. finishes before M.J. is $1/2$. Now you can put the pieces together.
Or, more simply, do the same calculation from M.J.'s point of view. For M.J. to "win," we first need A.J.'s service time to be $\ge t$. If that happens, then by memorylessness the probability that M.J. wins is $1/2$.
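As a quick sanity check, here is a small Monte Carlo sketch of this argument (the rate `lam = 1` and head start `t = 0.7` are arbitrary illustrative choices, not values from the problem):

```python
import math
import random

def p_aj_wins(lam, t, n=200_000, seed=0):
    """Estimate P(A.J. finishes first) when A.J.'s service starts at
    time 0 and M.J.'s starts at time t, both service times Exp(lam)."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(n):
        a = rng.expovariate(lam)       # A.J.'s service time
        b = t + rng.expovariate(lam)   # M.J. finishes at t + Exp(lam)
        wins += a < b
    return wins / n

lam, t = 1.0, 0.7
# Piecing together the argument above:
# P(win) = P(S_A <= t) + P(S_A > t) * 1/2 = 1 - exp(-lam*t)/2
exact = 1 - math.exp(-lam * t) / 2
est = p_aj_wins(lam, t)
```

The simulated frequency should sit within Monte Carlo noise of the closed form.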
The fact that this involves convolutions and sums of i.i.d. random variables makes me think of trying to deduce the distribution from moment generating functions. Using the independence of the $\xi_i$, we have (for $t<\lambda$),
$$
\begin{eqnarray*}
\mathbb{E}\left[e^{t\sum\limits_{n=1}^{\nu}\xi_{n}}\right]&{}={}&\sum\limits_{r=1}^{\infty}\int{\textbf{1}}_{\left\{z_{1}\leq 1,\,\ldots\,,\,z_{r-1}\leq 1\,,\,z_{r}>1 \right\}}e^{t\left(\sum\limits_{n=1}^{r}z_{n}\right)}f(z_1)\cdots f(z_r)\,dz_1\cdots dz_r\newline
&{}={}&\sum\limits_{r=1}^{\infty}\left(\dfrac{\lambda}{\lambda{}-{}t}\right)^r e^{t-\lambda}\left(1{}-{}e^{t-\lambda}\right)^{r-1}\newline
&{}={}&e^{t-\lambda}\dfrac{\lambda}{\lambda{}-{}t}\sum\limits_{r=1}^{\infty}\left(\dfrac{\lambda}{\lambda{}-{}t}\right)^{r-1} \left(1{}-{}e^{t-\lambda}\right)^{r-1}\newline
&{}={}&\left(\dfrac{e^{t-\lambda}\dfrac{\lambda}{\lambda{}-{}t}}{1{}-{}\left(\dfrac{\lambda}{\lambda{}-{}t}\right) \left(1{}-{}e^{t-\lambda}\right)}\right)\,\newline
&{}={}&\dfrac{1}{1{}-{}e^{\lambda{}-{}t}t/\lambda}\,.
\end{eqnarray*}
$$
This looks like $\sum\limits_{n=1}^{\nu}\xi_{n}$ is trying to be exponentially distributed with "rate" $\lambda/e^{\lambda{}-{}t}$, but I do not know this functional form by heart. Any ideas?
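As a numerical check of the final formula, here is a quick Monte Carlo sketch ($\lambda = 1$ and $t = 0.2$ are arbitrary points inside the region of convergence of the geometric series above):

```python
import math
import random

def sample_sum(lam, rng):
    """Draw xi_i ~ Exp(lam) until the first draw exceeding 1, and
    return the sum of all draws, including that last one."""
    s = 0.0
    while True:
        x = rng.expovariate(lam)
        s += x
        if x > 1:
            return s

def mgf_closed_form(lam, t):
    # 1 / (1 - e^{lam - t} t / lam), valid where the geometric series converges
    return 1.0 / (1.0 - math.exp(lam - t) * t / lam)

rng = random.Random(1)
lam, t = 1.0, 0.2
n = 200_000
emp = sum(math.exp(t * sample_sum(lam, rng)) for _ in range(n)) / n
```

The empirical mean of $e^{tS}$ should agree with the closed form up to Monte Carlo error.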
Edit:
Not knowing the explicit inverse of the final generating function form above, I thought of examining each term in the equivalent series representation: perhaps the individual terms have nicer inverses. If this is the case, then a series representation might be sufficient. If we perform the substitution $u{}={}t/\lambda$, so that $u<1$, note that the moment generating function may be re-written as
$$
\sum\limits_{r=1}^{\infty}\left(\dfrac{1}{1-u}\right)^r\left(1-e^{\lambda\left(u-1\right)}\right)^{r-1}e^{\lambda\left(u-1\right)}{}={}\sum\limits_{r=1}^{\infty}\sum\limits_{k=0}^{r-1}{r-1\choose k}\left(\dfrac{1}{1-u}\right)^r(-1)^ke^{(k+1)\lambda(u-1)}\,.
$$
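Both sides of this identity can be compared numerically against the closed form $1/(1-u\,e^{\lambda(1-u)})$ obtained earlier (a sketch; the test point $\lambda=1$, $u=0.2$ and the truncation level are arbitrary choices):

```python
import math

LAM, U, RMAX = 1.0, 0.2, 40   # arbitrary test point and truncation level

def lhs(lam=LAM, u=U, rmax=RMAX):
    """Left-hand series, truncated at r = rmax."""
    a = math.exp(lam * (u - 1.0))
    return sum((1.0 / (1.0 - u)) ** r * (1.0 - a) ** (r - 1) * a
               for r in range(1, rmax + 1))

def rhs(lam=LAM, u=U, rmax=RMAX):
    """Right-hand double sum after the binomial expansion."""
    total = 0.0
    for r in range(1, rmax + 1):
        for k in range(r):
            total += (math.comb(r - 1, k) * (1.0 / (1.0 - u)) ** r
                      * (-1.0) ** k * math.exp((k + 1) * lam * (u - 1.0)))
    return total

def closed(lam=LAM, u=U):
    # 1 / (1 - e^{lam - t} t / lam) rewritten in u = t / lam
    return 1.0 / (1.0 - u * math.exp(lam * (1.0 - u)))
```

All three agree up to the truncation error of the geometric tail.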
A series representation may be obtained, therefore, if we can invert the "atomic" moment generating functions
$$
\left(\dfrac{1}{1-u}\right)^r e^{(k+1)\lambda(u-1)}\,.
$$
Heuristically, we wish to solve an integral of the kind
$$
\int\limits_{-\infty}^{1}\left(\dfrac{1}{1-u}\right)^r e^{(k+1)\lambda(u-1)-xu}\,\,\mbox{d}u\,.
$$
For $x<\lambda(k+1)$, the integral's solution has the form
$$
(-\lambda)^{r}e^{-x}\left(\dfrac{x^{r-1}}{(r-1)!}\log(\lambda(k+1)-x){}+{}f_{r-1}(k)\right)
$$
where $f_{r-1}(k)$ is a rational function involving terms of, at most, degree "$r-1$" in $k$.
(Note: the explicit solution can be obtained by integrating the expression $\dfrac{(-\lambda)^re^{-x}}{\lambda(k+1)-x}$ $r$ times w.r.t. $k$. A justification follows by differentiating the integral we wish to solve. Note, also, that our substitution in $u$ above was merely to make the presentation look nicer and puts this solution "off" by a factor of $\lambda$: the actual solution follows from analogous operations using the $t$ variable instead.)
Best Answer
Note: My analysis assumes $\lambda=1$, but can be readily extended to the $\lambda \neq 1$ case.
We can write $\mathbb{P}(Y \leq y, N=k) = \mathbb{P}(Y \leq y | N=k)\mathbb{P}(N=k)$.
Let's use the fact that the PDF of the sum of two independent random variables is given by the convolution of the respective PDFs. For $k=1$, $Y|N{=}1 = X_1 \sim \text{Exp}(\lambda)$. For $k=2$, $Y|N{=}2=X_1+X_2$ and $f_{Y|N=2}(\cdot) = f_X(\cdot) \star f_X(\cdot)$. The convolution can be computed (by definition) as:
$f_{Y|N=2}(y) = \int_{0}^{y} e^{-x}e^{-(y-x)}dx = ye^{-y} \; \forall \; y \geq 0$.
Extending to $k=3$, the PDF of $Y|N=3 = X_1+X_2+X_3$ can be computed as:
$f_{Y|N=3}(y) = \int_0^y x e^{-x} e^{-(y-x)} dx = \frac{1}{2}y^2 e^{-y} \; \forall \; y \geq 0$.
This can be generalised (and proved via mathematical induction) to: $f_{Y|N=k}(y) = \frac{1}{(k-1)!}y^{k-1}e^{-y} \; \forall \; y \geq 0$.
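The convolution steps can be reproduced numerically; the sketch below checks the $k=3$ case by evaluating the convolution integral from the text with a midpoint rule ($\lambda = 1$ as assumed above; the evaluation point $y = 2$ is arbitrary):

```python
import math

def erlang_pdf(k, y):
    """f_{Y|N=k}(y) = y^{k-1} e^{-y} / (k-1)!   (lambda = 1)."""
    return y ** (k - 1) * math.exp(-y) / math.factorial(k - 1)

def conv_k3(y, n=20_000):
    """Midpoint rule for the integral of x e^{-x} e^{-(y-x)} over [0, y],
    i.e. the convolution of f_{Y|N=2} with the Exp(1) density."""
    h = y / n
    return sum((i + 0.5) * h * math.exp(-(i + 0.5) * h)
               * math.exp(-(y - (i + 0.5) * h)) for i in range(n)) * h

y = 2.0
```

The numerically convolved density matches $\frac{1}{2}y^2 e^{-y}$ at the test point.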
Now that we have the PDF, we can compute the CDF as:
$\mathbb{P}(Y \leq y | N=k) = \int_0^y f_{Y|N=k}(x)dx = \int_0^y \frac{1}{(k-1)!}x^{k-1}e^{-x} dx = \frac{1}{(k-1)!}M_k(y)$.
Let's use integration by parts to evaluate the above expression. We have the recursion: $M_k(y) = \int_0^y x^{k-1}e^{-x}dx = -x^{k-1}e^{-x} |_0^y + \int_0^y (k-1)x^{k-2}e^{-x} dx = -y^{k-1}e^{-y} + (k-1)M_{k-1}(y)$. With a bit of algebra one can show that $M_k(y) = (k-1)! - e^{-y} \sum_{j=0}^{k-1} \frac{(k-1)!}{j!} y^j $. Substituting in our previous expression, we get:
$\mathbb{P}(Y \leq y | N=k) = 1 - e^{-y}\sum_{j=0}^{k-1} \frac{y^j}{j!} $.
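This closed form for the CDF can likewise be checked against direct numerical integration of the density (a sketch with $\lambda = 1$; the test points $y = 3$ and $k \in \{1,2,5\}$ are arbitrary):

```python
import math

def erlang_cdf(k, y):
    """P(Y <= y | N = k) = 1 - e^{-y} sum_{j=0}^{k-1} y^j / j!   (lambda = 1)."""
    return 1.0 - math.exp(-y) * sum(y ** j / math.factorial(j) for j in range(k))

def erlang_cdf_numeric(k, y, n=100_000):
    """Midpoint rule for the integral of x^{k-1} e^{-x} / (k-1)! over [0, y]."""
    h = y / n
    return sum(((i + 0.5) * h) ** (k - 1) * math.exp(-(i + 0.5) * h)
               for i in range(n)) * h / math.factorial(k - 1)

closed_vals = [erlang_cdf(k, 3.0) for k in (1, 2, 5)]
numeric_vals = [erlang_cdf_numeric(k, 3.0) for k in (1, 2, 5)]
```

Note that $k = 1$ recovers the plain exponential CDF $1 - e^{-y}$.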
Finally, the desired joint distribution is given by:
$\mathbb{P}(Y \leq y, N=k) = \left[1 - e^{-y}\sum_{j=0}^{k-1} \frac{y^j}{j!} \right](1-p)^kp$.
You can compute $\mathbb{P}(Y \leq y) = \sum_{k=1}^\infty \mathbb{P}(Y \leq y, N=k)$ by plugging in the expression computed above. I am not sure whether there is a closed-form analytical answer.