For a), use the law of total probability:
$$
{\rm P}(X_1 < X_2 ) = \int_0^\infty {{\rm P}(X_1 < X_2 |X_2 = t)f_{X_2 } (t)\,{\rm d}t} ,
$$
where $f_{X_2}$ is the PDF of $X_2$.
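Carrying out that integral gives ${\rm P}(X_1 < X_2) = \lambda_1/(\lambda_1+\lambda_2)$. As a quick sanity check, here is a small Monte Carlo sketch (the rates `lam1`, `lam2` are hypothetical values chosen only for illustration):

```python
import math
import random

lam1, lam2 = 1.5, 0.7  # hypothetical rates, not from the problem statement
random.seed(0)

n = 200_000
# Empirical frequency of the event {X1 < X2}
hits = sum(random.expovariate(lam1) < random.expovariate(lam2) for _ in range(n))

# Evaluating the integral above yields lam1 / (lam1 + lam2)
closed_form = lam1 / (lam1 + lam2)
print(hits / n, closed_form)  # the two values should be close
```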
For b), notice that $\min \{ X_1 ,X_2 \} > t$ if and only if $X_1 > t$ and $X_2 > t$ (and use the fact that $X_1$ and $X_2$ are independent).
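By that observation, ${\rm P}(\min\{X_1,X_2\} > t) = e^{-\lambda_1 t}e^{-\lambda_2 t}$, i.e. the minimum is exponential$(\lambda_1+\lambda_2)$. A quick numerical check (again with hypothetical rates):

```python
import math
import random

lam1, lam2 = 1.5, 0.7  # hypothetical rates chosen for illustration
random.seed(1)

n, t = 200_000, 0.8
# Empirical frequency of {min(X1, X2) > t}
survive = sum(min(random.expovariate(lam1), random.expovariate(lam2)) > t
              for _ in range(n))

# By the hint, this should match exp(-(lam1 + lam2) * t)
closed_form = math.exp(-(lam1 + lam2) * t)
print(survive / n, closed_form)
```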
For c), calculate ${\rm P}(\min \{X_1,X_2 \}>t , X_1 > X_2 )$ using the law of total probability, conditioning on $X_2$. You should easily find that
$$
{\rm P}(\min \{X_1,X_2 \}>t , X_1 > X_2 ) = {\rm P}(\min \{X_1,X_2 \}>t ){\rm P}(X_1 > X_2 ) = \frac{{\lambda _2 }}{{\lambda _1 + \lambda _2 }}e^{ - (\lambda _1 + \lambda _2 )t}.
$$
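The striking point of c) is that the two events are independent. A Monte Carlo sketch of that factorization (rates are hypothetical):

```python
import math
import random

lam1, lam2 = 1.5, 0.7  # hypothetical rates chosen for illustration
random.seed(2)

n, t = 300_000, 0.5
both = mins = firsts = 0
for _ in range(n):
    x1, x2 = random.expovariate(lam1), random.expovariate(lam2)
    mins += x1 > t and x2 > t             # {min(X1, X2) > t}
    firsts += x1 > x2                     # {X1 > X2}
    both += x1 > t and x2 > t and x1 > x2  # joint event

# The joint probability should factor into the product of the marginals,
# and both should match the closed form above.
closed = lam2 / (lam1 + lam2) * math.exp(-(lam1 + lam2) * t)
print(both / n, (mins / n) * (firsts / n), closed)
```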
For d), note that
$$
{\rm P}(|X_1 - X_2 | > t|N = 1) = \frac{{{\rm P}(X_2 - X_1 > t,X_1 < X_2 )}}{{{\rm P}(X_1 < X_2 )}} = \frac{{{\rm P}(X_2 > X_1 + t)}}{{{\rm P}(X_1 < X_2 )}},
$$
and you should easily show using the law of total probability, conditioning on $X_1$, that
$$
{\rm P}(X_2 > X_1 + t) = \frac{{\lambda _1 }}{{\lambda _1 + \lambda _2 }}e^{ - \lambda _2 t}.
$$
Note: The calculation for ${\rm P}(|X_1 - X_2 | > t|N = 2)$ is completely analogous.
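A quick numerical check of the formula for ${\rm P}(X_2 > X_1 + t)$ (hypothetical rates as before):

```python
import math
import random

lam1, lam2 = 1.5, 0.7  # hypothetical rates chosen for illustration
random.seed(3)

n, t = 200_000, 0.6
hits = 0
for _ in range(n):
    x1, x2 = random.expovariate(lam1), random.expovariate(lam2)
    hits += x2 > x1 + t  # event {X2 > X1 + t}

# Formula above: P(X2 > X1 + t) = lam1/(lam1+lam2) * exp(-lam2*t)
closed_form = lam1 / (lam1 + lam2) * math.exp(-lam2 * t)
print(hits / n, closed_form)
```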
NOTE: Since question e) is not so easy, I give more than hints. Still, try to solve a significant part of it yourself.
For e), it is straightforward to show, using that $N$ and $U$ are independent, that
$$
{\rm P}(W > t | U=u) = {\rm P}(W > t | N=1, U=u){\rm P}(N=1) + {\rm P}(W > t | N=2, U=u){\rm P}(N=2).
$$
For this purpose, you may replace $U=u$ by $U \in [u,u+{\rm d}u]$, where ${\rm d}u \to 0$, in order to condition on events with positive probability. Now, given $U=u$ and $N=1$, we have that $X_1 = u$ and that $X_2 - X_1$, by the memorylessness property of the exponential distribution, is exponential$(\lambda_2)$. Analogously, given $U=u$ and $N=2$, we have that $X_2 = u$ and that $X_1 - X_2$ is exponential$(\lambda_1)$. From this you should find that
$$
{\rm P}(W > t | U=u) = \frac{{\lambda _1 }}{{\lambda _1 + \lambda _2 }}e^{ - \lambda _2 t} + \frac{{\lambda _2 }}{{\lambda _1 + \lambda _2 }}e^{ - \lambda _1 t} .
$$
Now we are done by
$$
{\rm P}(W > t ) = {\rm P}(W > t ,N = 1) + {\rm P}(W > t ,N = 2),
$$
as it gives us, by virtue of a) and d),
$$
{\rm P}(W > t) = \frac{{\lambda _1 }}{{\lambda _1 + \lambda _2 }}e^{ - \lambda _2 t} + \frac{{\lambda _2 }}{{\lambda _1 + \lambda _2 }}e^{ - \lambda _1 t} .
$$
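As a final sanity check, a Monte Carlo sketch of the mixture formula for ${\rm P}(W > t)$, assuming (as parts d) and e) suggest) that $W = |X_1 - X_2|$; the rates are hypothetical:

```python
import math
import random

lam1, lam2 = 1.5, 0.7  # hypothetical rates chosen for illustration
random.seed(4)

n, t = 200_000, 0.6
# W = |X1 - X2|, as suggested by parts d) and e)
hits = sum(abs(random.expovariate(lam1) - random.expovariate(lam2)) > t
           for _ in range(n))

# Mixture of two exponential tails, as derived above
mix = (lam1 / (lam1 + lam2) * math.exp(-lam2 * t)
       + lam2 / (lam1 + lam2) * math.exp(-lam1 * t))
print(hits / n, mix)
```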
Best Answer
I think it is easiest this way:
First of all observe that
$$\mathbb{E}[X^k]=\int_0^{\infty} x^k\frac{1}{\theta}e^{-x/\theta}dx=\theta^k\underbrace{\int_0^{\infty}\left(\frac{ x}{\theta}\right)^ke^{-x/\theta}d\left(\frac{x}{\theta}\right)}_{=\Gamma(k+1)=k!}=k!\theta^k$$
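The moment formula $\mathbb{E}[X^k]=k!\theta^k$ can be checked numerically; here is a small Monte Carlo sketch (the mean `theta` is a hypothetical value, and the rate passed to the sampler is its reciprocal):

```python
import math
import random

theta = 1.3  # hypothetical mean parameter, so the rate is 1/theta
random.seed(5)

n = 500_000
samples = [random.expovariate(1 / theta) for _ in range(n)]
# Empirical k-th moments for k = 1, 2, 3
moments = {k: sum(x ** k for x in samples) / n for k in (1, 2, 3)}
for k, mc in moments.items():
    print(k, mc, math.factorial(k) * theta ** k)  # Monte Carlo vs k! * theta^k
```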
Second, expand $(X-Y)^4$ and evaluate term by term using independence. This gives
$$\mathbb{E}[(X-Y)^4]=\theta^4[4!-4\cdot3!+6\cdot2!\cdot2!-4\cdot3!+4!]=24\theta^4,$$
which is exactly the result you have to prove (after your editing).
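The value $24\theta^4$ can also be checked by simulation; a minimal sketch, with `theta` a hypothetical common mean for the two independent exponentials:

```python
import random

theta = 1.3  # hypothetical common mean of X and Y
random.seed(6)

n = 1_000_000
acc = 0.0
for _ in range(n):
    x = random.expovariate(1 / theta)
    y = random.expovariate(1 / theta)
    acc += (x - y) ** 4  # accumulate samples of (X - Y)^4

estimate = acc / n
print(estimate, 24 * theta ** 4)  # Monte Carlo vs the derived 24 * theta^4
```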