For a), use the law of total probability:
$$
{\rm P}(X_1 < X_2 ) = \int_0^\infty {{\rm P}(X_1 < X_2 |X_2 = t)f_{X_2 } (t)\,{\rm d}t} ,
$$
where $f_{X_2}$ is the PDF of $X_2$.
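As an optional sanity check (not part of the exercise), here is a minimal Monte Carlo sketch in Python; the rates $\lambda_1 = 1$, $\lambda_2 = 2$ are assumed purely for illustration. It compares the empirical frequency of $\{X_1 < X_2\}$ with the closed form $\lambda_1/(\lambda_1+\lambda_2)$ that the integral above produces.

```python
import numpy as np

rng = np.random.default_rng(0)
lam1, lam2 = 1.0, 2.0          # assumed rates, for illustration only
n = 10**6

# numpy's exponential sampler takes the scale (mean) = 1/rate
x1 = rng.exponential(1 / lam1, n)
x2 = rng.exponential(1 / lam2, n)

print("empirical P(X1 < X2):        ", np.mean(x1 < x2))
print("closed form lam1/(lam1+lam2):", lam1 / (lam1 + lam2))
```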
For b), notice that $\min \{ X_1 ,X_2 \} > t$ if and only if $X_1 > t$ and $X_2 > t$ (and use the fact that $X_1$ and $X_2$ are independent).
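Spelling this step out (the resulting formula is used again in c)),
$$
{\rm P}(\min \{ X_1 ,X_2 \} > t) = {\rm P}(X_1 > t)\,{\rm P}(X_2 > t) = e^{-\lambda_1 t}e^{-\lambda_2 t} = e^{-(\lambda_1 + \lambda_2)t},
$$
so $\min \{ X_1 ,X_2 \}$ is exponential$(\lambda_1 + \lambda_2)$.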
For c), calculate ${\rm P}(\min \{X_1,X_2 \}>t , X_1 > X_2 )$ using the law of total probability, conditioning on $X_2$. You should easily find that
$$
{\rm P}(\min \{X_1,X_2 \}>t , X_1 > X_2 ) = {\rm P}(\min \{X_1,X_2 \}>t ){\rm P}(X_1 > X_2 ) = \frac{{\lambda _2 }}{{\lambda _1 + \lambda _2 }}e^{ - (\lambda _1 + \lambda _2 )t}.
$$
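If you want to check the factorization numerically, a short simulation sketch (with the same illustrative rates as above and an arbitrary choice $t = 0.5$) can compare the joint probability, the product, and the closed form:

```python
import numpy as np

rng = np.random.default_rng(1)
lam1, lam2, t = 1.0, 2.0, 0.5   # assumed values, for illustration only
n = 10**6

x1 = rng.exponential(1 / lam1, n)
x2 = rng.exponential(1 / lam2, n)
m = np.minimum(x1, x2)

joint   = np.mean((m > t) & (x1 > x2))
product = np.mean(m > t) * np.mean(x1 > x2)
closed  = lam2 / (lam1 + lam2) * np.exp(-(lam1 + lam2) * t)

print(joint, product, closed)   # all three agree up to Monte Carlo error
```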
For d), note that
$$
{\rm P}(|X_1 - X_2 | > t|N = 1) = \frac{{{\rm P}(X_2 - X_1 > t,X_1 < X_2 )}}{{{\rm P}(X_1 < X_2 )}} = \frac{{{\rm P}(X_2 > X_1 + t)}}{{{\rm P}(X_1 < X_2 )}},
$$
and you should easily show using the law of total probability, conditioning on $X_1$, that
$$
{\rm P}(X_2 > X_1 + t) = \frac{\lambda _1 }{\lambda _1 + \lambda _2 }e^{ - \lambda _2 t}.
$$
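In case you want to verify your computation: conditioning on $X_1$ and using ${\rm P}(X_2 > s + t) = e^{-\lambda_2 (s+t)}$ gives
$$
{\rm P}(X_2 > X_1 + t) = \int_0^\infty {\rm P}(X_2 > s + t)\,\lambda_1 e^{-\lambda_1 s}\,{\rm d}s
= e^{-\lambda_2 t}\int_0^\infty \lambda_1 e^{-(\lambda_1 + \lambda_2)s}\,{\rm d}s
= \frac{\lambda_1}{\lambda_1 + \lambda_2}\,e^{-\lambda_2 t}.
$$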
Note: The calculation for ${\rm P}(|X_1 - X_2 | > t|N = 2)$ is completely analogous.
Note: Since question e) is not so easy, I give more than hints. However, try solving a significant part of it by yourself.
For e), it is straightforward to show, using the fact that $N$ and $U$ are independent (which is essentially what c) establishes), that
$$
{\rm P}(W > t | U=u) = {\rm P}(W > t | N=1, U=u){\rm P}(N=1) + {\rm P}(W > t | N=2, U=u){\rm P}(N=2).
$$
For this purpose, you may replace $U=u$ by $U \in [u,u+{\rm d}u]$, with ${\rm d}u \to 0$, in order to condition on events with positive probability. Now, given $U=u$ and $N=1$, we have that $X_1 = u$ and that $X_2 - X_1$ is exponential$(\lambda_2)$, by the memoryless property of the exponential distribution. Analogously, given $U=u$ and $N=2$, we have that $X_2 = u$ and that $X_1 - X_2$ is exponential$(\lambda_1)$. From this you should find that
$$
{\rm P}(W > t | U=u) = \frac{{\lambda _1 }}{{\lambda _1 + \lambda _2 }}e^{ - \lambda _2 t} + \frac{{\lambda _2 }}{{\lambda _1 + \lambda _2 }}e^{ - \lambda _1 t} .
$$
Now we are done, since
$$
{\rm P}(W > t ) = {\rm P}(W > t ,N = 1) + {\rm P}(W > t ,N = 2),
$$
together with a) and d), gives
$$
{\rm P}(W > t) = \frac{{\lambda _1 }}{{\lambda _1 + \lambda _2 }}e^{ - \lambda _2 t} + \frac{{\lambda _2 }}{{\lambda _1 + \lambda _2 }}e^{ - \lambda _1 t} .
$$
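As an optional numerical check of this final formula, a small simulation sketch (again with assumed illustrative rates $\lambda_1 = 1$, $\lambda_2 = 2$) compares the empirical survival function of $W = |X_1 - X_2|$ with the mixture of exponentials above at a few values of $t$:

```python
import numpy as np

rng = np.random.default_rng(2)
lam1, lam2 = 1.0, 2.0           # assumed rates, for illustration only
n = 10**6

x1 = rng.exponential(1 / lam1, n)
x2 = rng.exponential(1 / lam2, n)
w = np.abs(x1 - x2)

for t in (0.2, 0.5, 1.0):
    empirical = np.mean(w > t)
    closed = (lam1 * np.exp(-lam2 * t) + lam2 * np.exp(-lam1 * t)) / (lam1 + lam2)
    print(t, empirical, closed)
```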
Define $c_n$ as the infimum value of $P[X_1 + \dots + X_n < n+1]$ over all distributions for i.i.d. nonnegative random variables $\{X_i\}$ with $E[X_i]=1$. Here I prove the simple upper and lower bounds:
$$ \frac{1}{n+1} \leq c_n \leq \left(1-\frac{1}{n+1}\right)^n \quad \forall n \in \{1, 2, 3, \dots\}. $$
Notice that the upper and lower bounds meet when $n=1$, so $c_1=1/2$.
Achievability (upper bound):
Consider the nonnegative random variables
$$ X_i = \begin{cases}
n+1 & \mbox{with probability } \frac{1}{n+1}, \\
0 & \mbox{with probability } 1-\frac{1}{n+1}.
\end{cases} $$
These have $E[X_i]=1$ and:
$$P[X_1 + \dots + X_n < n+1] = P[\mbox{all $X_i$ are zero}] = \left(1-\frac{1}{n+1}\right)^n$$
Hence, $c_n \leq (1-\frac{1}{n+1})^n$.
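A quick numerical illustration of this construction (with, say, $n = 5$, chosen only for illustration; any $n \ge 1$ behaves the same way):

```python
import numpy as np

rng = np.random.default_rng(3)
n, trials = 5, 10**6

# two-point distribution: n+1 with probability 1/(n+1), and 0 otherwise
x = (n + 1) * (rng.random((trials, n)) < 1 / (n + 1))
s = x.sum(axis=1)

print("empirical P[S < n+1]:", np.mean(s < n + 1))
print("(1 - 1/(n+1))^n     :", (1 - 1 / (n + 1)) ** n)
```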
Lower bound:
Let $\{X_i\}$ be any (possibly dependent and non-identically distributed) nonnegative random variables with $E[X_i]=1$. By the Markov inequality:
$$ P[X_1 + \dots + X_n \geq n+1] \leq \frac{E[X_1+\dots+X_n]}{n+1} = \frac{n}{n+1},$$
and hence $P[X_1 + \dots + X_n < n+1] \geq \frac{1}{n+1}$. Therefore, $c_n \geq \frac{1}{n+1}$.
Best Answer
Assuming the $X_j$ are independent,
$$
\begin{aligned}
P\left(\min(X_1,X_2,\dots,X_n)=X_i\right) &= P\left(X_i \le X_j \mbox{ for } j\ne i\right)\\
&= \int_0^\infty P\left(t\le X_j \mbox{ for } j\ne i \mid X_i=t\right)\lambda_i e^{-\lambda_i t}\,dt\\
&= \int_0^\infty \prod_{j\ne i} P\left(t\le X_j\right)\lambda_i e^{-\lambda_i t}\,dt\\
&= \int_0^\infty \lambda_i e^{-\sum_{j=1}^n \lambda_j t}\,dt\\
&= \frac{\lambda_i}{\sum_{j=1}^n \lambda_j}\,.
\end{aligned}
$$
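A brief simulation sketch of this identity, with rates $(\lambda_1,\lambda_2,\lambda_3) = (1, 2, 3)$ assumed only for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)
lams = np.array([1.0, 2.0, 3.0])   # assumed rates, for illustration only
trials = 10**6

# samples[k, j] ~ exponential with rate lams[j]; the scale parameter is 1/rate
samples = rng.exponential(1 / lams, size=(trials, len(lams)))
winner = samples.argmin(axis=1)    # index of the minimum in each trial

for i, lam in enumerate(lams):
    print(i, np.mean(winner == i), lam / lams.sum())
```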