Maximum/minimum of exponential random variables

exponential-distribution, probability, solution-verification

Let $T_i$ be the time (in hours) at which you first see a creature of type $i,$ for $1 ≤ i ≤ n.$
Suppose that ($T_i$, $1 ≤ i ≤ n$) are independent, and that $T_i$ has exponential distribution with parameter $\lambda_i$.

  1. What is the distribution of the time at which you see your first creature, i.e. the minimum of the $T_i$?
  2. What is the expected number of types of creature that you have not met by time $1$?
  3. Now let all $\lambda_i$ be the same, $\lambda_i = \lambda$ for all $i.$ Let $M$ be the maximum of the $T_i$. Find
    the median of the distribution of $M.$

My solution so far:

  1. Let $X$ be the minimum of the $T_i$. Then the distribution of $X$ is $f_X(t) = (\lambda_1+\cdots+\lambda_n)e^{-(\lambda_1 + \cdots + \lambda_n)t}$.
  2. Let $N$ be the number of types of creature not seen by time $1.$ Then $E(N) = e^{-\lambda_1}+\cdots+e^{-\lambda_n}$.
  3. Let $M$ be the maximum of the $T_i$. Then $P(M\leq m) = (1-e^{-\lambda m})^n$. To find the median, solve $P(M\leq m) = (1-e^{-\lambda m})^n = \frac{1}{2}$ for $m$. I get that the median is $m = -\frac{1}{\lambda}\log\left(1-\frac{1}{2^{1/n}}\right)$.

Would be grateful for thoughts and comments on these answers.
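
In case it helps, here is a quick simulation sanity check of all three formulas (the rates, seed, and number of trials below are arbitrary choices for illustration, not part of the problem):

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary example rates -- not part of the original problem.
lam = np.array([0.5, 1.0, 2.0])
n = len(lam)
trials = 200_000

# T[k, i] = time of the first sighting of creature type i in trial k.
T = rng.exponential(scale=1.0 / lam, size=(trials, n))

# Part 1: the minimum should be exponential with rate sum(lam),
# so its sample mean should be close to 1 / sum(lam).
print(T.min(axis=1).mean(), 1.0 / lam.sum())

# Part 2: expected number of creature types not seen by time 1.
print((T > 1.0).sum(axis=1).mean(), np.exp(-lam).sum())

# Part 3: equal rates; compare the sample median of the maximum
# with the claimed closed form.
lam0 = 1.0
M = rng.exponential(scale=1.0 / lam0, size=(trials, n)).max(axis=1)
print(np.median(M), -np.log(1.0 - 0.5 ** (1.0 / n)) / lam0)
```

With this many trials, each printed pair should agree to roughly two decimal places.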

Best Answer

They are correct, but incomplete: you should really show your working so that people checking it have an easier time. Still, the missing steps are not hard to reconstruct.

  1. $\displaystyle f_X(t)=\ldots=\dfrac{\mathrm d}{\mathrm d t}\Big(1-\prod_{i=1}^n\mathsf P(T_i>t)\Big)=\ldots = \Big(\sum_{i=1}^n\lambda_i\Big)\mathrm e^{-t\sum_{i=1}^n\lambda_i}$

    You might also have argued that the combined arrivals of all creature types form a Poisson process with rate $\sum_{i=1}^n\lambda_i$, whose first arrival time is exponential with that rate.

    Note that this is the probability density function rather than the cumulative distribution function; however, the latter is just the antiderivative, so either answers the question. (The elided steps are written out below the list.)

  2. $\mathsf E(N)=\ldots=\sum_{i=1}^n\mathsf P(T_i>1)=\sum_{i=1}^n\mathrm e^{-\lambda_i}$ by linearity of expectation, writing $N$ as a sum of Bernoulli indicator random variables $\mathbf 1\{T_i>1\}$.

  3. This part was much better: your explanation was easy to follow and your working was correct. However, you should make the base of the logarithm explicit, writing $\log_\mathrm e$ or $\ln$.
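
For completeness, one way to fill in the steps elided in part 1 (a sketch, using only the independence of the $T_i$ and the exponential survival function $\mathsf P(T_i>t)=\mathrm e^{-\lambda_i t}$):

$$\mathsf P(X>t)=\mathsf P(T_1>t,\dots,T_n>t)=\prod_{i=1}^n\mathsf P(T_i>t)=\prod_{i=1}^n\mathrm e^{-\lambda_i t}=\mathrm e^{-t\sum_{i=1}^n\lambda_i},$$

so

$$f_X(t)=\frac{\mathrm d}{\mathrm d t}\bigl(1-\mathsf P(X>t)\bigr)=\Big(\sum_{i=1}^n\lambda_i\Big)\mathrm e^{-t\sum_{i=1}^n\lambda_i},\qquad t\ge 0,$$

which is exactly the exponential density with rate $\sum_{i=1}^n\lambda_i$.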