Exam prep: maximum likelihood estimator

maximum-likelihood, probability-distributions, solution-verification, statistics

Suppose that $(Y_1,Z_1),\dots,(Y_n,Z_n)$ are $n$ independent, identically distributed random vectors with $Y_i\perp\!\!\!\perp Z_i$, $Y_i \sim \operatorname{Exp}(\lambda)$ and $Z_i \sim \operatorname{Exp}(\mu)$ for $i=1,\dots,n$.

  • Find the MLE for $(\lambda,\mu)$.
  • Consider $X_i=\min(Y_i,Z_i)$. Now, we observe $X_i$ with an indicator $\Delta_i = 1_{(X_i=Y_i)}$.

    1. Determine the joint PDF of the observed data $\{(X_i,\Delta_i)\}$.

    2. Find the MLE for $(\lambda,\mu)$ based on these observed data.

My attempt:

  • This is not too complicated: $\hat{\lambda}=1/\bar{Y}_n$ and $\hat{\mu}=1/\bar{Z}_n$.

    1. I'm a little stuck here. I already found $X_i \sim \operatorname{Exp}(\lambda+\mu)$ and $\Delta_i \sim \operatorname{Ber}(p)$, where $p=P(\Delta_i=1)=P(X_i=Y_i)=P(Z_i>Y_i)=\cdots=\frac{\lambda}{\lambda+\mu}$. Also, $$ f_{X_i,\Delta_i}(x,\delta)= \sum_{\delta=0}^1 f_{\Delta_i}(\delta)f_{X_i\mid\Delta_i}(x|\delta).$$
      Now $f_{X_i|\Delta_i}(x|0)$ corresponds to the PDF of $X_i$ when $\Delta_i=0\iff X_i\ne Y_i$. Therefore $X_i=Z_i$ and the PDF is the same as the PDF of $Z_i$, i.e. $\mu e^{-\mu x}$. Similarly, for $\delta=1$. We find: $$ f_{X_i,\Delta_i}(x,\delta)=\frac{\mu^2}{\lambda+\mu}e^{-\mu x}+\frac{\lambda^2}{\lambda+\mu}e^{-\lambda x}.$$
      The answer should be $(\lambda e^{-(\mu+\lambda)x})^{\delta}(\mu e^{-(\mu+\lambda)x})^{1-\delta}$.
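(A quick numerical sanity check of the first bullet, not part of the exam: for i.i.d. exponential data the log-likelihood $n\log\lambda-\lambda\sum_i Y_i$ is maximized at $\hat{\lambda}=1/\bar{Y}_n$, and likewise for $\hat{\mu}$. A minimal simulation sketch, with the parameter values chosen arbitrarily:)

```python
import random

random.seed(0)
lam, mu, n = 2.0, 3.0, 200_000  # arbitrary true parameters and sample size

# Simulate Y_i ~ Exp(lam) and Z_i ~ Exp(mu) independently.
Y = [random.expovariate(lam) for _ in range(n)]
Z = [random.expovariate(mu) for _ in range(n)]

# MLEs from the fully observed data: the reciprocal of each sample mean.
lam_hat = n / sum(Y)
mu_hat = n / sum(Z)
print(lam_hat, mu_hat)  # both should be close to (2.0, 3.0)
```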

Where is my mistake?

Thanks.

Best Answer

The first problem is here: you should write $$ f_{X_i,\Delta_i}(x,\delta) = f_{\Delta_i}(\delta)f_{X_i\mid\Delta_i}(x|\delta), $$ without the sum. The function $f_{X_i,\Delta_i}(x,\delta)$ depends on both $x$ and $\delta$; if you sum over $\delta$, you get the marginal density $f_{X_i}(x)$, not the joint density.

The second problem: $f_{X_i\mid\Delta_i}(x|1)\neq f_{Y_1}(x)$ and $f_{X_i\mid\Delta_i}(x|0)\neq f_{Z_1}(x)$. Here $f_{X_i\mid\Delta_i}(x|1)$ is the pdf of the conditional distribution of $Y_i$ given $Y_i<Z_i$. Find its CDF first: $$ F_{X_i|\Delta_i}(x|1) = \mathbb P(Y_i \leq x \mid Y_i <Z_i) = \frac{\mathbb{P}(Y_i\leq x,\, Y_i<Z_i)}{\mathbb P(Y_i<Z_i)} =\frac{\int_0^x \lambda e^{-\lambda y} \int_y^\infty \mu e^{-\mu z}\, dz\, dy}{\frac{\lambda}{\lambda+\mu}} = 1-e^{-(\lambda+\mu)x}. $$

This is the same as the unconditional distribution of $X_i$. So the conditional pdf is $$ f_{X_i\mid\Delta_i}(x|1) = (\lambda+\mu)e^{-(\lambda+\mu)x} $$ and therefore $$ f_{X_i, \Delta_i}(x,1) =\frac{\lambda}{\lambda+\mu} \cdot (\lambda+\mu)e^{-(\lambda+\mu)x} = \lambda e^{-(\lambda+\mu)x}. $$ In the same way, $$ f_{X_i, \Delta_i}(x,0) =\frac{\mu}{\lambda+\mu} \cdot (\lambda+\mu)e^{-(\lambda+\mu)x} = \mu e^{-(\lambda+\mu)x}. $$

To write this as a single expression, we can either use the indicators $\mathbb 1_{\{\delta=1\}}=\delta$ and $\mathbb 1_{\{\delta=0\}}=1-\delta$: $$ f_{X_i, \Delta_i}(x,\delta) =\delta\lambda e^{-(\lambda+\mu)x}+ (1-\delta)\mu e^{-(\lambda+\mu)x}, $$ or use powers (more convenient in most cases): $$ f_{X_i, \Delta_i}(x,\delta) =\left(\lambda e^{-(\lambda+\mu)x}\right)^\delta \cdot \left(\mu e^{-(\lambda+\mu)x}\right)^{1-\delta}. $$
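(For part 2 of the question, the answer's joint density gives the log-likelihood $\left(\sum_i\delta_i\right)\log\lambda+\left(n-\sum_i\delta_i\right)\log\mu-(\lambda+\mu)\sum_i x_i$, whose maximizers are $\hat{\lambda}=\sum_i\delta_i/\sum_i x_i$ and $\hat{\mu}=\sum_i(1-\delta_i)/\sum_i x_i$. A simulation sketch of this, with variable names and parameter values of my own choosing:)

```python
import random

random.seed(1)
lam, mu, n = 2.0, 3.0, 200_000  # arbitrary true parameters and sample size

# Simulate the latent pairs (Y_i, Z_i), but keep only (X_i, Delta_i).
X, D = [], []
for _ in range(n):
    y = random.expovariate(lam)
    z = random.expovariate(mu)
    X.append(min(y, z))
    D.append(1 if y < z else 0)

# Censored-data MLEs: maximizers of
#   (sum D) * log(lam) + (n - sum D) * log(mu) - (lam + mu) * sum(X).
lam_hat = sum(D) / sum(X)
mu_hat = (n - sum(D)) / sum(X)
print(lam_hat, mu_hat)  # both should be close to (2.0, 3.0)
```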
