[Math] Maximum Likelihood Estimate with Multiple Parameters

maximum-likelihood, parameter-estimation, probability-distributions, statistical-inference, statistics

I am not very familiar with multivariable calculus, but something tells me that I don't need to be in order to solve this problem; take a look:

Suppose that $X_1,…,X_m$ and $Y_1,…,Y_n$ are independent exponential random variables with $X_i\sim EXP(\lambda)$ and $Y_j\sim EXP(\theta \lambda)$.

Find the $MLE$ of $\lambda$ and $\theta$.

Finding the MLE of $\lambda$ is simple; ignoring the $Y_j$ altogether and looking only at the $X_i$, it turns out to be $m/\sum x_i = 1/\bar{x}$. However, for $\theta$ I am no longer sure, since the distribution of the $Y_j$ also depends on $\lambda$. I don't know whether I need to go as far as finding the gradient or whether I can somehow use my previous result; either way, I honestly don't know how to proceed.

Any advice would be appreciated.

Best Answer

Deriving the MLE: From your specification of the problem, your log-likelihood function is:

$$\begin{equation} \begin{aligned} \ell_{\boldsymbol{x},\boldsymbol{y}}(\theta, \lambda) &= \sum_{i=1}^m \ln p (x_i \mid \lambda) + \sum_{j=1}^n \ln p (y_j \mid \theta, \lambda) \\[8pt] &= \sum_{i=1}^m (\ln \lambda - \lambda x_i) + \sum_{j=1}^n (\ln \theta + \ln \lambda - \theta \lambda y_j) \\[8pt] &= m ( \ln \lambda - \lambda \bar{x} ) + n ( \ln \theta + \ln \lambda - \theta \lambda \bar{y}). \end{aligned} \end{equation}$$
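If it helps to see the simplification numerically, here is a small sketch (names are illustrative) checking that the collapsed form $m(\ln\lambda - \lambda\bar{x}) + n(\ln\theta + \ln\lambda - \theta\lambda\bar{y})$ agrees with the direct sum of log-densities:

```python
import numpy as np

# Sanity check: the simplified log-likelihood should equal the direct
# sum of per-observation log-densities for arbitrary data and parameters.
rng = np.random.default_rng(2)
x = rng.exponential(scale=1.0, size=30)  # arbitrary sample for X_i
y = rng.exponential(scale=1.0, size=40)  # arbitrary sample for Y_j
theta, lam = 0.7, 1.3                    # arbitrary parameter values

# Direct sum: ln(rate) - rate * obs, with rates lam and theta*lam.
direct = (np.sum(np.log(lam) - lam * x)
          + np.sum(np.log(theta * lam) - theta * lam * y))

# Collapsed form in terms of the sample means.
simplified = (len(x) * (np.log(lam) - lam * x.mean())
              + len(y) * (np.log(theta) + np.log(lam) - theta * lam * y.mean()))

print(np.isclose(direct, simplified))
```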

This gives the score functions:

$$\begin{equation} \begin{aligned} \frac{\partial \mathcal{l}_{\boldsymbol{x},\boldsymbol{y}}}{\partial \theta}(\theta, \lambda) &= n \Big( \frac{1}{\theta} - \lambda \bar{y} \Big), \\[8pt] \frac{\partial \mathcal{l}_{\boldsymbol{x},\boldsymbol{y}}}{\partial \lambda}(\theta, \lambda) &= m \Big( \frac{1}{\lambda} - \bar{x} \Big) + n \Big( \frac{1}{\lambda} - \theta \bar{y} \Big). \end{aligned} \end{equation}$$

Setting both partial derivatives to zero, the first score equation gives $\theta \lambda \bar{y} = 1$, i.e. $n\theta\bar{y} = n/\lambda$; substituting this into the second equation cancels the $Y$-terms, leaving $m(1/\lambda - \bar{x}) = 0$. Solving yields the MLEs:

$$\hat{\theta} = \frac{\bar{x}}{\bar{y}} \quad \quad \quad \hat{\lambda} = \frac{1}{\bar{x}}.$$
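As a consistency check (a simulation sketch, not part of the derivation; all variable names are illustrative), drawing large samples with known $(\lambda, \theta)$ and plugging the data into these formulas should recover values close to the truth:

```python
import numpy as np

# Simulate from the model with known parameters and compare the
# closed-form MLEs against the true values.
rng = np.random.default_rng(0)
lam_true, theta_true = 2.0, 0.5
m, n = 100_000, 100_000

# X_i ~ Exp(rate=lam), Y_j ~ Exp(rate=theta*lam); numpy's exponential
# takes the scale (1/rate) as its argument.
x = rng.exponential(scale=1 / lam_true, size=m)
y = rng.exponential(scale=1 / (theta_true * lam_true), size=n)

lam_hat = 1 / x.mean()           # hat(lambda) = 1 / xbar
theta_hat = x.mean() / y.mean()  # hat(theta)  = xbar / ybar

print(lam_hat, theta_hat)  # close to 2.0 and 0.5
```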

(Note that in the case $\bar{y} = 0$ the first score equation is strictly positive for all $\theta > 0$, so the MLE for $\theta$ does not exist.) As user121049 correctly points out, the MLE for $\lambda$ is the same as if you had used only the $x_i$ values.