[Math] Deriving the maximum likelihood estimator

maximum-likelihood, probability, probability-theory, statistics

Suppose $X_1, X_2, X_3 \stackrel{\text{i.i.d.}}{\sim} \operatorname{Exp}(\theta)$.

Exercise: derive the maximum likelihood estimator based on $X = (X_1, X_2, X_3)$.

What I've tried: the likelihood is given by $\prod\limits_{i = 1}^3 f(X_i\mid \theta) \, d\theta = \prod\limits_{i = 1}^3 \theta e^{-3\theta x} \, d\theta$. The log-likelihood is given by $\log L = 3\log\theta - 3\theta x \log(e) = 3\log\theta - 3\theta x.$ Taking the derivative and setting it equal to $0$, I get $\hat{\theta} = \frac{1}{x}$.

My question:

  • How do I derive the maximum likelihood estimator based on $X = (X_1, X_2, X_3)$? I know my solution is probably not correct, but I don't know what else I should try.

Best Answer

You're really close. Two small fixes: the density of $\operatorname{Exp}(\theta)$ is $f(x \mid \theta) = \theta e^{-\theta x}$, so there is no factor of $3$ in each exponent, and the likelihood is a function of $\theta$, not an integrand, so there is no $d\theta$.

Note that $$L(\theta)=\prod_{i=1}^{3}f(X_i \mid \theta)=\prod_{i=1}^{3}\theta e^{-\theta X_i} = \theta^3 e^{-\theta \sum_{i=1}^{3}X_i}\text{.}$$ The log-likelihood is given by $$\ell(\theta) = 3\ln \theta-\theta\sum_{i=1}^{3}X_i\text{,}$$ whose derivative is $$\ell^{\prime}(\theta) = \dfrac{3}{\theta}-\sum_{i=1}^{3}X_i\text{.}$$ Setting this equal to $0$ and solving gives $$\hat{\theta}=\dfrac{3}{\sum_{i=1}^{3}X_i}\text{.}$$ Note that you may need to show that $\hat{\theta}$ is indeed the value of $\theta$ which maximizes $\ell$; to do this, notice that $$\ell^{\prime\prime}(\theta)=-\dfrac{3}{\theta^2}<0$$ regardless of the value of $\theta > 0$.
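If you want to sanity-check the algebra, here is a minimal numerical sketch in Python (the true rate `theta_true = 2.0` and the seed are hypothetical values chosen just for illustration): it simulates three $\operatorname{Exp}(\theta)$ draws, computes the closed-form estimate $\hat{\theta} = 3/\sum_i X_i$, and confirms that directly maximizing the log-likelihood lands on the same value.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(42)       # seed chosen arbitrarily for reproducibility
theta_true = 2.0                      # hypothetical rate parameter for the simulation
x = rng.exponential(scale=1/theta_true, size=3)   # X_1, X_2, X_3 ~ Exp(theta_true)

# Closed-form MLE from the derivation above: theta_hat = 3 / sum(X_i)
theta_hat = len(x) / x.sum()

# Numerical check: maximize ell(theta) = 3*log(theta) - theta*sum(X_i)
# (equivalently, minimize its negative over theta > 0)
neg_loglik = lambda t: -(len(x) * np.log(t) - t * x.sum())
res = minimize_scalar(neg_loglik, bounds=(1e-6, 50.0), method="bounded")

print(f"closed form: {theta_hat:.6f}, numerical: {res.x:.6f}")
# The two values agree up to optimizer tolerance.
```

The bounded method keeps the optimizer away from $\theta \le 0$, where $\ln\theta$ is undefined.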
