Deriving the MLE of $\theta$ for $f_X(x;\alpha,\theta)=\frac{1}{(\alpha-1)!\theta^{\alpha}}x^{\alpha-1}e^{-x/\theta}$


Let $\{X_1,X_2,\cdots,X_n\}$ be a random sample of size $n$ from the following probability density function:
$$f_X(x;\alpha,\theta)=\begin{cases}
\dfrac{1}{(\alpha-1)!\theta^{\alpha}}x^{\alpha-1}e^{-x/\theta}; & x>0\\
0; & x \leqslant 0
\end{cases}$$

where $\alpha$ is a known positive integer (so that $(\alpha-1)!$ is defined), and $\theta>0$.

Derive the MLE of $\theta$ (without verifying the solution is a maximum).

My attempt:

The likelihood function is $$L(\theta)=\prod^n_{i=1}f_X(x_i;\alpha,\theta)=\prod^n_{i=1}\frac{1}{(\alpha-1)!\,\theta^{\alpha}}x_i^{\alpha-1}e^{-x_i/\theta}=\frac{1}{((\alpha-1)!)^n\theta^{n\alpha}}\left(\prod^n_{i=1}x_i\right)^{\alpha-1}\exp\left(-\frac{\sum^n_{i=1}x_i}{\theta}\right)$$

The log-likelihood is $$\ln L(\theta)=-n\ln\left((\alpha-1)!\right)-n\alpha\ln\theta+(\alpha-1)\sum\ln x_i-\frac{\sum x_i}{\theta}$$

Setting the derivative of the log-likelihood with respect to $\theta$ to $0$ (the first and third terms are constant in $\theta$) yields:

$$\frac{d}{d\theta}\ln L(\theta)=-\frac{n\alpha}{\theta}+\frac{\sum x_i}{\theta^2}=0 \Leftrightarrow \frac{\sum x_i-n\alpha\theta}{\theta^2}=0 \Leftrightarrow \sum x_i-n\alpha\theta=0 \Leftrightarrow \hat{\theta}=\frac{\sum x_i}{n\alpha}$$

Is this correct?

Best Answer

Just a useful trick:

To avoid a lot of calculation, you can drop from the likelihood any factor that does not depend on the unknown parameter; likelihoods are equivalent up to a multiplicative constant.

So your density (which is a known density: $X\sim \text{Gamma}(\alpha,\theta)$ or, equivalently, an Erlang distribution) gives the following likelihood

$$L(\theta)\propto \theta^{-n\alpha}e^{-\Sigma_i X_i/\theta}$$

$$l(\theta)=-n \alpha \log\theta-\frac{\Sigma_i X_i}{\theta}$$

$$l^*(\theta)=-\frac{n\alpha}{\theta}+\frac{\Sigma_i X_i}{\theta^2}$$

Setting it equal to $0$ and solving for $\theta$, you get

$$\hat{\theta}_{ML}=\frac{\Sigma_i X_i}{n\alpha}$$
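As a quick numerical sanity check (my own sketch, not part of the derivation above), you can simulate a Gamma$(\alpha,\theta)$ sample and compare the closed-form MLE $\sum_i X_i/(n\alpha)$ against a numerical maximizer of the log-likelihood; the example values below are arbitrary:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Simulate a Gamma(alpha, theta) sample with arbitrary example values.
rng = np.random.default_rng(0)
alpha, theta_true, n = 3, 2.0, 10_000
x = rng.gamma(shape=alpha, scale=theta_true, size=n)

# Closed-form MLE from the derivation above.
theta_hat = x.sum() / (n * alpha)

# Numerical MLE: minimize the negative log-likelihood (up to constants).
def neg_loglik(t):
    return n * alpha * np.log(t) + x.sum() / t

res = minimize_scalar(neg_loglik, bounds=(1e-6, 100.0), method="bounded")

print(theta_hat, res.x)  # the two estimates should agree closely
```

Both values should also land near the true $\theta=2$, since the MLE is consistent.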


Legend:

$L=$ likelihood

$l=$ log-likelihood

$l^{*}=$ first derivative of $l$, a.k.a. "Score"
