Derive the likelihood function $L(\theta;\underline{Y})$ and thus the maximum likelihood estimator $\hat{\theta}(\underline{Y})$ for $\theta$. Show that the MLE is unbiased.

Tags: maximum-likelihood, statistical-inference, statistics

Let $\underline{Y}= (Y_1,\dots, Y_n)$ be an i.i.d. random sample from a Weibull distribution, with probability density function given by $f(y; \lambda)= \frac{k}{\lambda}\left(\frac{y}{\lambda}\right)^{k-1}\exp\left\{-\left(\frac{y}{\lambda}\right)^k\right\}$ for $y > 0$, where $k > 0$ is a known shape parameter, and $\lambda$ is an unknown scale parameter taking values in $\mathbb{R}^+$.

Consider the parametrisation $\theta = \lambda^k$.

Derive the likelihood function $L(\theta; \underline{Y})$ and thus the maximum likelihood estimator $\hat{\theta}(\underline{Y})$ for $\theta$. Show that the MLE is unbiased.

What I know so far

Take the product of the pdfs over the $n$ observations to find the likelihood function. Take the log, differentiate, set the derivative to $0$, and solve for the MLE. If the expectation of the estimator equals $\theta$, then the estimator is unbiased. I know the method but I am unsure of how to actually put it into practice. Any help would be greatly appreciated.

Best Answer

First, rewrite the density with the new parametrisation:

$$f(y|\theta)=\frac{ky^{k-1}}{\theta}e^{-\frac{y^k}{\theta}}$$
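To see where this comes from (a step worth making explicit), substitute $\lambda = \theta^{1/k}$, equivalently $\lambda^k = \theta$, into the original density:

$$f(y;\lambda)=\frac{k}{\lambda}\Big(\frac{y}{\lambda}\Big)^{k-1}e^{-(y/\lambda)^k}=\frac{k\,y^{k-1}}{\lambda^{k}}\,e^{-y^k/\lambda^k}=\frac{k\,y^{k-1}}{\theta}\,e^{-\frac{y^k}{\theta}}$$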

Next, calculate the likelihood:

$$L(\theta)\propto \theta^{-n}e^{-\frac{\sum_i y_i^k}{\theta}}$$
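Explicitly, by independence the likelihood is the product of the individual densities, and factors not involving $\theta$ are absorbed into the proportionality constant:

$$L(\theta;\underline{y})=\prod_{i=1}^{n}\frac{k\,y_i^{k-1}}{\theta}e^{-\frac{y_i^k}{\theta}}=\Big(\prod_{i=1}^{n}k\,y_i^{k-1}\Big)\,\theta^{-n}\,e^{-\frac{\sum_i y_i^k}{\theta}}\propto\theta^{-n}e^{-\frac{\sum_i y_i^k}{\theta}}$$

This also addresses the point raised in the question: the likelihood is a product of the pdfs, not a sum; the sum only appears after taking logs.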

Taking logs, the log-likelihood is $l(\theta)=\text{const}-n\log\theta-\frac{\sum_i y_i^k}{\theta}$. Proceeding with the calculation, you find that the score function (the derivative of the log-likelihood with respect to $\theta$) is

$$l^*=-\frac{n}{\theta}+\frac{1}{\theta^2}\sum_i y_i^k$$

Setting $l^*=0$ and solving for $\theta$ gives

$$T=\hat{\theta}_{ML}=\frac{\sum_i y_i^k}{n}$$
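To confirm that this stationary point is indeed a maximum (a step left implicit above), note that the second derivative of the log-likelihood, evaluated at $\hat{\theta}$ and using $\sum_i y_i^k=n\hat{\theta}$, is negative:

$$l^{**}\big|_{\theta=\hat{\theta}}=\frac{n}{\hat{\theta}^2}-\frac{2\sum_i y_i^k}{\hat{\theta}^3}=\frac{n}{\hat{\theta}^2}-\frac{2n}{\hat{\theta}^2}=-\frac{n}{\hat{\theta}^2}<0$$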

To show that $\mathbb{E}[T]=\theta$, let's rewrite the score function in the following way:

$$l^*=-\frac{n}{\theta}+\frac{nT}{\theta^2}$$

Now, simply remembering the first Bartlett identity, namely

$$\mathbb{E}[l^*]=0$$
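(For completeness: this identity holds under the usual regularity conditions, because differentiating $\int f(y\mid\theta)\,dy=1$ under the integral sign gives

$$0=\int \frac{\partial f(y\mid\theta)}{\partial\theta}\,dy=\int \frac{\partial \log f(y\mid\theta)}{\partial\theta}\,f(y\mid\theta)\,dy=\mathbb{E}\bigg[\frac{\partial \log f(Y\mid\theta)}{\partial\theta}\bigg],$$

and the sample score $l^*$ is the sum of $n$ such single-observation terms.)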

you get

$$\frac{n}{\theta}=\frac{n\mathbb{E}[T]}{\theta^2}$$

that is

$$\mathbb{E}[T]=\theta$$

To calculate its variance, use the second Bartlett identity, which states that

$$\mathbb{E}[l^{**}]=-\mathbb{E}[(l^*)^2]$$

This identity leads to

$$\mathbb{V}\Bigg[\frac{nT}{\theta^2}-\frac{n}{\theta}\Bigg]=-\mathbb{E}\Bigg[\frac{n}{\theta^2}-\frac{2nT}{\theta^3}\Bigg]$$
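Spelling out both sides: since $\mathbb{E}[l^*]=0$, the second identity says $\mathbb{V}[l^*]=-\mathbb{E}[l^{**}]$. The left-hand side above is exactly $\mathbb{V}[l^*]$ written in terms of $T$, while the right-hand side uses $l^{**}=\frac{n}{\theta^2}-\frac{2nT}{\theta^3}$ together with the unbiasedness $\mathbb{E}[T]=\theta$:

$$-\mathbb{E}\Bigg[\frac{n}{\theta^2}-\frac{2nT}{\theta^3}\Bigg]=-\frac{n}{\theta^2}+\frac{2n\,\mathbb{E}[T]}{\theta^3}=\frac{n}{\theta^2}$$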

that is

$$\frac{n^2}{\theta^4}\mathbb{V}[T]=\frac{n}{\theta^2}$$

$$\mathbb{V}[T]=\frac{\theta^2}{n}$$


Alternative method to calculate the expectation and variance of $T$

Simply transforming

$$W=Y^k$$

you get that $W\sim \text{Exp}\big(\frac{1}{\theta}\big)$, i.e. exponential with rate $\frac{1}{\theta}$ and mean $\theta$; indeed $P(W>w)=P\big(Y>w^{1/k}\big)=e^{-w/\theta}$ for $w>0$. Thus

$$T\sim \text{Gamma}\Big(n,\frac{n}{\theta}\Big)$$
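Here $\text{Gamma}\big(n,\frac{n}{\theta}\big)$ is in the shape-rate convention. The step uses the standard fact that a sum of $n$ i.i.d. $\text{Exp}\big(\frac{1}{\theta}\big)$ variables has a $\text{Gamma}\big(n,\frac{1}{\theta}\big)$ distribution, and dividing by $n$ multiplies the rate by $n$:

$$\sum_i W_i\sim \text{Gamma}\Big(n,\frac{1}{\theta}\Big)\;\Longrightarrow\; T=\frac{1}{n}\sum_i W_i\sim \text{Gamma}\Big(n,\frac{n}{\theta}\Big)$$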

thus you immediately get (recalling that a $\text{Gamma}(\alpha,\beta)$ variable in the shape-rate convention has mean $\alpha/\beta$ and variance $\alpha/\beta^2$)

$$\mathbb{E}[T]=\frac{n}{\frac{n}{\theta}}=\theta$$

$$\mathbb{V}[T]=\frac{n}{\Big(\frac{n}{\theta}\Big)^2}=\frac{\theta^2}{n}$$
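
As a quick numerical sanity check (not part of the derivation above), here is a minimal Monte Carlo sketch in Python with NumPy; the values $k=2$, $\theta=3$ and $n=50$ are arbitrary choices for illustration:

```python
import numpy as np

# Monte Carlo check of E[T] = theta and Var[T] = theta^2 / n.
# k, theta, n, reps are arbitrary illustrative values.
rng = np.random.default_rng(seed=42)
k, theta, n, reps = 2.0, 3.0, 50, 200_000
lam = theta ** (1.0 / k)  # scale parameter, since theta = lambda^k

# Each row is one i.i.d. sample of size n from Weibull(shape k, scale lam).
y = lam * rng.weibull(k, size=(reps, n))

# The MLE T = (1/n) * sum_i Y_i^k, computed once per sample.
T = (y ** k).mean(axis=1)

print("mean of T:", T.mean(), "theory:", theta)          # ~ 3.0  (unbiased)
print("var of T: ", T.var(), "theory:", theta ** 2 / n)  # ~ 0.18 (theta^2/n)
```

Both empirical values converge to the theoretical ones as `reps` grows, for any choice of $k$ and $\theta$.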