Find the Maximum-Likelihood Estimator (MLE) for $\alpha$

maximum likelihood, parameter estimation, statistics

Consider the following PDF:

$$w_{\alpha,\beta}(x):=\alpha \beta x^{\beta-1}e^{-\alpha x^{\beta}} \mathbf{1}_{(0,\infty)}(x)$$

This is the Weibull distribution, often used in materials science. Assume $\beta$ is known and we want to estimate $\alpha$. Let $X_1,\ldots,X_n$ be i.i.d. Weibull-distributed.

  1. Find the MLE $\hat{\alpha}$ for the parameter $\alpha$.
  2. Find a $c \in \mathbb R$ such that $c \cdot \hat{\alpha}$ is an unbiased estimator of $\alpha$.

Question: The result I am getting for the MLE doesn't look correct, but I don't know what I am doing wrong. For Part 2, do I just have to show that $\operatorname{E}(c\hat{\alpha}-\alpha)=0$?

My attempt:

Step 1: Write down the likelihood function:

$$L(\alpha)=\prod_{i=1}^n \alpha \beta x_i^{\beta-1}e^{-\alpha x_i^{\beta}}$$

Step 2: Take the natural log:

$$ \begin{aligned}\ln(L(\alpha))&=\sum_{i=1}^n \ln\left(\alpha \beta x_i^{\beta-1}e^{-\alpha x_i^{\beta}} \right)
\\[5pt] &=\sum_{i=1}^n\left[\ln\left(\alpha\beta x_i^{\beta-1}\right)+\ln \left( e^{-\alpha x_i^{\beta}}\right)\right]
\\[5pt] &=\sum_{i=1}^n\left[\ln\left(\alpha \right)+\ln(\beta)+(\beta-1)\ln\left(x_i \right)-\alpha x_i^{\beta}\right] \end{aligned}$$

Step 3: Differentiate and set equal to zero:

$$\begin{aligned}&\frac{\partial }{\partial \alpha}\ln(L(\alpha))=\sum_{i=1}^n \left(\frac{1}{\alpha}-x_i^{\beta}\right)=0
\\[5pt] &\iff \sum_{i=1}^n\frac{1}{\alpha}=\sum_{i=1}^n x_i^{\beta}
\\[5pt] & \iff \frac{n}{\alpha}=\sum_{i=1}^nx_i^{\beta} \iff \hat{\alpha}=\frac{n}{\sum_{i=1}^nx_i^{\beta}}\end{aligned}$$
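
As a quick sanity check (not part of the exercise), you can simulate data and compare this closed-form $\hat\alpha$ with a numerical maximizer of the log-likelihood. The parameter values below are arbitrary illustrations; the sampling step uses the fact that NumPy's `weibull(beta)` draws from the standard Weibull pdf $\beta w^{\beta-1}e^{-w^{\beta}}$, so scaling by $\alpha^{-1/\beta}$ gives the pdf from the question.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
alpha, beta, n = 2.0, 1.5, 10_000   # illustrative values, not from the post

# X = alpha**(-1/beta) * W, with W ~ standard Weibull(beta), has the pdf
# alpha * beta * x**(beta-1) * exp(-alpha * x**beta) from the question.
x = alpha ** (-1.0 / beta) * rng.weibull(beta, size=n)

alpha_hat = n / np.sum(x ** beta)   # closed-form MLE from Step 3

def neg_log_lik(a):
    # negative of ln L(a) = n*ln(a) + n*ln(beta) + (beta-1)*sum(ln x) - a*sum(x**beta)
    return -(n * np.log(a) + n * np.log(beta)
             + (beta - 1) * np.sum(np.log(x)) - a * np.sum(x ** beta))

res = minimize_scalar(neg_log_lik, bounds=(1e-6, 50.0), method="bounded")
print(alpha_hat, res.x)   # both should be close to alpha = 2.0
```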

For Part 2 I was thinking of setting $c=\alpha \bar{X}$, where $\bar{X}$ is the average of the $x_i^{\beta}$'s. Then it would follow that
$$E\left[\alpha \cdot \frac{1}{n}\sum_{i=1}^n x_i^{\beta} \cdot \frac{n}{\sum_{i=1}^n x_i^{\beta}}\right]=E[\alpha]=\alpha.$$

But I am not sure if I am allowed to set $c$ equal to that.

Best Answer

Hint to get you going: $X^\beta$ is distributed exponentially with parameter $\alpha$ [use the "transformation technique" for this].

Transformation technique: $Y=X^\beta\Rightarrow s(y)=x=y^{1/\beta}\Rightarrow \frac{ds(y)}{dy}=\frac 1\beta y^{\frac1\beta-1}$

$g(y)=\alpha\beta (y^{1/\beta})^{\beta-1}e^{-\alpha(y^{1/\beta})^\beta}\cdot \frac 1\beta y^{\frac1\beta-1}=\alpha e^{-\alpha y}$
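
If you want an empirical confirmation of the hint, a small simulation (illustrative values again; SciPy parameterizes the exponential by scale $=1/\alpha$) can KS-test $Y=X^\beta$ against $\operatorname{Exp}(\alpha)$:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
alpha, beta, n = 2.0, 1.5, 10_000

x = alpha ** (-1.0 / beta) * rng.weibull(beta, size=n)   # same sampler as above
y = x ** beta

# KS test of Y against Exp(alpha); scipy's expon takes (loc, scale) = (0, 1/alpha).
print(stats.kstest(y, "expon", args=(0, 1.0 / alpha)))   # expect a large p-value
```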

It follows that $Z:=\sum_{i=1}^n X_i^\beta$, as a sum of $n$ i.i.d. $\operatorname{Exp}(\alpha)$ variables, is $\operatorname{Gamma}(n,\alpha)$-distributed, so $\hat\alpha=n/Z$. You then calculate the expectation of $\hat\alpha$ with respect to the Gamma pdf and unbias it by multiplying with a suitable $c\in\mathbb R$. (A numerical check of the Gamma claim is sketched below.)
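
Here is that numerical check: draw many replications of $Z$ and KS-test against $\operatorname{Gamma}(n,\alpha)$ (SciPy's `gamma` takes shape $a=n$ and scale $=1/\alpha$; the values below are illustrative).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
alpha, beta, n, reps = 2.0, 1.5, 25, 5_000

x = alpha ** (-1.0 / beta) * rng.weibull(beta, size=(reps, n))
z = np.sum(x ** beta, axis=1)   # reps independent copies of Z

# KS test of Z against Gamma(n, rate alpha): args = (shape, loc, scale).
print(stats.kstest(z, "gamma", args=(n, 0, 1.0 / alpha)))   # expect a large p-value
```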

$$\begin{split}E(\hat\alpha)&=\int_0^\infty\frac n{z}\cdot\frac{\alpha^n}{(n-1)!}z^{n-1}e^{-\alpha z}\,dz\\ &=\alpha\frac n{n-1}\int_0^\infty \frac{\alpha^{n-1}}{(n-2)!}z^{n-2}e^{-\alpha z}\,dz\\ &=\alpha \frac{n}{n-1},\end{split}$$

since the remaining integrand is exactly the $\operatorname{Gamma}(n-1,\alpha)$ pdf and integrates to $1$ (this requires $n\ge 2$).

I get:
$$c=\frac{n-1}{n},$$
so that
$$E\left(c\hat\alpha - \alpha\right)=0,$$
i.e. $c\hat\alpha=\frac{n-1}{\sum_{i=1}^n X_i^\beta}$ is unbiased for $\alpha$.
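
A short Monte Carlo run (same illustrative setup as above) reproduces both the bias factor $\frac{n}{n-1}$ and the unbiasedness of $c\hat\alpha$:

```python
import numpy as np

rng = np.random.default_rng(3)
alpha, beta, n, reps = 2.0, 1.5, 25, 200_000

x = alpha ** (-1.0 / beta) * rng.weibull(beta, size=(reps, n))
alpha_hat = n / np.sum(x ** beta, axis=1)   # one MLE per replication

print(alpha_hat.mean())                     # ~ alpha * n/(n-1) = 2.0833...
print(((n - 1) / n * alpha_hat).mean())     # ~ alpha = 2.0
```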
