Solved – Cramer-Rao lower bound questions

estimation, maximum-likelihood, self-study

I've been reviewing questions from last year's statistics exam. One question concerns the probability density function below

$$\displaystyle f(x,\theta) = \frac 1{2\theta^3}x^2e^{-\frac x\theta}$$

where $\displaystyle0<x<\infty$ and $\displaystyle0<\theta<\infty$.

The question is as follows:

a) Find the maximum likelihood estimator for $\displaystyle\theta$.

b) Find the minimum variance estimator $\widehat{\theta}$ for the given pdf.

c) Is the maximum likelihood estimator obtained in a) efficient and consistent?

I have found the answer to a) as $\widehat\theta=\frac {1} {3n}\sum_{i=1}^n x_i$.
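(A quick numerical sanity check of this estimator, added here as a sketch; the true $\theta = 2$ and the sample size are arbitrary choices, and the Gamma sampler applies because the pdf is a Gamma density with shape $3$ and scale $\theta$:)

```python
import numpy as np

# f(x, theta) = x^2 e^{-x/theta} / (2 theta^3) is the Gamma(shape=3,
# scale=theta) density, so we can sample from it with numpy directly.
rng = np.random.default_rng(0)
theta = 2.0            # arbitrary "true" value for the check
n = 100_000
x = rng.gamma(shape=3, scale=theta, size=n)

# MLE from part a): theta_hat = (1/(3n)) * sum(x_i)
theta_hat = x.mean() / 3
print(theta_hat)       # should be close to theta = 2.0
```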

However, I couldn't carry the Cramer-Rao lower bound computation through to the end. I've shown my work below:

$$\displaystyle \frac {d^2\log(f(x,\theta))} {d\theta^2} = \frac 3 {\theta^2}-2\frac x {\theta^3}$$

I should take the expectation of this, but I'm not sure what I should replace $E[X]$ with. Should I just set it equal to $\displaystyle \theta$ and continue from there?

By the way, could you also give me a hint on how to find the variance of the maximum likelihood estimator above for c)? I'd appreciate it.

Thanks in advance

Best Answer

To close this one:
The density is that of a Gamma distribution with shape parameter $k=3$ and unknown scale parameter $\theta$, so we have $E(X) = 3\theta$ and $\operatorname {Var}(X) = 3\theta^2$. Given an i.i.d. sample, the expected value and variance of the MLE are then

$$E(\hat \theta_{MLE}) = \theta,\;\;\operatorname {Var}(\hat \theta_{MLE}) = \frac 1 {3n}\theta^2$$
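These two moments can be checked empirically; the sketch below (my addition, with an arbitrary $\theta$, sample size, and replication count) estimates the mean and variance of $\hat \theta_{MLE}$ over repeated samples:

```python
import numpy as np

rng = np.random.default_rng(1)
theta, n, reps = 2.0, 50, 100_000   # arbitrary settings for the check

# Draw `reps` independent samples of size n from Gamma(shape=3,
# scale=theta), then compute the MLE theta_hat = mean(x)/3 for each.
samples = rng.gamma(shape=3, scale=theta, size=(reps, n))
theta_hats = samples.mean(axis=1) / 3

print(theta_hats.mean())   # should be close to theta = 2.0
print(theta_hats.var())    # should be close to theta^2/(3n) = 4/150 ~ 0.0267
```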

For consistency, we can use the sufficient conditions: $\lim_{n\rightarrow\infty}E(\hat \theta_{MLE})=\theta$ holds, and $\lim_{n\rightarrow\infty}\operatorname {Var}(\hat \theta_{MLE}) = \lim_{n\rightarrow\infty}\frac 1 {3n}\theta^2=0$ holds too, so the MLE is consistent.

The Central Limit Theorem holds, and so

$$\sqrt n(\hat \theta_{MLE}-\theta) \rightarrow_d \mathcal N(0, \frac 13\theta^2)$$

The Fisher Information is $$\mathcal I(\theta) = -E\left[\frac 3 {\theta^2}-2\frac X {\theta^3}\right] = -\frac 3 {\theta^2} + 2\frac {E(X)}{\theta^3} = -\frac 3 {\theta^2} + 2\frac {3\theta}{\theta^3} = \frac 3{\theta^2}$$
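(This Fisher information calculation can be double-checked symbolically; a sketch using sympy, assuming it is available, that integrates $-\frac{\partial^2}{\partial\theta^2}\log f$ against the density:)

```python
import sympy as sp

x, theta = sp.symbols('x theta', positive=True)
f = x**2 * sp.exp(-x / theta) / (2 * theta**3)

# Fisher information: I(theta) = -E[ d^2/dtheta^2 log f(X, theta) ]
d2 = sp.diff(sp.log(f), theta, 2)
I = sp.integrate(-d2 * f, (x, 0, sp.oo))
print(sp.simplify(I))   # 3/theta**2
```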

and so the Cramer-Rao bound is $\frac 1{n\mathcal I(\theta)} = \frac{\theta^2}{3n}$, which $\operatorname {Var}(\hat \theta_{MLE})$ attains exactly: the MLE is efficient.

As for the question about the minimum-variance estimator, @cardinal's answer here
https://math.stackexchange.com/questions/28779/minimum-variance-unbiased-estimator-for-scale-parameter-of-a-certain-gamma-distr
derives it via a complete and sufficient statistic for this Gamma family.