Use MGF to show $\hat\beta$ is a consistent estimator of $\beta$

Tags: convergence-divergence, statistical-inference, statistics

Suppose that $X_1, \dots, X_n$ is a random sample from a gamma distribution with parameters $\alpha = 2$ and $\beta$, i.e., with density

\begin{equation}
f(x) = \frac{x\, e^{-x/\beta}}{\beta^2}, \qquad x > 0.
\end{equation}

(a) Find the maximum likelihood estimator $\hat\beta$ of $\beta$ and show it is unbiased.

(b) Use the moment generating function (MGF) to show that $\hat\beta$ is a consistent estimator of $\beta$.

I know the answer for part (a): $\hat\beta = \frac{1}{2n} \sum X_i$, and I can then show $E[\hat\beta] = \beta$.
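For completeness, here is a sketch of part (a) from the given density. The log-likelihood is

$$\ell(\beta) = \sum_{i=1}^n \ln x_i - \frac{1}{\beta}\sum_{i=1}^n x_i - 2n \ln \beta,$$

and setting $\ell'(\beta) = \frac{1}{\beta^2}\sum x_i - \frac{2n}{\beta} = 0$ gives $\hat\beta = \frac{1}{2n}\sum X_i$. Since $E[X_i] = \alpha\beta = 2\beta$ for a gamma random variable, $E[\hat\beta] = \frac{1}{2n} \cdot n \cdot 2\beta = \beta$, so $\hat\beta$ is unbiased.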

I need help with part (b). I know how to show that an estimator is consistent using the definition of convergence in probability or Chebyshev's inequality, but I have no idea how to use the MGF to show that an estimator is consistent.

Best Answer

Show that the MGF of $\hat{\beta}$, $\mathbb{E}[\exp(\hat{\beta} t)]$, converges to $\exp(\beta t)$ as $n \to \infty$. Note that $\exp(\beta t)$ is the MGF of the degenerate random variable that equals $\beta$ with probability $1$.
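To carry this out: each $X_i \sim \text{Gamma}(2, \beta)$, so $S = \sum_{i=1}^n X_i \sim \text{Gamma}(2n, \beta)$, whose MGF is $M_S(t) = (1 - \beta t)^{-2n}$. Since $\hat\beta = S/(2n)$,

$$M_{\hat\beta}(t) = M_S\!\left(\frac{t}{2n}\right) = \left(1 - \frac{\beta t}{2n}\right)^{-2n}, \qquad |t| < \frac{2n}{\beta},$$

and by the standard limit $(1 + x/m)^m \to e^x$,

$$\left(1 - \frac{\beta t}{2n}\right)^{-2n} = \left[\left(1 - \frac{\beta t}{2n}\right)^{2n}\right]^{-1} \longrightarrow \left(e^{-\beta t}\right)^{-1} = e^{\beta t} \quad \text{as } n \to \infty.$$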

Pointwise convergence of MGFs on an open interval around $0$ implies convergence in distribution (the continuity theorem for MGFs).

Convergence in distribution to a constant implies convergence in probability to that same constant. Together, these give $\hat\beta \xrightarrow{p} \beta$, i.e., consistency.
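As a quick numerical sanity check (with hypothetical values $\beta = 1.5$, $t = 0.4$ chosen only for illustration): using that $\sum X_i \sim \text{Gamma}(2n, \beta)$, the MGF of $\hat\beta$ works out to $(1 - \beta t/(2n))^{-2n}$, and one can watch it approach $e^{\beta t}$ as $n$ grows:

```python
import math

def mgf_beta_hat(t, beta, n):
    # MGF of beta_hat = (1/(2n)) * sum(X_i), where sum(X_i) ~ Gamma(2n, beta):
    # M(t) = (1 - beta*t/(2n))^(-2n), valid for t < 2n/beta
    return (1 - beta * t / (2 * n)) ** (-2 * n)

beta, t = 1.5, 0.4          # hypothetical example values
limit = math.exp(beta * t)  # MGF of the constant beta, evaluated at t

for n in (10, 100, 1000, 10000):
    print(n, mgf_beta_hat(t, beta, n), limit)
```

The printed values decrease toward $e^{\beta t}$, consistent with the limit derived above.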
