[Math] Finding an efficient estimator for $ \beta $ in a sample of $ n $ random variables having the $ \text{Gamma}(\alpha,\beta) $-distribution.

statistics

Problem: Suppose that we have i.i.d. random variables $ X_{1},\dots,X_{n} \sim \text{Gamma}(\alpha,\beta) $, where $ \alpha > 0 $ is known. Find an efficient estimator for $ \beta $.

Recall that the probability density function of the $ \text{Gamma}(\alpha,\beta) $-distribution is given by
$$
\forall x > 0: \quad f(x;\alpha,\beta) = \frac{1}{\Gamma(\alpha) \beta^{\alpha}} \cdot x^{\alpha - 1} e^{- x/\beta}.
$$

I am a little lost, but I am guessing that I need to find the Cramér-Rao Lower Bound (CRLB), find an unbiased estimator, and then compare its variance to the CRLB. Any help would be greatly appreciated.

Best Answer

The following is a solution, presented in the spirit of rigorous measure-theoretic statistics.


Fix $ \alpha > 0 $. We are considering the one-parameter family $ (P_{\beta})_{\beta > 0} $ of probability distributions, where each $ P_{\beta} $ is the $ \text{Gamma}(\alpha,\beta) $-distribution. A natural candidate for an efficient estimator for $ (P_{\beta})_{\beta > 0} $ based on a random sample of size $ n $ is the following Borel-measurable function $ T: \mathbb{R}^{n} \to \mathbb{R} $: $$ \forall (x_{1},\ldots,x_{n}) \in \mathbb{R}^{n}: \quad T(x_{1},\ldots,x_{n}) \stackrel{\text{def}}{=} \frac{1}{n \alpha} \sum_{k=1}^{n} x_{k}. $$ To prove that $ T $ is an efficient estimator for $ (P_{\beta})_{\beta > 0} $, we need to do the following (each step is sketched after the list):

  • Prove that $ T $ is an unbiased estimator for $ (P_{\beta})_{\beta > 0} $: $$ \forall \beta > 0: \quad X_{1},\ldots,X_{n} \sim P_{\beta} ~~ \Longrightarrow ~~ \text{E} \left[ T(X_{1},\ldots,X_{n}) \right] = \text{E} \left[ \frac{1}{n \alpha} \sum_{k=1}^{n} X_{k} \right] = \beta. $$

  • Derive the Cramér-Rao Inequality for $ T $ with respect to $ (P_{\beta})_{\beta > 0} $: $$ \forall \beta > 0: \quad X_{1},\ldots,X_{n} \sim P_{\beta} ~~ \Longrightarrow ~~ \text{Var} \left[ T(X_{1},\ldots,X_{n}) \right] \geq \frac{\beta^{2}}{n \alpha}. $$

  • Prove that equality is actually attained by $ T $ in the inequality above: $$ \text{Var} \left[ T(X_{1},\ldots,X_{n}) \right] = \frac{\beta^{2}}{n \alpha}. $$
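
Each of the three steps can be carried out with a short, standard computation; here is a sketch of the first one, using only facts that follow from the density above. The moments of the $ \text{Gamma}(\alpha,\beta) $-distribution follow from the density and the identity $ \Gamma(\alpha + 1) = \alpha \Gamma(\alpha) $: $$ \text{E}[X_{1}] = \int_{0}^{\infty} x \cdot \frac{x^{\alpha - 1} e^{-x/\beta}}{\Gamma(\alpha) \beta^{\alpha}} \, dx = \frac{\Gamma(\alpha + 1) \beta^{\alpha + 1}}{\Gamma(\alpha) \beta^{\alpha}} = \alpha \beta, $$ and the same computation applied to $ \text{E}[X_{1}^{2}] $ gives $ \text{Var}[X_{1}] = \alpha \beta^{2} $. By linearity of expectation, $$ \text{E} \left[ T(X_{1},\ldots,X_{n}) \right] = \frac{1}{n \alpha} \sum_{k=1}^{n} \text{E}[X_{k}] = \frac{1}{n \alpha} \cdot n \alpha \beta = \beta, $$ so $ T $ is unbiased.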
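
For the second bullet, one standard route goes through the Fisher information of a single observation. Under $ P_{\beta} $, the log-density is $ \log f(x;\alpha,\beta) = -\log \Gamma(\alpha) - \alpha \log \beta + (\alpha - 1) \log x - x/\beta $, so $$ I_{1}(\beta) = -\text{E} \left[ \frac{\partial^{2}}{\partial \beta^{2}} \log f(X_{1};\alpha,\beta) \right] = -\text{E} \left[ \frac{\alpha}{\beta^{2}} - \frac{2 X_{1}}{\beta^{3}} \right] = -\frac{\alpha}{\beta^{2}} + \frac{2 \alpha \beta}{\beta^{3}} = \frac{\alpha}{\beta^{2}}. $$ For the i.i.d. sample, $ I_{n}(\beta) = n I_{1}(\beta) = n \alpha / \beta^{2} $, and the Cramér-Rao inequality for an unbiased estimator (the usual regularity conditions hold for this family) reads $$ \text{Var} \left[ T(X_{1},\ldots,X_{n}) \right] \geq \frac{1}{I_{n}(\beta)} = \frac{\beta^{2}}{n \alpha}. $$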
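
For the third bullet, independence reduces the variance of $ T $ to a one-sample computation: $$ \text{Var} \left[ T(X_{1},\ldots,X_{n}) \right] = \frac{1}{n^{2} \alpha^{2}} \sum_{k=1}^{n} \text{Var}[X_{k}] = \frac{1}{n^{2} \alpha^{2}} \cdot n \alpha \beta^{2} = \frac{\beta^{2}}{n \alpha}, $$ so the Cramér-Rao bound is attained. (Equality can also be read off from the score factorization $ \frac{\partial}{\partial \beta} \log L(\beta; x_{1},\ldots,x_{n}) = \sum_{k=1}^{n} \left( -\frac{\alpha}{\beta} + \frac{x_{k}}{\beta^{2}} \right) = \frac{n \alpha}{\beta^{2}} \left( T(x_{1},\ldots,x_{n}) - \beta \right) $, which is exactly the linear-in-$ T $ form that characterizes estimators attaining the bound.)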


Conclusion: $ T $ is an efficient estimator for $ (P_{\beta})_{\beta > 0} $ using a random sample of size $ n $.