[Math] Method of moments with a Gamma distribution

gamma-distribution · parameter-estimation · statistics

I'm confused about one specific step in deriving the MOM estimators, rather than the derivation as a whole:

Given a random sample $Y_1, Y_2, \dots, Y_n \sim \operatorname{Gamma}(\alpha, \beta)$, find the MOM estimators.

So I found the population and sample moments:

$\mu_1^{'} = \alpha \beta$
$\mu_2^{'} = \sigma^2 + \mu^2 = \alpha^2 \beta^2 + \alpha \beta^2$
$ m_1^{'} = \overline Y $

$ m_2^{'} = \frac{1}{n} \sum_{i=1}^{n} Y_i^2 $
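(For what it's worth, these sample moments are easy to compute numerically; the Gamma(2, 3) sample below is just an assumed example.)

```python
import numpy as np

# Assumed example: draw a sample from Gamma(shape alpha = 2, scale beta = 3)
rng = np.random.default_rng(0)
y = rng.gamma(shape=2.0, scale=3.0, size=10_000)

m1 = y.mean()         # m_1' = sample mean Ybar
m2 = np.mean(y ** 2)  # m_2' = (1/n) * sum of Y_i^2
```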

Solving the first equation for $\hat \alpha_{MOM}$, I get $\hat \alpha_{MOM} = \frac{\overline Y}{\beta}$.

The solutions say this ends up being equal to:

$ \hat \alpha_{MOM} = \frac{\frac{1}{n} \sum_{i=1}^{n} (Y_i - \overline Y)^2}{n \overline Y} $

I think I'm forgetting some property, because I have no idea how they transformed the first expression for $\hat \alpha_{MOM}$ into the second one.

Best Answer

Based on your expressions for the first and second raw moments, I will assume that the gamma distribution is parametrized by shape $\alpha$ and scale $\beta$; i.e., $$f_Y(y) = \frac{y^{\alpha - 1} e^{-y/\beta}}{\beta^\alpha \Gamma(\alpha)}, \quad y > 0.$$

In that case, equating the raw (uncentered) sample moments to the population moments gives the system $$\begin{align*} \bar y_1 &= \alpha \beta, \\ \bar y_2 &= \alpha^2 \beta^2 + \alpha \beta^2 = \alpha (\alpha + 1) \beta^2, \end{align*}$$ where $\bar y_k = \frac{1}{n} \sum_{i=1}^n y_i^k$ is the $k^{\rm th}$ raw sample moment. This system is easily solved by substitution: the first equation yields $\beta = \bar y_1/\alpha$, and substituting this into the second gives $\bar y_2 = \alpha(\alpha+1)\bar y_1^2/\alpha^2 = \left(1 + \frac{1}{\alpha} \right) \bar y_1^2$. Consequently, $$\hat \alpha = \frac{1}{\bar y_2 / \bar y_1^2 - 1} = \frac{\bar y_1^2}{\bar y_2 - \bar y_1^2}.$$ Substituting this result into the first equation then yields $$\hat \beta = \frac{\bar y_2}{\bar y_1} - \bar y_1.$$ In your notation, this is $$\hat \alpha = \frac{m_1'^2}{m_2' - m_1'^2}, \quad \hat \beta = \frac{m_2'}{m_1'} - m_1'.$$

There is, however, a relationship between the second central sample moment $$\hat \sigma^2 = \frac{1}{n} \sum_{i=1}^n (y_i - \bar y_1)^2$$ and $\bar y_1$ and $\bar y_2$; namely, $$\begin{align*} \hat \sigma^2 &= \frac{1}{n} \sum_{i=1}^n \left( y_i^2 - 2 \bar y_1 y_i + \bar y_1^2 \right) \\ &= \bar y_2 - 2 \bar y_1 \bar y_1 + \bar y_1^2 \\ &= \bar y_2 - \bar y_1^2, \end{align*}$$ which should be reminiscent of the variance formula $\operatorname{Var}[Y] = \operatorname{E}[Y^2] - \operatorname{E}[Y]^2$, except here we are dealing with a sample rather than expectations.
Then, if we also write $\hat \mu = \bar y_1$, we may write $$\hat \alpha = \frac{\hat \mu^2}{\hat \sigma^2}, \quad \hat \beta = \frac{\hat \sigma^2}{\hat \mu}.$$ It is here that we see that the expression $$\hat \alpha_{\text{MOM}} = \frac{\frac{1}{n}\sum_{i=1}^n (y_i - \bar y)^2}{n \bar y}$$ referenced in your question cannot possibly be correct. The only estimator that remotely resembles this is the one for $\beta$, not for $\alpha$.
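As a sanity check on the derivation, here is a quick numerical sketch; the true values $\alpha = 2$, $\beta = 3$ and the sample size are assumed purely for illustration:

```python
import numpy as np

# Simulate a sample with assumed true parameters alpha = 2 (shape), beta = 3 (scale)
rng = np.random.default_rng(42)
alpha_true, beta_true = 2.0, 3.0
y = rng.gamma(shape=alpha_true, scale=beta_true, size=100_000)

# Raw sample moments m_1' and m_2'
m1 = y.mean()
m2 = np.mean(y ** 2)

# Method-of-moments estimators derived above
alpha_hat = m1 ** 2 / (m2 - m1 ** 2)
beta_hat = m2 / m1 - m1

print(alpha_hat, beta_hat)  # should land near the true values 2 and 3
```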

It is also worth noting that if one performs the method-of-moments estimation on the central sample moments $\hat \mu$ and $\hat \sigma^2$, that is, solves the system $$\begin{align*} \hat \mu &= \alpha \beta, \\ \hat \sigma^2 &= \alpha \beta^2, \end{align*}$$ then we get $$\hat \alpha = \frac{\hat \mu^2}{\hat \sigma^2}, \quad \hat \beta = \frac{\hat \sigma^2}{\hat \mu},$$ which, unsurprisingly, are the same estimators we found by matching the raw moments. The only difference is that this route is easier to derive and use when an estimator in terms of the central sample moments is what is desired.
