I'm having a little trouble with this question which I got from my statistics tutorial.
If $Y_1, Y_2, …, Y_n$ is a random sample from a population Y having a probability density function given by:
$$f(y : \alpha, \beta) =
\begin{cases}
\alpha\beta^\alpha y^{-(\alpha + 1)}, & \text{if } y > \beta \\
0, & \text{otherwise}
\end{cases}$$
where $\alpha \gt 0$ and $\beta$ is assumed to be known. Then, a method of moments estimator for $\alpha$ is given by:
a) $\widehat{\alpha} = \dfrac{M_1}{M_1 - \beta}$, where $M_1 = \frac{1}{n} \sum_{i=1}^n Y_i$
b) $\widehat{\alpha} = \dfrac{M_1}{\beta - M_1}$, where $M_1 = \frac{1}{n} \sum_{i=1}^n Y_i$
c) $\widehat{\alpha} = \dfrac{2M_2}{M_2 + \beta^2}$, where $M_2 = \frac{1}{n} \sum_{i=1}^n Y_i^2$
d) None of the above
My attempt: I've referred to my lecture notes, but I'm not sure how to approach this question.
Thanks for your help!
Best Answer
Calculate the first moment of the distribution. Since $f$ vanishes for $y \le \beta$, the integral runs from $\beta$:
$$ \mathbb{E}[Y] = \int_{\beta}^{+\infty} y\,f(y;\alpha,\beta)\,\mathrm{d}y = \alpha\beta^{\alpha}\int_{\beta}^{+\infty} y^{-\alpha}\,\mathrm{d}y = \frac{\alpha\beta}{\alpha - 1}, \qquad \alpha > 1 $$
The method of moments equates the first sample moment with the first population moment:
$$ M_1 = \frac{1}{n}\sum_{i=1}^n Y_i = \mathbb{E}[Y] = \frac{\alpha\beta}{\alpha - 1} $$
Solving for $\alpha$: multiplying through gives $M_1(\alpha - 1) = \alpha\beta$, so $\alpha(M_1 - \beta) = M_1$, and therefore
$$ \hat{\alpha} = \frac{M_1}{M_1 - \beta} $$
which is option (a).
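As a quick sanity check (my own sketch, not part of the original answer), you can simulate a sample from this Pareto-type density via the inverse-CDF method and verify that $\hat{\alpha} = M_1/(M_1 - \beta)$ recovers the true $\alpha$; the parameter values below are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, beta, n = 3.0, 2.0, 200_000  # true parameters (arbitrary, alpha > 1)

# Inverse-CDF method: if U ~ Uniform(0, 1), then Y = beta * U**(-1/alpha)
# has density alpha * beta**alpha * y**(-(alpha + 1)) for y > beta.
y = beta * rng.uniform(size=n) ** (-1.0 / alpha)

m1 = y.mean()                  # sample first moment M_1
alpha_hat = m1 / (m1 - beta)   # method-of-moments estimator, option (a)
print(alpha_hat)
```

With a large sample the estimate lands close to the true $\alpha = 3$; the estimator is consistent (though biased in small samples), so the approximation tightens as $n$ grows.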