Method of Moments estimation

probability, statistics

I am sorry in advance if this question seems low effort, but I really do not know how to solve this problem; my understanding of method of moments estimation is weak. Here is the definition of method of moments estimation in my book:

Let $\{X_1,X_2,…,X_n\}$ be a random sample from a population $F(x;\theta)$. Suppose $\theta$ has $p$ components (for example, for a normal population $N(\mu, \sigma^2)$, $p=2$; for a Poisson population with parameter $\lambda$, $p=1$).

Let $$\mu_k=\mu_k(\theta)=E(X^k)$$

denote the $k$th population moment, for $k=1,2,…$ Therefore, $\mu_k$ depends on the unknown parameter $\theta$, as everything else about the distribution $F(x;\theta)$ is known.

Denote the $k$th sample moment by:

$$M_k=\frac{1}{n}\sum^n_{i=1}X_i^k=\frac{X_1^k+X_2^k+…+X_n^k}{n}.$$

The MM estimator (MME) $\hat{\theta}$ of $\theta$ is the solution of the $p$ equations $$\mu_k(\hat{\theta})=M_k \text{ for } k=1,2,…,p$$
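
To see the definition in action in the simplest case, here is a small simulation sketch for the Poisson example above ($p=1$): since $\mu_1(\lambda)=E(X)=\lambda$, the single equation $\mu_1(\hat{\lambda})=M_1$ gives $\hat{\lambda}=\overline{X}$. (The sampler below uses Knuth's product-of-uniforms method; the function name `poisson_sample` is my own, not from the book.)

```python
import math
import random

def poisson_sample(lam, n, rng):
    """Draw n Poisson(lam) variates via Knuth's product-of-uniforms method."""
    out = []
    threshold = math.exp(-lam)
    for _ in range(n):
        k, p = 0, 1.0
        # Multiply uniforms until the product drops below e^{-lam}.
        while p > threshold:
            k += 1
            p *= rng.random()
        out.append(k - 1)
    return out

rng = random.Random(1)
xs = poisson_sample(4.0, 50_000, rng)

# Method of moments: matching mu_1(lambda) = lambda to M_1 = sample mean.
lam_hat = sum(xs) / len(xs)
print(lam_hat)  # close to the true lambda = 4.0
```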


So my problem I am struggling to solve is the following:

An economist decides to model the distribution of income in a country with the probability density function: $$f_X(x;\alpha,k)=\frac{\alpha k^{\alpha}}{x^{\alpha+1}} \text{ for } x\geq k$$

and $0$ otherwise, where $k>0$ and $\alpha >2$. Let $\{X_1,X_2,…,X_n\}$ be a random sample of size $n$ from this distribution. You may use the fact that:

$$E(X)=\frac{\alpha k}{\alpha-1} \text{ and } E(X^2)=\frac{\alpha k^2}{\alpha-2}$$

Show that the method of moments estimators of $\alpha$ and $k$ are the solutions of:

$$\frac{1}{\hat{\alpha}(\hat{\alpha}-2)}=\left(\frac{n-1}{n}\right)\frac{S^2}{\overline{X}^2} \text{ and } \hat{k}=\frac{(\hat{\alpha}-1)\overline{X}}{\hat{\alpha}}$$

where: $$\overline{X}=\frac{1}{n}\sum^n_{i=1}X_i\text{ and } S^2=\frac{1}{n-1}\sum^n_{i=1}(X_i-\overline{X})^2$$


My attempt to solve this:

We have that $\mu_1(\alpha,k)=\frac{\alpha k}{\alpha-1}$ and $\mu_2(\alpha,k)=\frac{\alpha k^2}{\alpha-2}$.

So we have to solve the following equations:

$$\mu_1(\alpha,k)=M_1 \text{ and } \mu_2(\alpha,k)=M_2$$

We have that $$\mu_1(\alpha,k)=M_1\Rightarrow \frac{\alpha k }{\alpha-1}=\overline{X}$$

Now,

$$\mu_2(\alpha,k)=\frac{1}{n}\sum^n_{i=1}X^2_i\Rightarrow \frac{\alpha k^2}{\alpha-2}=\frac{1}{n}\sum^n_{i=1}X^2_i$$

So from the first equation we have that $$\frac{\alpha k}{\overline{X}}-1=\alpha-2$$

Plugging $\alpha-2$ into the second equation we get $$\frac{\alpha k^2}{\frac{\alpha k}{\overline{X}}}=\frac{1}{n}\sum^n_{i=1}X^2_i\Leftrightarrow k\overline{X}=\frac{1}{n}\sum^n_{i=1}X^2_i\Leftrightarrow \hat{k}=\frac{1}{n\overline{X}}\sum^n_{i=1}X^2_i$$

Now I'd like to find $\hat{\alpha}$, so we plug $\hat{k}$ in for $k$ (in the equation $\mu_1(\alpha,k)=M_1$):

$$\frac{\alpha k}{\alpha -1}=\overline{X}\Leftrightarrow \frac{\alpha\frac{1}{n\overline{X}}\sum^n_{i=1}X^2_i}{\alpha-1}=\overline{X}\Leftrightarrow \frac{\alpha \sum X^2_i}{n\overline{X}(\alpha-1)}=\overline{X}\Leftrightarrow \alpha = \frac{(\alpha-1)n\overline{X}^2}{\sum X_i^2}\text{, this is equation (1)}$$

Now we plug in $\hat{k}$ in the equation $\mu_2(\alpha,k)=M_2$:

$$\frac{\alpha k^2}{\alpha-2}=\sum X_i^2\Leftrightarrow \frac{\alpha\frac{1}{\left(n\overline{X}\right)^2}\left(\sum X^2_i\right)^2}{\alpha-2}=\sum X^2_i \Leftrightarrow \alpha = \frac{\left(n\overline{X}\right)^2(\alpha-2)}{\sum X^2_i} \text{, this is equation (2) for } \alpha$$

Equating the 2 equations of $\alpha$ and solving for $\alpha$:

$$\frac{\left(n\overline{X}\right)^2(\alpha-2)}{\sum X_i^2}=\frac{(\alpha-1)n \overline{X}^2}{\sum X^2_i}$$

$$\Leftrightarrow \left(n\overline{X}\right)^2(\alpha-2)=n\overline{X}^2(\alpha-1)\Leftrightarrow $$

$$\alpha\left(\left(n\overline{X}\right)^2-n\overline{X}^2\right)=2\left(n\overline{X}\right)^2-n\overline{X}^2\Leftrightarrow \hat{\alpha}=\frac{2n-1}{n-1}$$

Well now that I have these, I don't know how they are the solutions of

$$\frac{1}{\hat{\alpha}(\hat{\alpha}-2)}=\left(\frac{n-1}{n}\right)\frac{S^2}{\overline{X}^2} \text{ and } \hat{k}=\frac{(\hat{\alpha}-1)\overline{X}}{\hat{\alpha}}$$

Did I solve for the estimators correctly? How do I go on from here?

Best Answer

I did not do all the calculations, because it is only a matter of solving an algebraic system, but I will explain how to proceed...

To calculate the MoM estimators, the first thing you have to do is express the parameters in terms of the population moments.

I write $\mu_i$ for the $i$th population moment (and plain $\mu$ for the first, $\mu=\mu_1$).

Thus you start with

$$ \begin{cases} \mu=\frac{\alpha k}{\alpha-1}\\ \mu_2=\frac{\alpha k^2}{\alpha-2} \end{cases}\rightarrow\begin{cases} (\alpha-1)\mu=\alpha k\\ (\alpha-2)\mu_2=\alpha k^2 \end{cases}$$

Immediately from the first equation you get

$$k=\frac{\alpha-1}{\alpha}\mu$$

which immediately shows you the first solution: the estimator of $k$ is a function of the first moment and the other parameter. Now all you have to do is substitute the first population moment with the empirical one (the sample mean) and find the estimator of the other parameter in the same way.

Thus your solution is

$$\hat{k}_{MoM}=\frac{\hat{\alpha}-1}{\hat{\alpha}}\overline{X}_n$$

The other solution is a bit more complicated because of the algebra involved, but the method is the same... express $\alpha$ in terms of $\mu$ and $\mu_2$, observing that when you find

$$\mu_2-\mu^2$$

which is the population variance, you substitute that expression with $\frac{1}{n}\sum_i X_i^2-\left(\frac{1}{n}\sum_i X_i\right)^2=\frac{1}{n}\sum_i[X_i-\overline{X}]^2=S_B^2$
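
That substitution rests on the standard decomposition of the biased sample variance, which is worth writing out once:

$$\frac{1}{n}\sum_{i=1}^n (X_i-\overline{X})^2=\frac{1}{n}\sum_{i=1}^n X_i^2-2\overline{X}\cdot\frac{1}{n}\sum_{i=1}^n X_i+\overline{X}^2=\frac{1}{n}\sum_{i=1}^n X_i^2-\overline{X}^2.$$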


Just FYI, your density is a known one: the Pareto distribution. Reading up on it, you will surely find a lot of useful information that may help you.


EDIT: In order to find the second estimator as well, start with the original system

$$\begin{cases} k=\frac{\alpha-1}{\alpha}\mu\\ k^2=\frac{\alpha-2}{\alpha}\mu_2 \end{cases}$$

That is

$$(\alpha-1)^2\mu^2=\alpha(\alpha-2)\mu_2$$
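
For reference, the expansion and grouping run as follows (note that $\mu_2-\mu^2>0$, being the population variance):

$$\alpha^2\mu^2-2\alpha\mu^2+\mu^2=\alpha^2\mu_2-2\alpha\mu_2$$

$$\Rightarrow\ \mu^2=\alpha(\alpha-2)\mu_2-\alpha(\alpha-2)\mu^2=\alpha(\alpha-2)(\mu_2-\mu^2)\ \Rightarrow\ \alpha(\alpha-2)=\frac{\mu^2}{\mu_2-\mu^2}$$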

Expand the expressions, group them, and in a few basic algebraic steps you will get

$$\hat{\alpha}(\hat{\alpha}-2)=\frac{\overline{X}^2}{S_B^2}$$

which is exactly the requested result, with the only difference that I used the biased variance estimator $S_B^2=\frac{1}{n}\sum_i [X_i-\overline{X}]^2$. Since $S_B^2=\frac{n-1}{n}S^2$, taking reciprocals gives precisely $$\frac{1}{\hat{\alpha}(\hat{\alpha}-2)}=\frac{S_B^2}{\overline{X}^2}=\left(\frac{n-1}{n}\right)\frac{S^2}{\overline{X}^2}.$$
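
As a numerical sanity check of the two estimating equations (my own sketch, not part of the exercise: I sample via the inverse CDF $X=kU^{-1/\alpha}$ and take the root $\hat\alpha>2$ of the quadratic $\hat\alpha^2-2\hat\alpha-\overline{X}^2/S_B^2=0$):

```python
import math
import random

def pareto_mom(xs):
    """Method-of-moments estimates (alpha_hat, k_hat) for a Pareto(alpha, k) sample.

    Solves alpha_hat * (alpha_hat - 2) = xbar**2 / s_b2, where s_b2 is the
    biased (1/n) sample variance, taking the root with alpha_hat > 2;
    then k_hat = (alpha_hat - 1) * xbar / alpha_hat.
    """
    n = len(xs)
    xbar = sum(xs) / n
    s_b2 = sum((x - xbar) ** 2 for x in xs) / n  # biased variance S_B^2
    c = xbar ** 2 / s_b2
    alpha_hat = 1 + math.sqrt(1 + c)  # positive root of a^2 - 2a - c = 0
    k_hat = (alpha_hat - 1) * xbar / alpha_hat
    return alpha_hat, k_hat

# Simulate: if U ~ Uniform(0,1), then X = k * U**(-1/alpha) is Pareto(alpha, k).
rng = random.Random(0)
alpha_true, k_true = 5.0, 2.0
xs = [k_true * rng.random() ** (-1 / alpha_true) for _ in range(200_000)]
alpha_hat, k_hat = pareto_mom(xs)
print(alpha_hat, k_hat)  # should land near 5.0 and 2.0
```

I deliberately chose $\alpha_{\text{true}}=5$ so that the fourth moment exists and the sample variance concentrates quickly; for $\alpha$ close to $2$ the heavy tail makes the variance equation very noisy.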
