Calculating the Method of Moments estimator


Consider a random sample $X_1,\ldots , X_n$, independent and identically distributed according to some $f_X(\cdot|\alpha)$, where:
$$f_X(x|\alpha)=\frac{\exp\{\alpha x-e^{\alpha}\}}{x!}$$
The parameter $\alpha \in \mathbb{R}$ is unknown, and $\mathbb{E}[X]=e^{\alpha}$.

My (possibly flawed) understanding of the method of moments is that we set the sample mean equal to the first moment, i.e. $\frac{1}{n}\sum_{i=1}^nX_i=\bar{X}=e^{\alpha}$, so our estimator is $\hat{\alpha}_{MM}=\ln(\bar{X})$.
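
Since $f_X$ here is the Poisson pmf with mean $e^{\alpha}$, the estimator is easy to try on simulated data. A minimal sketch, assuming NumPy (the true $\alpha$ and the sample size are arbitrary illustrative choices, not from the question):

```python
# Minimal sketch: f_X is the Poisson pmf with mean e^alpha, so we can
# simulate a sample and compute the method-of-moments estimate.
# NumPy, alpha = 0.7, and n = 1000 are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

alpha = 0.7                                  # true (unknown) parameter
n = 1000                                     # sample size
x = rng.poisson(lam=np.exp(alpha), size=n)   # X_1, ..., X_n

alpha_mm = np.log(x.mean())                  # MM estimate: ln(sample mean)
print(alpha_mm)                              # should land near 0.7
```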

I'm doubting myself because when I then examine the bias, which I define to be $\mathbb{E}[\hat{\alpha}_{MM}]-\alpha$, I end up with $\ln(\bar{X})-\alpha$, from which I can't seem to infer anything. Where is my understanding of this topic falling short?

EDIT 1: When analysing the bias I am assuming $\mathbb{E}[\hat{\alpha}]=\mathbb{E}[\ln(\bar{X})]=\ln(\bar{X})$, since once we have our random sample, $\bar{X}$ is just a number and $\mathbb{E}[c]=c$.
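
One way to probe this assumption is to average the estimator over many independent samples, since the expectation in the bias is taken over the sampling distribution of $\bar{X}$ rather than a single realised value. A minimal Monte Carlo sketch, again assuming NumPy (the values of $\alpha$, $n$, and the replication count are arbitrary choices):

```python
# Monte Carlo sketch of the bias E[ln(Xbar)] - alpha. Xbar is a random
# variable, so the expectation is approximated by averaging ln(Xbar) over
# many independent samples. alpha, n, and reps are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)

alpha, n, reps = 0.7, 50, 100_000
samples = rng.poisson(lam=np.exp(alpha), size=(reps, n))
estimates = np.log(samples.mean(axis=1))     # one MM estimate per sample

print(estimates.mean() - alpha)              # small negative number, not 0
```

In this sketch the average of $\ln(\bar{X})$ sits slightly below $\alpha$, which suggests the expectation cannot be replaced by a single realised $\ln(\bar{X})$.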

Best Answer

You are right.

If you know that

$$\mathbb{E}[X]=\mu=e^\alpha$$

this means that

$$\alpha=\log \mu$$

thus, using the Method of Moments, you get

$$\hat{\alpha}_{\text{MM}}=\log \overline{X}_n$$

As for the edit: $\overline{X}_n$ is a random variable, not a fixed number, so $\mathbb{E}[\log \overline{X}_n]$ is an expectation over its sampling distribution and does not simplify to $\log \overline{X}_n$. Since $\log$ is concave, Jensen's inequality gives $\mathbb{E}[\log \overline{X}_n]\le \log \mathbb{E}[\overline{X}_n]=\alpha$, so the estimator is biased downward in finite samples, though it is consistent because $\overline{X}_n\to e^{\alpha}$.
