[Math] Method of moments estimator for $\theta$

parameter-estimation, statistical-inference, statistics

Let $X_1, X_2, \ldots, X_n$ be a random sample from a discrete distribution with probability mass function given by
$$P(X=0)=\dfrac{1-\theta}{2}, \quad P(X=1)=\dfrac{1}{2}, \quad P(X=2)=\dfrac{\theta}{2}, \qquad 0\leq\theta\leq 1.$$

Find the method of moments estimator for $\theta$.

I calculated $E(X)= 0 \cdot\dfrac{1-\theta}{2} + 1 \cdot\dfrac{1}{2}+ 2\cdot\dfrac{\theta}{2}=\theta+\dfrac{1}{2}$

$\bar x=\theta+\dfrac{1}{2}$

$\bar x+\dfrac{1}{2}= {\hat\theta } $

My solution doesn't match up with my material. Did I make a mistake here? Could someone please tell me?

Best Answer

The method of moments consists of solving the system of equations obtained by equating empirical and theoretical moments. If we restrict ourselves to the first moment (the expected value), we indeed get $\Bbb E X_1 = \frac{1}{2} + \theta$, which we equate to the empirical mean $\overline{X}_n = \frac{1}{n}\sum_{i=1}^n X_i$. Thus, the following estimator $\hat\theta_n$ is obtained for $\theta$: $$ \hat\theta_n = \overline{X}_n - \frac{1}{2} = \frac{1}{n}\sum_{i=1}^n \left(X_i - \frac{1}{2}\right). $$ Note the minus sign: solving $\overline{X}_n = \theta + \frac{1}{2}$ for $\theta$ gives $\hat\theta_n = \overline{X}_n - \frac{1}{2}$, which is where your solution went wrong.

Using the law of large numbers, one shows that this estimator is consistent ($\hat\theta_n \to \theta$ in probability). Since the estimator is unbiased ($\Bbb E \hat\theta_n = \theta$), its quadratic risk equals its variance. From $\Bbb E X_1^2 = 0\cdot\frac{1-\theta}{2} + 1\cdot\frac{1}{2} + 4\cdot\frac{\theta}{2} = \frac{1}{2} + 2\theta$ we get $\operatorname{var} X_1 = \Bbb E X_1^2 - (\Bbb E X_1)^2 = -\theta^2 + \theta + \frac{1}{4}$, so $$\operatorname{var}\hat\theta_n = \frac{\operatorname{var} X_1}{n} = \frac{-\theta^2 + \theta + \frac{1}{4}}{n}.$$
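Not part of the original answer, but here is a minimal Monte Carlo sketch of the above (assuming NumPy and an arbitrarily chosen true value $\theta = 0.3$): it draws repeated samples from the given pmf, computes $\hat\theta_n = \overline{X}_n - \frac{1}{2}$, and checks that the estimates center on $\theta$ with variance close to $(-\theta^2+\theta+\frac14)/n$.

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true = 0.3   # assumed true parameter, chosen only for illustration
n = 10_000         # sample size per replication
reps = 2_000       # number of Monte Carlo replications

# pmf from the question: P(X=0)=(1-theta)/2, P(X=1)=1/2, P(X=2)=theta/2
probs = [(1 - theta_true) / 2, 1 / 2, theta_true / 2]
samples = rng.choice([0, 1, 2], size=(reps, n), p=probs)

# method of moments estimator: theta_hat = sample mean - 1/2
theta_hat = samples.mean(axis=1) - 0.5

print("mean of estimates:   ", theta_hat.mean())   # close to theta_true (unbiased)
print("empirical variance:  ", theta_hat.var())
print("theoretical variance:", (-theta_true**2 + theta_true + 0.25) / n)
```

Increasing `n` shrinks both variances toward zero, illustrating the consistency claimed above.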
