Calculating MSE of MOM and MLE of a Uniform Distribution

maximum-likelihood, probability-theory, statistical-inference, statistics

Let $X_1, X_2, X_3$ be a random sample of size three from a $\mathrm{Uniform}(\theta, 2\theta)$ distribution, where $\theta > 0$.

I solved to get $\tilde{\theta}_{MoM}$ to be $\frac{2\bar{x}}{3}$.

Also, I got $\hat{\theta}_{MLE}$ to be $\frac{X_{(3)}}{2}$, where $X_{(3)}$ is the sample maximum.

For part (c), I am trying to find the MSE of both the MoM and MLE estimators. I used the formula $MSE(\hat{\theta}) = Var(\hat{\theta}) + (Bias(\hat{\theta}))^2$.

For the MoM estimator, I found $MSE(\tilde{\theta}_{MoM})$ to be $\frac{\theta^2}{27}$, but for the MLE I think I first have to find the distribution of $X_{(3)}$ before I can compute its expected value and variance.

Can someone confirm whether my MSE for the MoM estimator is correct, and guide me through finding the MSE of the MLE estimator?

Best Answer

For your method of moments estimator $\frac23 \bar X$:

Each $\operatorname{Var}(X_i)=\frac{\theta^2}{12}$, so $\operatorname{Var}(\bar X)=\frac{1}{3^2}\cdot 3\operatorname{Var}(X_i)= \frac{\theta^2}{36}$, and hence $\operatorname{Var}\left(\frac23 \bar X\right)=\frac49 \cdot \frac{\theta^2}{36}= \frac{\theta^2}{81}$,

and since $\frac23 \bar X$ is unbiased, its mean squared error is $\frac{\theta^2}{81}$, not the $\frac{\theta^2}{27}$ you found.
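As a sanity check, a short Monte Carlo simulation should reproduce $\frac{\theta^2}{81}$ (the value $\theta = 2$ and the seed below are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 2.0          # arbitrary illustrative value of theta
n_trials = 200_000

# 200,000 samples of size 3 from Uniform(theta, 2*theta)
x = rng.uniform(theta, 2 * theta, size=(n_trials, 3))
mom = (2 / 3) * x.mean(axis=1)          # method-of-moments estimator 2*xbar/3

mse = np.mean((mom - theta) ** 2)
print(mse, theta**2 / 81)               # the two values should be close
```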


For your maximum likelihood estimator $\frac {X_{(3)}}{2}$:

The probability that $X_{(3)} \le x$ is the probability that all of $X_1,X_2,X_3 \le x$, which is $\frac{(x-\theta)^3}{\theta^3}$ for $\theta \le x \le 2\theta$; differentiating, the density of $X_{(3)}$ is $\frac{3(x-\theta)^2}{\theta^3}$ on $(\theta, 2\theta)$.
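A quick symbolic check of this density: it should integrate to $1$ over $(\theta, 2\theta)$, and its mean, $\frac{7\theta}{4}$, is also what the bias calculation below rests on. A sketch with SymPy:

```python
import sympy as sp

theta, x = sp.symbols('theta x', positive=True)
pdf = 3 * (x - theta) ** 2 / theta**3   # density of X_(3) on (theta, 2*theta)

total = sp.integrate(pdf, (x, theta, 2 * theta))       # should be 1
mean = sp.integrate(x * pdf, (x, theta, 2 * theta))    # should be 7*theta/4
print(total, sp.simplify(mean))
```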

With that you can calculate the mean squared error directly: $E\left[\left(\frac {X_{(3)}}{2}-\theta\right)^2\right]=\int\limits_\theta^{2\theta} 3\frac{(x-\theta)^2}{\theta^3}\left(\frac x2-\theta\right)^2\,dx = \frac{\theta^2}{40}.$
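A simulation sketch confirming the $\frac{\theta^2}{40}$ value (again with an arbitrary illustrative $\theta = 2$):

```python
import numpy as np

rng = np.random.default_rng(1)
theta = 2.0          # arbitrary illustrative value of theta
n_trials = 200_000

x = rng.uniform(theta, 2 * theta, size=(n_trials, 3))
mle = x.max(axis=1) / 2                 # MLE: X_(3) / 2
mse = np.mean((mle - theta) ** 2)
print(mse, theta**2 / 40)               # the two values should be close
```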

If you wanted to split it into variance and squared bias, it would be $\frac{3 \theta^2}{320} + \frac{\theta^2}{64}= \frac{\theta^2}{40}$. By comparison, $\frac47 X_{(3)}$ is an unbiased estimator of $\theta$ whose variance, and hence mean squared error, is $\frac{3 \theta^2}{245}$, slightly below that of your method of moments estimator, while $\frac{35}{62} X_{(3)}$ has an even lower mean squared error of $\frac{3 \theta^2}{248}$.
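The three rescalings of $X_{(3)}$ can be compared in one simulation; the constants and exact MSEs below are the ones stated above, and $\theta = 2$ is again an arbitrary choice:

```python
import numpy as np

rng = np.random.default_rng(2)
theta = 2.0          # arbitrary illustrative value of theta
x = rng.uniform(theta, 2 * theta, size=(500_000, 3))
x3 = x.max(axis=1)   # sample maximum X_(3)

results = {}
for c, exact in [(1 / 2, theta**2 / 40),          # the MLE itself
                 (4 / 7, 3 * theta**2 / 245),     # unbiased rescaling
                 (35 / 62, 3 * theta**2 / 248)]:  # minimum-MSE rescaling
    mse = np.mean((c * x3 - theta) ** 2)
    results[c] = (mse, exact)
    print(f"c = {c:.4f}: simulated MSE {mse:.5f}, exact {exact:.5f}")
```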


Whether some other combination of $X_{(1)},X_{(2)},X_{(3)}$ could do better than this would be a longer exercise. It turns out that with $\hat \theta = \frac5{12}X_{(3)} + \frac5{24}X_{(1)}$ $($very slightly biased, as its expectation is $\frac{95}{96}\theta)$ you can reduce the mean squared error to $\frac{\theta^2}{96}.$ But even this is not optimal: whenever $X_{(3)} > \frac{19}{10}X_{(1)}$ it gives $\hat \theta > X_{(1)}$, which is implausible since every observation is at least $\theta$, so any plausible estimate must satisfy $\hat\theta \le X_{(1)}$. I came up with the complicated $ \frac{5 X_{(3)} X_{(1)} (2 X_{(1)}+X_{(3)}) (4 X_{(1)}^2+X_{(3)}^2)}{4 (16 X_{(1)}^4+8 X_{(3)} X_{(1)}^3+4 X_{(3)}^2 X_{(1)}^2+2 X_{(3)}^3 X_{(1)}+X_{(3)}^4)}$, which performed even better; a reasonably close approximation to it is $\frac{1}{868} \left(736 X_{(3)} -80 X_{(1)} -131 \frac{X_{(3)}^2}{X_{(1)}}\right)$, both with mean squared error less than $\frac{\theta^2}{96.935}$.
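A check of the linear combination $\frac5{12}X_{(3)} + \frac5{24}X_{(1)}$ ($\theta = 2$ again an arbitrary illustrative value): its simulated MSE should land near $\frac{\theta^2}{96}$ and its mean near $\frac{95}{96}\theta$.

```python
import numpy as np

rng = np.random.default_rng(3)
theta = 2.0          # arbitrary illustrative value of theta
x = rng.uniform(theta, 2 * theta, size=(500_000, 3))
x1, x3 = x.min(axis=1), x.max(axis=1)   # X_(1) and X_(3)

est = 5 / 12 * x3 + 5 / 24 * x1         # the slightly biased combination
mse = np.mean((est - theta) ** 2)
print(mse, theta**2 / 96)               # simulated vs. exact MSE
print(est.mean(), 95 / 96 * theta)      # expectation should be near 95*theta/96
```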
