Solved – Does the second-moment estimator of the uniform distribution parameter have the same properties as the first-moment estimator?

method of moments

For independent and identically distributed samples $y_1,\dots,y_m$, where $Y$ is uniformly distributed on $[0,\theta]$ with $0 \lt \theta \lt \infty$, finding the method of moments estimator for $\theta$ is very straightforward using the first moment, $E[Y]$, and its natural estimator.
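For reference, the first-moment estimator alluded to here (a standard result, spelled out only for comparison) follows from $E[Y] = \theta/2$: $$\hat{\theta}_{1} = \frac{2}{m}\sum_{i=1}^{m}y_i$$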

I am interested in knowing whether it is possible to use the natural estimator for the second moment, $\frac{1}{m}\sum_{i=1}^{m}y_i^2$, to arrive at an estimate of $\theta$, and whether or not its properties are the same as those of the estimator based on the first moment.

Since $Var(Y) = \frac{\theta^2}{12} = E[Y^2] - E^2[Y] = E[Y^2] - \frac{\theta^2}{4}$, we have that $E[Y^2] = \frac{\theta^2}{3}$.

Then, using method-of-moments theory and the known support for $\theta$, the invertible function mapping $E[Y^2]$ to $\theta$ is $\sqrt{3E[Y^2]}$. Because we do not know the exact value of $E[Y^2]$, we use its natural estimator given above to produce the estimate. $$\hat{\theta} = \sqrt{\frac{3}{m}\sum_{i=1}^{m}y_i^2}$$
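As a quick numerical sanity check (my own sketch, with a hypothetical true value $\theta = 5$ and sample size $m = 100$), the estimator can be computed from simulated data:

```python
import numpy as np

# Sketch of the second-moment estimator (illustrative values only):
# draw an i.i.d. Uniform(0, theta) sample and compute
# theta_hat = sqrt(3 * mean(y_i^2)).
rng = np.random.default_rng(0)
theta_true = 5.0   # hypothetical true parameter
m = 100            # hypothetical sample size

y = rng.uniform(0.0, theta_true, size=m)
theta_hat = np.sqrt(3.0 * np.mean(y**2))
print(theta_hat)   # should land near theta_true for moderately large m
```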

I would also like to show whether or not this estimator is biased, but I am not sure if the way I would like to go about it is correct. I have that $$E[\hat{\theta}] = \sqrt{\frac{3}{m}}\, E\left[\sqrt{\sum_{i=1}^{m}y_i^2}\right]$$

and, if the expectation of the square root could be evaluated in the obvious way, this would give $$\sqrt{\frac{3}{m}}E\left[\sqrt{\sum_{i=1}^{m}y_i^2}\right] = \sqrt{\frac{3}{m}}\sqrt{\frac{m\theta^2}{3}} = \theta$$

In particular, I am unsure whether the following equality holds, which I see as the only way the estimator could be unbiased.
$$E\left[\sqrt{\sum_{i=1}^{m}y_i^2}\right] = \sqrt{\frac{m\theta^2}{3}}$$
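One way to probe that equality, without settling it analytically, is a small Monte Carlo experiment (my own sketch, with hypothetical values $\theta = 5$, $m = 10$, and $2\times10^5$ replications); if the estimator were unbiased, the average of $\hat{\theta}$ across replications should sit right at $\theta$:

```python
import numpy as np

# Monte Carlo sketch (illustrative parameter choices) to check whether
# the average of theta_hat = sqrt((3/m) * sum(y_i^2)) matches theta,
# i.e. whether E[ sqrt(sum y_i^2) ] equals sqrt(m * theta^2 / 3).
rng = np.random.default_rng(1)
theta_true, m, reps = 5.0, 10, 200_000

y = rng.uniform(0.0, theta_true, size=(reps, m))
theta_hat = np.sqrt(3.0 * np.mean(y**2, axis=1))

print(np.mean(theta_hat))   # empirical E[theta_hat]; compare against theta_true
```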

Best Answer

There's a particular result known as Jensen's inequality, which relates $E(g(X))$ to $g(E(X))$ (MathWorld, Wikipedia).

It comes in two flavors, one for convex functions and one for concave functions. Equality holds only when the variance of the random variable inside the expectation is 0.

You can use it to show that your estimator there must be biased (and in which direction).
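To sketch how that plays out here (my own fill-in of the hint, not part of the original answer): the square root is concave, so with $T = \sum_{i=1}^{m}y_i^2$, Jensen's inequality gives $$E\left[\sqrt{T}\right] \leq \sqrt{E[T]} = \sqrt{\frac{m\theta^2}{3}},$$ with equality only if $T$ has zero variance.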

Alternatively, can you show that $E[X^2]-E[X]^2\geq 0$?

(can you figure out when it's 0?)

Can you then see a way to show that if $E[Y^2]=\theta^2\!/3$, then the estimator you consider must be biased?
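A sketch of how those last two hints connect (again my own fill-in, not part of the original answer): taking $X = \hat{\theta}$ in $E[X^2]-E[X]^2 = Var(X) \geq 0$ gives $$E[\hat{\theta}]^2 \leq E[\hat{\theta}^2] = \frac{3}{m}\sum_{i=1}^{m}E[y_i^2] = \theta^2,$$ with equality only when $Var(\hat{\theta}) = 0$.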