[Math] Method of moments estimator of $θ$ using a random sample from $X \sim U(0,θ)$

parameter-estimation, probability-theory, statistics, uniform-distribution

I have to find a method of moments estimator for $θ$ based on a random sample
$X_1,X_2,…,X_n$ from $X ∼ U(0,θ)$, where $θ > 0$.

My first step in the assignment was to note that $U(0,θ)$ is a continuous uniform distribution on an interval $(a,b)$, written $X ∼ U(a,b)$, here with $a = 0$ and $b = θ$.

Given a random sample $X_1,X_2,…,X_n$ from $X$, the method of moments equates the $k$-th theoretical moment with the corresponding sample moment:
$$
E(X^k) = \frac{1}{n}\sum_{i=1}^{n}X_i^k
$$
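For example, in my model $X ∼ U(0,θ)$ the $k$-th theoretical moment is
$$
E(X^k) = \int_0^{θ} x^k \cdot \frac{1}{θ}\,dx = \frac{θ^k}{k+1}
$$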
My distribution model has one unknown parameter $θ$, so I set up an equation for the first moment.
$$
E(X) = \mu = \frac{1}{n}\sum_{i=1}^{n}X_i = \bar{X_n}
$$
The expected value for a uniform distribution $X ∼ U(a,b)$ is
$$
E(X) = \frac{a+b}{2}
$$
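This follows from integrating against the density $\frac{1}{b-a}$ on $(a,b)$:
$$
E(X) = \int_a^b \frac{x}{b-a}\,dx = \frac{b^2-a^2}{2(b-a)} = \frac{a+b}{2}
$$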
Setting $a = 0$ and rearranging the first moment equation for $U(a,b)$, I can solve for $b = θ$.
$$
E(X) = \bar{X_n} = \frac{0+b}{2} \Rightarrow b = θ = 2 \bar{X_n}
$$
So the method of moments estimator $\hat{θ}$ for $X ∼ U(0,θ)$ should be $2 \bar{X_n}$.
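As a quick numerical sanity check, a minimal R sketch (the values of $θ$ and $n$ below are arbitrary choices of mine):

 th = 5;  n = 100     # arbitrary illustrative values
 x = runif(n, 0, th)  # simulated sample from U(0, theta)
 2*mean(x)            # method of moments estimate; should be near 5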

Is my solution correct or does anyone have corrections or feedback?
Thank you.


Edits 1–3:

I also have to determine whether the estimator $2\bar{X_n}$ is (1) unbiased and (2) consistent.

(1) An estimator $\hat{θ_n}$ for a parameter $θ$ based on a random sample of size $n$ is unbiased if
$$
E_θ (\hat{θ_n}) = θ
$$

So substituting $\hat{θ_n} = 2\bar{X_n}$ into the formula, I get
$$
E_θ(2\bar{X_n}) = E\Bigg(2 \cdot \frac{1}{n}\sum_{i=1}^{n}X_i \Bigg) =
\frac{2}{n}\sum_{i=1}^{n}E(X_i) =
\frac{2}{n} \cdot n \cdot E(X) = 2 \cdot E(X)
$$

Rearranging the formula for the expected value of $U(a,b)$ gives the same expression, since
$$
\frac{a+b}{2} = E(X) \Rightarrow b = 2 \cdot E(X) - a \Rightarrow b = 2 \cdot E(X) \quad (a = 0)
$$

For $U(0,θ)$ we have $E(X) = θ/2$, so $E_θ (\hat{θ_n}) = 2 \cdot E(X) = θ$ is fulfilled for $\hat{θ_n} = 2\bar{X_n}$. Hence $\hat{θ_n}$ is unbiased.

(2) The estimator $\hat{θ}$ is consistent for $θ$, if
$$
\hat{θ_n} \xrightarrow{P} θ \quad \text{for} \quad n \rightarrow \infty
$$

We can use convergence in probability, as in the law of large numbers: if $\{X_n\}$ is a sequence of random variables and $X$ is another random variable, then $X_n$ converges in probability to $X$ if, for every $\epsilon > 0$,
$$
\lim\limits_{n \to \infty}P(|X_n-X| \ge \epsilon) = 0
$$
We set $X_n = \hat{θ_n}$ and $X = θ$. We want to show that
$$
\lim\limits_{n \to \infty}P(|\hat{θ_n}-θ| \ge \epsilon) = 0
$$
$$
P(|\hat{θ_n}-θ| \ge \epsilon) = P(|2\bar{X_n}-2 \cdot E(X)| \ge \epsilon)=
$$
$$
P\Bigg(\Big|2 \cdot \Bigg(\frac{1}{n}\sum_{i=1}^{n}E(X_i)\Bigg) - 2 \cdot E(X)\Big| \ge \epsilon \Bigg) =
P\Bigg(\Big|\frac{2}{n} \cdot n \cdot E(X_i) - 2 \cdot E(X)\Big| \ge \epsilon \Bigg)=
$$
$$
P(|2 \cdot E(X) - 2 \cdot E(X)| \ge \epsilon) = P(|0| \ge \epsilon)
$$
So we get for $\epsilon > 0$
$$
P(|0| \ge \epsilon) = 0
$$
The probability is $0$ because $|0| = 0$ is never greater than or equal to a positive number $\epsilon$.
Therefore the estimator is consistent.

I am quite unsure whether my proof of consistency is right.
Can anybody review my approach, please?

Best Answer

The method of moments estimator is $\hat \theta_n = 2\bar X_n,$ and it is unbiased. It has a finite variance (which decreases with increasing $n$) and so it is also consistent; that is, it converges in probability to $\theta.$

I have not checked every step of your proof of consistency, but it is inelegant and incorrect: in the second line the random variables $X_i$ are replaced by their expectations $E(X_i)$, so all randomness disappears and the probability becomes a trivial statement about constants. You should be able to use a straightforward application of Chebyshev's inequality to show that $\lim_{n \rightarrow \infty}P(|\hat \theta_n - \theta| <\epsilon) = 1.$
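In outline, that argument runs as follows: $\hat \theta_n$ is unbiased with $\operatorname{Var}(\hat \theta_n) = \operatorname{Var}(2\bar X_n) = \frac{4}{n}\operatorname{Var}(X) = \frac{4}{n}\cdot\frac{\theta^2}{12} = \frac{\theta^2}{3n},$ so Chebyshev's inequality gives
$$
P(|\hat \theta_n - \theta| \ge \epsilon) \le \frac{\operatorname{Var}(\hat \theta_n)}{\epsilon^2} = \frac{\theta^2}{3n\epsilon^2} \rightarrow 0 \quad \text{as } n \rightarrow \infty.
$$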

However, $\hat \theta_n$ does not have the minimum variance among unbiased estimators. The maximum likelihood estimator is the maximum of the $n$ values $X_i$ (often denoted $X_{(n)}).$ The estimator $T = cX_{(n)},$ where $c$ is a constant depending on $n,$ is unbiased and has minimum variance among unbiased estimators; it is the uniformly minimum variance unbiased estimator (UMVUE).
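In outline: for a sample from $U(0,\theta)$ the maximum has CDF $P(X_{(n)} \le x) = (x/\theta)^n$ for $0 \le x \le \theta,$ hence density $nx^{n-1}/\theta^n,$ and
$$
E(X_{(n)}) = \int_0^{\theta} x\,\frac{nx^{n-1}}{\theta^n}\,dx = \frac{n}{n+1}\,\theta,
$$
so $c = (n+1)/n$ makes $T$ unbiased; this is the constant used in the code below.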

Both estimators are illustrated below for $n = 10$ and $\theta = 5$ by simulation in R statistical software. With 100,000 iterations, the means and variances should be accurate to about two decimal places. They are not difficult to find analytically.

 m = 10^5;  n = 10;  th = 5       # iterations, sample size, true theta
 x = runif(m*n, 0, th)            # m*n draws from U(0, theta)
 DTA = matrix(x, nrow=m)          # m x n matrix, each row a sample of 10
 a = rowMeans(DTA)                # vector of m sample means
 w = apply(DTA, 1, max)           # vector of m maximums
 MM = 2*a;  UMVUE = ((n+1)/n)*w   # the two estimators
 mean(MM);  var(MM)
 ## 5.003658    # consistent with unbiasedness of MM
 ## 0.8341769   # relatively large variance
 mean(UMVUE); var(UMVUE)
 ## 5.002337    # consistent with unbiasedness of UMVUE
 ## 0.207824    # relatively small variance
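
For reference, the analytic variances are $\operatorname{Var}(2\bar X_n) = \theta^2/(3n) = 25/30 \approx 0.833$ and $\operatorname{Var}(T) = \theta^2/(n(n+2)) = 25/120 \approx 0.208,$ in good agreement with the simulated values above.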

The histograms below illustrate the larger variance of the method of moments estimator.

[Histograms of the simulated sampling distributions of the two estimators]
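
Code along the following lines would reproduce such histograms (a sketch, not the original plotting code; it assumes the objects MM, UMVUE, and th created in the simulation above):

 par(mfrow=c(1,2))                # two panels side by side
 hist(MM, prob=TRUE, col="skyblue2",
      main="Method of Moments", xlab="estimate of theta")
 abline(v=th, col="red", lwd=2)   # true value theta = 5
 hist(UMVUE, prob=TRUE, col="skyblue2",
      main="UMVUE", xlab="estimate of theta")
 abline(v=th, col="red", lwd=2)
 par(mfrow=c(1,1))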