[Math] Obtaining Consistent Estimators Based on Uniform Distribution

estimation, estimation-theory, parameter-estimation

Let $X_1, X_2, \dots, X_n$ be a random sample distributed according to the uniform distribution $\mathcal{U}(\theta,\theta+1)$.

Let $U$ be the sample maximum and $V$ the sample minimum.

Which of the following is a consistent estimator of $\theta$?

  1. $U$
  2. $V$
  3. $2U-V-2$
  4. $2V-U+1$

I have studied that, for a distribution supported on an interval $[a,b]$, the maximum of the samples is consistent for $b$ and the minimum of the samples is consistent for $a$. Following this result, the first and second options seem true.

Also, continuous functions of consistent estimators are consistent, so the third and fourth options are also consistent. I therefore conclude that all of the options given here are consistent. I don't know whether my rationale is correct; please tell me if I am doing anything incorrect here. Thanks.

Best Answer

$U$ is consistent for $\theta + 1$ (it is the sample maximum); therefore $U$ is not consistent for $\theta$, but $U-1$ is.
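To see this, note that $U \le \theta+1$ always, so for $0 < \varepsilon < 1$ the tail probability can be computed directly from the uniform CDF (a quick sketch):

$$P\bigl(|U-(\theta+1)|>\varepsilon\bigr)=P(U<\theta+1-\varepsilon)=\prod_{i=1}^{n}P(X_i<\theta+1-\varepsilon)=(1-\varepsilon)^{n}\xrightarrow[n\to\infty]{}0,$$

so $U \xrightarrow{P} \theta+1$.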

$V$ is consistent for $\theta$ (it is the sample minimum).
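Similarly, $V \ge \theta$ always, so for $0 < \varepsilon < 1$,

$$P\bigl(|V-\theta|>\varepsilon\bigr)=P(V>\theta+\varepsilon)=\prod_{i=1}^{n}P(X_i>\theta+\varepsilon)=(1-\varepsilon)^{n}\xrightarrow[n\to\infty]{}0,$$

so $V \xrightarrow{P} \theta$.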

$2U-V-2$ converges in probability to $2(\theta+1)-\theta-2 = \theta$, so it is consistent for $\theta$.

$2V-U+1$ converges in probability to $2\theta-(\theta+1)+1 = \theta$, so it is consistent for $\theta$.

(where the continuous-mapping theorem was used for the last two conclusions)
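As a numerical sanity check, here is a small Monte Carlo sketch in Python/NumPy (the true value $\theta = 3$ and the sample sizes are arbitrary choices for illustration). It shows $V$, $2U-V-2$ and $2V-U+1$ settling near $\theta$ while $U$ settles near $\theta+1$:

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 3.0  # arbitrary true parameter, chosen only for this illustration

for n in (10, 100, 10_000):
    # n draws from U(theta, theta + 1)
    x = rng.uniform(theta, theta + 1.0, size=n)
    U, V = x.max(), x.min()
    print(f"n={n:>6}:  U={U:.4f}  V={V:.4f}  "
          f"2U-V-2={2*U - V - 2:.4f}  2V-U+1={2*V - U + 1:.4f}")

# As n grows, U approaches theta + 1 = 4, while V, 2U-V-2 and 2V-U+1
# approach theta = 3, matching the conclusions above.
```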