Prove this is a consistent estimator

order-statistics, parameter-estimation, probability, statistics

Let $X_1, \dots, X_n$ be a random sample from the uniform distribution with parameters $\alpha$ and $\beta = \alpha + 1$, and let $Y_1$ be the first order statistic. Show that $Y_1 - \frac{1}{n+1}$ is a consistent estimator of the parameter $\alpha$.

My attempt: I want to prove this using the definition of consistency: $\lim_{n \to \infty} P(|\hat \theta_n - \theta|<c)=1$ for every $c>0$.

First, I derive the sampling distribution of $Y_1$: its density is $n(\alpha +1-y_1)^{n-1}$ for $\alpha <y_1 < \alpha + 1$. Then
$$
\begin{align*}
P\left( |Y_1-\frac{1}{n+1}-\alpha|<c\right) &= P\left(\frac{1}{n+1}+\alpha -c < Y_1 < \frac{1}{n+1}+\alpha +c\right) \\
&=\int_{\frac{1}{n+1}+\alpha -c}^{\frac{1}{n+1}+\alpha +c} n(\alpha +1-y_1)^{n-1} dy_1 \\
&= \left(1+c-\frac{1}{n+1} \right)^n - \left(1-c-\frac{1}{n+1}\right)^n
\end{align*}
$$
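For reference, the density used above comes from the general formula for the minimum of $n$ i.i.d. observations, $f_{Y_1}(y_1) = n\,[1-F(y_1)]^{n-1} f(y_1)$. Here $F(y_1)=y_1-\alpha$ and $f(y_1)=1$ on $(\alpha,\alpha+1)$, so

$$
f_{Y_1}(y_1) = n\,\bigl[1-(y_1-\alpha)\bigr]^{n-1}\cdot 1 = n(\alpha+1-y_1)^{n-1}, \qquad \alpha<y_1<\alpha+1.
$$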

But I don't know how to derive this limit when $n$ goes to infinity.

Best Answer

If $c>0$, then for $n$ sufficiently large that $1/(n+1)<c/2$,
$$
\lim_{n\to \infty}\left(1+c-\frac{1}{n+1}\right)^{n}\ge\lim_{n\to \infty}(1+c/2)^{n}=+\infty,
$$
so your expression cannot be a probability as written: the integration limits must be truncated to the support of $Y_1$. Similarly, for $0<c<1$,
$$
0\le \lim_{n\to \infty}\left(1-c-\frac{1}{n+1}\right)^n\le \lim_{n\to \infty}(1-c)^n=0.
$$

The correct expression is
$$
P\left(\left|Y_1-\frac{1}{n+1}-\alpha\right|<c\right)=\int_{\max(1/(n+1)+\alpha-c,\;\alpha)}^{\min(1/(n+1)+\alpha+c,\;\alpha+1)}n(\alpha+1-y_1)^{n-1}\,dy_1.
$$
Suppose $n$ is sufficiently large that $1/(n+1)<\min(1-c,c)$. Then the lower limit is $\alpha$ and the upper limit is $1/(n+1)+\alpha+c$, so
$$
P\left(\left|Y_1-\frac{1}{n+1}-\alpha\right|<c\right)=\int_{\alpha}^{1/(n+1)+\alpha+c}n(\alpha+1-y_1)^{n-1}\,dy_1=1-\left(1-c-\frac{1}{n+1}\right)^n\to 1
$$
as $n\to\infty$, which is exactly the consistency condition.
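As a sanity check (not part of the proof), a quick Monte Carlo simulation shows the coverage probability $P(|Y_1 - \frac{1}{n+1} - \alpha| < c)$ climbing toward $1$ as $n$ grows; the values of $\alpha$, $c$, and the replication count below are arbitrary choices for illustration.

```python
import numpy as np

# Illustrative Monte Carlo check: for a Uniform(alpha, alpha + 1) sample,
# the shifted minimum Y_1 - 1/(n+1) should concentrate around alpha.
rng = np.random.default_rng(0)
alpha, c, reps = 2.0, 0.05, 20_000  # arbitrary demo values

def coverage(n):
    """Estimate P(|Y_1 - 1/(n+1) - alpha| < c) from `reps` simulated samples of size n."""
    samples = rng.uniform(alpha, alpha + 1.0, size=(reps, n))
    y1 = samples.min(axis=1)  # first order statistic of each sample
    return np.mean(np.abs(y1 - 1.0 / (n + 1) - alpha) < c)

for n in (10, 100, 1000):
    print(n, coverage(n))
```

The printed coverage should increase with $n$ and be essentially $1$ by $n = 1000$, matching the limit derived above.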