Consistent estimator problem

Tags: convergence-divergence, estimation, probability-theory

The Problem

Check whether the estimator $Y_n=\max(X_1,X_2,\dots,X_n)$, where $X_1,X_2,\dots,X_n \sim U[0,X]$ are independent and $X$ is the unknown parameter, is consistent and unbiased.

What I've done

First I will check unbiasedness.

We want to show that $E(Y_n)=X$.

For $t\in[0,X]$, using independence:

$P(Y_n \le t)=P(\max(X_1,X_2,\dots,X_n) \le t)=P(X_1 \le t, X_2\le t,\dots,X_n \le t)=P(X_1\le t)P(X_2\le t)\cdots P(X_n \le t)=\frac{t^n}{X^n}$

So now compute the cumulative distribution function:

$$
F_{Y_n}(t)=
\begin{cases}
0 & \text{for } t\in(-\infty,0)\\
\frac{t^n}{X^n} & \text{for } t\in[0,X]\\
1 & \text{for } t\in(X,+\infty)
\end{cases}
$$
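A quick Monte Carlo sketch can sanity-check this CDF (the values of `X`, `n`, and `t` below are arbitrary assumptions for illustration):

```python
import random

# Check the derived CDF: for independent U[0, X] samples,
# P(Y_n <= t) should equal (t / X)^n on [0, X].
random.seed(0)

X = 2.0       # assumed true parameter (any positive value works)
n = 5         # sample size per replication
reps = 100_000
t = 1.5       # point at which to evaluate the CDF

hits = 0
for _ in range(reps):
    y = max(random.uniform(0, X) for _ in range(n))
    if y <= t:
        hits += 1

empirical = hits / reps
theoretical = (t / X) ** n   # (0.75)^5 = 0.2373...
print(empirical, theoretical)  # the two values should be close
```

With these settings the empirical frequency lands within Monte Carlo error of $(t/X)^n$.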

The density function:

$$
D_{Y_n}(t)=
\begin{cases}
0 & \text{for } t\in(-\infty,0)\cup(X,+\infty)\\
\frac{nt^{n-1}}{X^n} & \text{for } t\in[0,X]
\end{cases}
$$

$E(Y_n)=\int_{0}^{X}t\cdot \frac{nt^{n-1}}{X^n}\,dt=X$

So the estimator is unbiased.

Now I have a little problem with checking whether the estimator is consistent.

Doubts

I see that if $n$ is big then $\max(X_1,X_2,\dots,X_n)$ will be very near to $X$. So my intuition tells me that this is a consistent estimator, but I don't know how to prove it formally.
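A small simulation supports this intuition: the sample maximum creeps up toward the true parameter as $n$ grows (the value `X = 3.0` below is an arbitrary assumption):

```python
import random

# Watch the sample maximum approach the (assumed) true parameter X
# as the sample size n grows.
random.seed(1)

X = 3.0  # hypothetical true parameter
results = {}
for n in (10, 100, 1000, 10000):
    y_n = max(random.uniform(0, X) for _ in range(n))
    results[n] = y_n
    print(n, y_n)
```

For $n = 10000$ the maximum is almost surely within a tiny fraction of $X$, since $P(Y_n < X-\varepsilon)$ shrinks geometrically in $n$.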

Best Answer

Your calculation of $E(Y_n)$ is wrong, at the very last calculus step. You can check that $P(Y_n<X) = 1$, so $E(Y_n)=X$ is immediately suspect. (By the "Lake Wobegon" principle.) The correct integral gives $E(Y_n)=\frac{n}{n+1}X$.
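A Monte Carlo check makes the bias visible (the choices $X=1$ and $n=4$ below are arbitrary, picked so $n/(n+1)=0.8$ is easy to eyeball):

```python
import random

# Estimate E(Y_n) by simulation: for n = 4 and X = 1, the sample mean
# of the maxima should sit near n/(n+1) = 0.8, visibly below X = 1.
random.seed(2)

X = 1.0
n = 4
reps = 200_000

total = 0.0
for _ in range(reps):
    total += max(random.uniform(0, X) for _ in range(n))

mean = total / reps
print(mean, n / (n + 1) * X)
```

The simulated mean clusters around $0.8$, not $1$, confirming that $Y_n$ is biased (though the bias vanishes as $n\to\infty$).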

That mistake aside, your general plan for figuring out the density function for $Y_n$ and expressing its moment with an integral is sound. You should be able to get the variance of $Y_n$ the same way.
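For what it's worth, one way to carry out that suggestion (same density, one more integral) and to settle the consistency question is:

$$
E(Y_n^2)=\int_{0}^{X}t^2\cdot\frac{nt^{n-1}}{X^n}\,dt=\frac{n}{n+2}X^2,
\qquad
\operatorname{Var}(Y_n)=\frac{n}{n+2}X^2-\left(\frac{n}{n+1}X\right)^2=\frac{n\,X^2}{(n+1)^2(n+2)}.
$$

Consistency then follows directly from the CDF: for any $\varepsilon\in(0,X)$,

$$
P(|Y_n-X|>\varepsilon)=P(Y_n<X-\varepsilon)=\left(\frac{X-\varepsilon}{X}\right)^n\;\xrightarrow[n\to\infty]{}\;0,
$$

since $(X-\varepsilon)/X<1$, so $Y_n\to X$ in probability.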
