Consistency of MLE for $\theta$ where $f(x \mid \theta) = \frac{2x}{\theta^2}$

maximum-likelihood, order-statistics, probability, probability-distributions, statistical-inference

Let $X = (X_1, \ldots, X_n)$ be a random sample having PDF $f(x \mid \theta) = \frac{2x}{\theta^2}, \; 0 \le x \le \theta, \; \theta > 0$.

Find the MLE of $\theta$ and show that it is consistent.

I found the MLE using the likelihood function $L(\theta) = \frac{2^n}{\theta^{2n}}\prod_{i=1}^n x_i \,\mathbf{1}\{x_i \le \theta\}$.

From this, the MLE is $\hat{\theta} = \max_i X_i$. This means we are working with order statistics, so $\hat{\theta} := Y_n$ has pdf $nf(x)[F(x)]^{n-1}$, which turns out to be $\displaystyle \frac{2nx^{2n}}{\theta^{2n}}$.
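As a quick numerical sanity check (a minimal sketch, not part of the original post: the true value $\theta = 2$ and the sample sizes are just illustrative choices), the Python snippet below draws from $f(x \mid \theta) = 2x/\theta^2$ via the inverse CDF, $X = \theta\sqrt{U}$ with $U \sim \mathrm{Uniform}(0,1)$, and shows the MLE $\hat\theta = \max_i X_i$ approaching $\theta$ as $n$ grows.

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 2.0  # illustrative true value, not from the original post

def sample(n):
    # Inverse-CDF sampling: F(x) = x^2 / theta^2 on [0, theta],
    # so X = theta * sqrt(U) with U ~ Uniform(0, 1).
    return theta * np.sqrt(rng.uniform(size=n))

for n in [10, 100, 1_000, 10_000]:
    theta_hat = sample(n).max()  # the MLE: the largest observation
    print(f"n={n:5d}  theta_hat={theta_hat:.4f}")
```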

To show consistency we need to show that $Y_n$ converges to $\theta$ in probability, so I have been working with:

Let $\epsilon \in (0,1)$ and $\displaystyle P(\theta - \epsilon < Y_n < \theta) = \int_{\theta-\epsilon}^\theta \frac{2n}{\theta^{2n}}x^{2n}\,dx = \frac{2n}{\theta^{2n}(2n+1)}\bigg(\theta^{2n+1} - (\theta-\epsilon)^{2n+1}\bigg).$

This needs to go to zero as $n$ goes to infinity, but that is not what I am getting. EDIT: This probability should be going to $1$; I made a mistake, as pointed out below.

Am I making a mistake in calculating this?

Best Answer

The correct density for the maximum order statistic $\hat \theta = X_{(n)}$ is, by your own formula, $$f_{\hat \theta} (x) = n f(x) (F(x))^{n-1} = n \cdot \frac{2x}{\theta^2} \cdot \left(\frac{x^2}{\theta^2}\right)^{n-1} = \frac{2n x^{2n-1}}{\theta^{2n}}, \qquad 0 \le x \le \theta.$$ Integrating (equivalently, using the CDF of the maximum, $F_{\hat \theta}(x) = (x/\theta)^{2n}$), $$\Pr[\theta - \epsilon < \hat \theta < \theta] = 1 - \left(1 - \frac{\epsilon}{\theta}\right)^{2n}.$$ For any $\epsilon \in (0, \theta)$, $0 < 1 - \epsilon/\theta < 1$, hence the limiting probability is $1$ as $n \to \infty$. But this is exactly what you want; you do not want this probability to tend to $0$ as you claim.

You seem to be confused about what constitutes convergence in probability as it applies to this question. Formally, we say that $\hat \theta \to \theta$ in probability if $$\lim_{n \to \infty} \Pr[|\hat \theta - \theta| > \epsilon] = 0 \quad \text{for every } \epsilon > 0.$$ Note the direction of the inequality. Since $\hat \theta \le \theta$ always, $$\Pr[|\hat \theta - \theta| > \epsilon] = \Pr[\hat \theta < \theta - \epsilon] = \left(1 - \frac{\epsilon}{\theta}\right)^{2n} \to 0,$$ which is precisely the consistency of $\hat \theta$.
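As a numerical illustration of the last display (a hedged sketch, not from the original answer; the values $\theta = 2$, $\epsilon = 0.1$, and the sample sizes are arbitrary illustrative choices), the Monte Carlo check below compares the empirical frequency of $|\hat\theta - \theta| > \epsilon$ with the closed form $(1 - \epsilon/\theta)^{2n}$ and shows both tending to $0$.

```python
import numpy as np

rng = np.random.default_rng(1)
theta, eps, reps = 2.0, 0.1, 20_000  # illustrative values, not from the original post

def mle(n):
    # Draw `reps` samples of size n from f(x|theta) = 2x/theta^2 via X = theta*sqrt(U),
    # then take the row-wise maximum, i.e. the MLE for each sample.
    return (theta * np.sqrt(rng.uniform(size=(reps, n)))).max(axis=1)

for n in [5, 20, 80, 320]:
    empirical = np.mean(np.abs(mle(n) - theta) > eps)
    exact = (1 - eps / theta) ** (2 * n)  # Pr[|theta_hat - theta| > eps], since theta_hat <= theta
    print(f"n={n:4d}  empirical={empirical:.4f}  exact={exact:.4f}")
```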