MLE of $\theta$ when $X_1,\ldots,X_n$ are i.i.d. with $P_\theta(x; \theta)=2x/\theta^2$ for $0\le x \le\theta$

maximum likelihood, parameter estimation, probability, probability distributions, statistics

Assume the standard situation, that is, let $X_1, \ldots , X_n$ be independent and identically distributed with $X_k \sim P_\theta(x; \theta)$, where $P_\theta(x; \theta) = 2x/\theta^2$ if $0\le x \le\theta$ and $0$ otherwise.

It is required to estimate $\theta$. Show that the maximum likelihood estimator for $\theta$ is $\hat{\theta} = \max\{X_1, \ldots , X_n\}$ and then show that the cumulative distribution function of $\hat{\theta}$ is $F_\theta(z) = z^{2n}/\theta^{2n}$.

Here's what I did so far:

Maximum Likelihood estimator:
$L_x(\theta) = \prod_{i = 1}^{n} P_\theta(x_i;\theta)$
Here we have $P_\theta(x_1,\ldots,x_n;\theta) = P_\theta(x_1;\theta)\cdot P_\theta(x_2;\theta)\cdots P_\theta(x_n;\theta)$

Likelihood $= L_x(\theta) = P_\theta(x_1;\theta)\cdot P_\theta(x_2;\theta)\cdots P_\theta(x_n;\theta) = \frac{2x_1}{\theta^2}\cdot \frac{2x_2}{\theta^2}\cdots \frac{2x_n}{\theta^2} = \frac{2^n\prod_{i = 1}^{n}x_i}{\theta^{2n}}$

Log-likelihood: $\log P_\theta(x_1,\ldots,x_n;\theta) = \sum_{i = 1}^{n}\log(2x_i/\theta^2) = n\log 2 + \sum_{i = 1}^{n}\log x_i - 2n\log\theta$
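As a quick numeric sanity check of the algebra (just my own scratch code; the sample and the values of $\theta$ are made up and not from the problem), the sum of the log densities should agree with the closed-form expression above:

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true, n = 2.0, 5                        # made-up values, not from the problem
# Inverse-CDF sampling: F(x) = x^2/theta^2 on [0, theta], so X = theta*sqrt(U).
x = theta_true * np.sqrt(rng.uniform(size=n))

theta = 2.5                                   # any value >= max(x); otherwise some density term is 0
direct = np.sum(np.log(2 * x / theta**2))     # sum of log densities
closed_form = n * np.log(2) + np.sum(np.log(x)) - 2 * n * np.log(theta)
print(direct, closed_form)                    # the two numbers should agree
```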

Is this correct so far? I'm still not sure how to get to $\hat{\theta} = \max\{X_1, \ldots , X_n\}$.

As for the cumulative distribution part, to show $F_\theta(z) = z^{2n}/\theta^{2n}$:

$F(z) = P(\max\{X_k\} < z) = P(X_1<z)\cdot P(X_2<z)\cdots P(X_n<z) = \frac{2x_1}{\theta^2}\cdot \frac{2x_2}{\theta^2}\cdots \frac{2x_n}{\theta^2} = \frac{2^n\prod_{i = 1}^{n}x_i}{\theta^{2n}}$

Not sure if this is correct. Would really appreciate some help.

Edit: From the answers below, we can deduce that the estimator is biased. What estimator would be unbiased? How can I find it?

Best Answer

Given a sample $x\equiv \{x_i\}_{i=1}^n$, the likelihood is $$ L(\theta\mid x)=\left(\frac{2}{\theta^{2}}\right)^n\prod_{i=1}^n x_i \times1\{\theta\ge M(x),\,m(x)\ge 0\}, $$ where $M(x):=\max_{1\le i\le n}x_i$ and $m(x):=\min_{1\le i\le n}x_i$. The indicator forces $\hat{\theta}_n(x)\ge M(x)$ ($\because$ $L=0$ otherwise), while taking $\theta$ larger than $M(x)$ only decreases $L$ through the factor $(2/\theta^{2})^n$ (assuming that $m(x)>0$). Thus, $\hat{\theta}_n(x)= M(x)$.
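For intuition, here is a minimal numerical sketch (with an assumed sample, assumed $\theta$, and a helper function of my own) showing that the log-likelihood is decreasing on $[M(x),\infty)$, so the maximizer over a grid is the left endpoint $M(x)$:

```python
import numpy as np

rng = np.random.default_rng(1)
theta_true, n = 2.0, 20                        # assumed values for illustration
x = theta_true * np.sqrt(rng.uniform(size=n))  # X = theta*sqrt(U) has density 2x/theta^2

def log_likelihood(theta, x):
    # n*log 2 + sum(log x_i) - 2n*log(theta) for theta >= M(x), and -inf otherwise
    ll = len(x) * np.log(2) + np.sum(np.log(x)) - 2 * len(x) * np.log(theta)
    return np.where(theta >= x.max(), ll, -np.inf)

grid = np.linspace(x.max(), 2 * x.max(), 200)
values = log_likelihood(grid, x)
print(grid[np.argmax(values)], x.max())        # the grid maximizer coincides with max(x)
```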


As for the distribution of $\hat{\theta}_n$: by independence, for $z\in [0,\theta]$, $$ F(z)=\mathsf{P}(\hat{\theta}_n\le z)=\prod_{i=1}^n\mathsf{P}(X_i\le z)=\prod_{i=1}^n \int_0^z \frac{2x}{\theta^2}\,\mathrm{d}x=\prod_{i=1}^n \left(\frac{z}{\theta}\right)^{2}=\left(\frac{z}{\theta}\right)^{2n}. $$
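This can also be checked by simulation; a minimal sketch, assuming arbitrary values of $\theta$, $n$, and the replication count:

```python
import numpy as np

rng = np.random.default_rng(2)
theta, n, reps = 2.0, 5, 100_000                         # arbitrary values for the check
samples = theta * np.sqrt(rng.uniform(size=(reps, n)))   # each row: an i.i.d. sample from 2x/theta^2
theta_hat = samples.max(axis=1)                          # the MLE, one per replicate

z = 1.5
empirical = np.mean(theta_hat <= z)                      # Monte Carlo estimate of P(theta_hat <= z)
theoretical = (z / theta) ** (2 * n)                     # (z/theta)^(2n)
print(empirical, theoretical)                            # these should be close
```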


Since $\mathsf{E}X_i=2\theta/3$ and $\mathsf{E}\hat{\theta}_n=\int_0^\theta z\,\mathrm{d}F(z)=\frac{2n}{2n+1}\theta$, examples of unbiased estimators are $$ \hat{\theta}_n'=\frac{3}{2}\times \frac{1}{n}\sum_{i=1}^n x_i, \quad \hat{\theta}_n''=\frac{2n+1}{2n}\hat{\theta}_n. $$
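A simulation sketch of both corrections (again with arbitrary $\theta$, $n$, and replication count):

```python
import numpy as np

rng = np.random.default_rng(3)
theta, n, reps = 2.0, 5, 200_000                        # arbitrary values
samples = theta * np.sqrt(rng.uniform(size=(reps, n)))  # X = theta*sqrt(U) has density 2x/theta^2

theta_hat = samples.max(axis=1)                         # MLE, biased low
corrected_max = (2 * n + 1) / (2 * n) * theta_hat       # (2n+1)/(2n) * max
corrected_mean = 1.5 * samples.mean(axis=1)             # (3/2) * sample mean

print(theta_hat.mean())       # close to 2n/(2n+1)*theta, about 1.82 here
print(corrected_max.mean())   # close to theta = 2.0
print(corrected_mean.mean())  # close to theta = 2.0
```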