[Math] Showing that the MLE of $f(x\mid\theta)=\theta x^{\theta-1}$ is consistent.

statistics

Suppose $X_1,\ldots,X_n$ are IID random variables with density
$$f(x\mid\theta)=\theta x^{\theta-1}, \;\;0<x<1,\,\,0<\theta<\infty.$$

I've found the MLE to be
$$\hat{\theta}=\frac{-1}{\frac{1}{n}\sum_{i=1}^n \log X_i}.$$
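
(For reference, this comes from setting the derivative of the log-likelihood to zero,
$$\ell(\theta)=n\log\theta+(\theta-1)\sum_{i=1}^n\log X_i,\qquad \ell'(\theta)=\frac{n}{\theta}+\sum_{i=1}^n\log X_i=0,$$
and solving for $\theta$.)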

I've also found $E[Y_i]=\frac{-1}{\theta}$ and $\operatorname{var}[Y_i]=\frac{1}{\theta^2}$ where $Y_i=\log X_i$.
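
Both moments follow because $-Y_i=-\log X_i$ is $\text{Exponential}(\theta)$: for $t>0$,
$$P(-\log X_i>t)=P(X_i<e^{-t})=(e^{-t})^{\theta}=e^{-\theta t}.$$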

This is where everything goes downhill fast. Now I'm supposed to use this information to show that $\hat{\theta}$ is consistent, derive the distribution of $\hat{\theta}$, and check whether or not it's biased.

I know that for consistency I need to show $P(|\hat{\theta}-\theta|>\epsilon)\rightarrow 0$ as $n\rightarrow \infty$ but I'm not sure how that helps me here.

As for the distribution of $\hat{\theta}$, it looks like there's a Gamma distribution in there somewhere, but I just can't squeeze it out.

I really appreciate any help.


In light of JPi's comments, I apply Slutsky's theorem and have
$$\hat{\theta}=g(\bar{Y})=-\bar{Y}^{-1},$$
which by the LLN and the continuity of $g$ at $E[Y]=-1/\theta$ converges in probability to $g(E[Y])=-\left(-\tfrac{1}{\theta}\right)^{-1}=\theta$.

Best Answer

For consistency, use Slutsky's theorem, which says that if $\hat\theta$ is a consistent estimator of $\theta_0$ then $g(\hat\theta)$ is a consistent estimator of $g(\theta_0)$ provided that $g$ is continuous (some authors use a name other than Slutsky's).
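
As a quick numerical sanity check (a minimal simulation sketch, not part of the proof), note that $f(x\mid\theta)$ is the $\text{Beta}(\theta,1)$ density, so it can be sampled directly:

```python
import numpy as np

# Sanity check of consistency: theta_hat should approach the true theta
# as n grows. X with density theta * x^(theta - 1) on (0, 1) is Beta(theta, 1).
rng = np.random.default_rng(0)
theta = 2.5  # arbitrary true parameter for the demo

for n in [10, 1_000, 100_000, 1_000_000]:
    x = rng.beta(theta, 1.0, size=n)
    theta_hat = -1.0 / np.mean(np.log(x))  # the MLE from the question
    print(f"n = {n:>9}: theta_hat = {theta_hat:.4f}")
```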

For the bias question, think of Jensen's inequality.
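
Spelled out: $\hat\theta=g(\bar Y)$ with $g(y)=-1/y$, which is strictly convex on $(-\infty,0)$ (where $\bar Y$ lives) since $g''(y)=-2/y^3>0$ for $y<0$. Jensen's inequality then gives
$$E[\hat\theta]=E[g(\bar Y)]>g(E[\bar Y])=\theta,$$
so the MLE is biased upward.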

If you need a limit distribution then you can think of using the delta method, which says that if $\sqrt{n} (\bar x - \mu) \stackrel{d}{\to} N(0,\sigma^2)$ then $\sqrt{n} ( g(\bar x) - g(\mu)) \stackrel{d}{\to} N(0,(g'(\mu))^2 \sigma^2)$, if $g$ is differentiable at $\mu$.
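
Concretely, with $g(y)=-1/y$, $\mu=E[Y]=-1/\theta$, and $\sigma^2=\operatorname{var}[Y]=1/\theta^2$, one gets $g'(\mu)=1/\mu^2=\theta^2$, so
$$\sqrt{n}\,(\hat\theta-\theta)\stackrel{d}{\to} N\!\left(0,\ \theta^4\cdot\tfrac{1}{\theta^2}\right)=N(0,\theta^2).$$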

For the exact distribution you could try looking at the characteristic function of $\hat\theta$.
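
In fact, the Gamma hunch in the question is right: $-\log X_i\sim\text{Exponential}(\theta)$, so $S_n=-\sum_{i=1}^n\log X_i\sim\text{Gamma}(n,\theta)$ and $\hat\theta=n/S_n$. From the Gamma density, $E[1/S_n]=\theta/(n-1)$ for $n>1$, hence
$$E[\hat\theta]=\frac{n\theta}{n-1}>\theta,$$
which confirms the upward bias (and shows $\hat\theta$ is only asymptotically unbiased).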
