Statistics – How to Show Estimator Consistency

Tags: consistency, convergence, self-study

Let $\hat{\theta}_n= -\frac{n}{\sum_{i=1}^n \log(X_i)}$, where the $X_i$ are i.i.d. samples from a distribution with pdf $\theta x^{\theta-1}$ for $x \in (0,1)$. How can I prove that $\hat{\theta}_n$ is a consistent estimator of $\theta$?

Best Answer

Probability limits pass through continuous functions (the continuous mapping theorem), so, provided $\text{plim}\,\hat \theta \neq 0$,

$$\text{plim}\left (\frac {1}{\hat \theta}\right) = \frac {1}{\text{plim}\hat \theta}$$

Also, $$\hat{\theta}_n= -\frac{n}{\sum_{i=1}^n \log(X_i)} \implies \frac {1}{\hat \theta} = -\frac 1n\sum_{i=1}^n \log(X_i) $$

The sample is i.i.d., hence ergodic, so the law of large numbers applies:

$$\text{plim}\left (\frac {1}{\hat \theta}\right) = \text{plim} \left(-\frac 1n\sum_{i=1}^n \log(X_i)\right) = E\left[-\log(X_i)\right]$$

So you just have to find the distribution of $Y = -\log(X_i)$, as suggested in a comment, and compute its expected value. If that expectation equals $1/\theta$, then $\text{plim}\, \hat\theta_n = \theta$ follows from the first display above.
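A quick simulation can make the convergence visible. The sketch below (my own illustration, not part of the original answer) samples from the given density, which is a $\text{Beta}(\theta, 1)$ distribution, computes $\hat\theta_n$ for increasing $n$, and checks that it approaches the true $\theta$; the specific value $\theta = 2.5$ and the seed are arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 2.5  # true parameter (illustrative choice)

for n in [100, 10_000, 1_000_000]:
    # pdf theta * x^(theta-1) on (0,1) is exactly Beta(theta, 1)
    x = rng.beta(theta, 1.0, size=n)
    # the estimator from the question
    theta_hat = -n / np.log(x).sum()
    print(f"n = {n:>9}: theta_hat = {theta_hat:.4f}")
```

As $n$ grows, the printed estimates should cluster ever more tightly around the true value, which is what consistency means in practice.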
