Asymptotic distribution of a Maximum Likelihood Estimator using the Central Limit Theorem

central-limit-theorem, maximum-likelihood, parameter-estimation, statistical-inference, statistics

Let $(X_1,\dots,X_n)$ be a random sample from a population $X$ having probability density function
$$f(x;\vartheta)=\vartheta\,x^{\vartheta -1}\,I_{(0,1)}(x)$$
$$\vartheta>0\qquad\qquad I_{(0,1)}(x)=
\begin{cases}
1\qquad\text{if }x\in(0,1)\\
0\qquad\text{otherwise}
\end{cases}$$

Find:

  1. $\hat{\vartheta}_n$, the MLE (Maximum Likelihood Estimator) of the parameter $\vartheta$
  2. the approximate distribution of $\hat{\vartheta}_n$ for large $n$

While I'm fairly confident about how to solve the first point, I'd like some advice on the second one.


1.

To find the MLE $\hat{\vartheta}_n$, the likelihood function is calculated:
$$\mathscr{L}(\underline{x};\vartheta) = \vartheta^n\,\left( x_1 \times\dots\times x_n \right)^{\vartheta -1} = \vartheta^n\,\left( \prod_{i=1}^n x_i \right)^{\vartheta -1} \qquad\qquad(0<x_i<1,\, \forall\,i=1,2,\dots,n)$$
The first derivative with respect to $\vartheta$ is
$$\frac{\partial\mathscr{L}(\underline{x};\vartheta)}{\partial\vartheta} = \vartheta^{n-1}\,\left( \prod_{i=1}^n x_i \right)^{\vartheta -1}\left[ n+\vartheta\,\log{\left(\prod_{i=1}^n x_i\right)}\right]$$
and $\hat{\vartheta}_n$ is obtained solving for
$$\frac{\partial\mathscr{L}(\underline{x};\vartheta)}{\partial\vartheta} = 0\quad \implies \quad \hat{\vartheta}_n = -\frac{n}{\log{\left(\prod_{i=1}^n x_i\right)}}$$
To be precise, it should now be checked that
$$\left. \frac{\partial^2\mathscr{L}(\underline{x};\vartheta)}{\partial\vartheta^2}\right|_{\vartheta=\hat{\vartheta}_n} < 0$$
In order to do so, the second derivative with respect to $\vartheta$ is calculated:
$$\frac{\partial^2\mathscr{L}(\underline{x};\vartheta)}{\partial\vartheta^2} = \underbrace{\vartheta^{n-2}\,\left( \prod_{i=1}^n x_i \right)^{\vartheta -1}}_{\Gamma}\underbrace{\left[ n(n-1)+2n\vartheta\log{\left(\prod_{i=1}^n x_i\right)}+\vartheta^2\log^2{\left(\prod_{i=1}^n x_i\right)}\right]}_{\Delta}$$
$\Gamma$ is always positive, so it suffices to evaluate
$$\left.\Delta\right|_{\vartheta=\hat{\vartheta}_n} = -n <0$$
which confirms that $\hat{\vartheta}_n$ is indeed a maximum.
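As a quick numerical sanity check (a minimal sketch; the parameter value and sample size below are arbitrary choices), we can sample from $f$ by inverse-CDF sampling (the CDF is $F(x)=x^{\vartheta}$ on $(0,1)$, so $X=U^{1/\vartheta}$ for uniform $U$) and verify that the closed-form MLE indeed maximizes the log-likelihood:

```python
import math
import random

random.seed(42)

theta = 2.5        # true parameter (arbitrary illustrative value)
n = 100_000

# Inverse-CDF sampling: F(x) = x^theta on (0,1), so X = U^(1/theta)
xs = [random.random() ** (1.0 / theta) for _ in range(n)]

s = sum(math.log(x) for x in xs)   # log(prod x_i)
theta_hat = -n / s                 # closed-form MLE

# The log-likelihood n*log(t) + (t-1)*log(prod x_i) is concave in t
# and should be maximized at theta_hat.
def loglik(t):
    return n * math.log(t) + (t - 1) * s

assert loglik(theta_hat) >= loglik(theta_hat * 0.9)
assert loglik(theta_hat) >= loglik(theta_hat * 1.1)
```

With this sample size the estimate lands very close to the true value, as expected.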


2.

To obtain the approximate distribution of $\hat{\vartheta}_n$ for large $n$ (i.e., the asymptotic distribution of $\hat{\vartheta}_n$ as $n\to\infty$), I thought of applying Cramér's theorem on the asymptotic normality of the MLE, which ensures that
$$\sqrt{n}\left(\hat{\vartheta}_n-\vartheta\right)\xrightarrow{d}\mathscr{N}(0,1/I(\vartheta))$$
where $I(\vartheta)$ is the Fisher information, which after some straightforward but tedious algebra turns out to be
$$I(\vartheta) = \mathbb{E}\left[\left(\frac{\partial}{\partial\vartheta}\log{f(x;\vartheta)} \right)^2\right] = \frac{1}{\vartheta^2}$$
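This limit can be checked by Monte Carlo (a sketch; the parameter value, per-replicate sample size, and replication count are arbitrary): simulate many realizations of $\sqrt{n}\left(\hat{\vartheta}_n-\vartheta\right)$ and compare the empirical mean and variance with $0$ and $1/I(\vartheta)=\vartheta^2$:

```python
import math
import random

random.seed(0)

theta = 2.0    # arbitrary true parameter
n = 1_000      # sample size per replicate
reps = 3_000   # number of Monte Carlo replicates

vals = []
for _ in range(reps):
    # x_i = U^(1/theta), so log(x_i) = log(U)/theta; the MLE is -n / sum(log x_i)
    s = sum(math.log(random.random()) / theta for _ in range(n))
    vals.append(math.sqrt(n) * (-n / s - theta))

mean = sum(vals) / reps
var = sum((v - mean) ** 2 for v in vals) / (reps - 1)
# asymptotic theory predicts mean near 0 and variance near theta^2 = 4
```

The empirical variance should sit close to $\vartheta^2=4$, up to Monte Carlo noise.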
MY QUESTION

While the procedure presented above for obtaining the asymptotic distribution is completely general and should be correct (please tell me if it's not), I was wondering whether the Central Limit Theorem (CLT) can be applied directly to the variable
$$\hat{\vartheta}_n = -\frac{n}{\log{\left(\prod_{i=1}^n x_i\right)}}$$
in order to obtain the same result, since $\hat{\vartheta}_n$ can almost be rewritten in terms of a sum of iid variables (the logs of the $x_i$):
$$\hat{\vartheta}_n = -\frac{n}{\sum_{i=1}^n \log{x_i}}$$

Any idea/suggestion would be greatly appreciated!

Best Answer

As already stated, Cramér's theorem gives $$\sqrt{n}\left(\hat{\vartheta}_n-\vartheta\right)\xrightarrow{d}\mathscr{N}(0,\vartheta^2)$$ The same result can be found more directly by applying the CLT together with the Delta method.

The MLE is rewritten as $$\hat{\vartheta}_n = \left(\frac{\sum_{i=1}^n -\log{x_i}}{n}\right)^{-1} = \left(\frac{\sum_{i=1}^n y_i}{n}\right)^{-1} = (\bar{Y}_n)^{-1}$$ with $y_i := -\log{x_i}$ and $\bar{Y}_n := \frac{\sum_{i=1}^n y_i}{n}$. (In fact $Y_i = -\log X_i \sim \operatorname{Exp}(\vartheta)$, which makes the moment computations below immediate.)

The CLT ensures that $\bar{Y}_n$ is asymptotically normal, since it is the mean of iid variables with finite variance (verified below), so that $$\sqrt{n}(\bar{Y}_n - \mu_y)\xrightarrow{d}\mathscr{N}\left(0,\sigma_y^2\right)$$ where $$\mu_y = \mathbb{E}(y) = -\int_0^1 \log{(x)}\,\vartheta\,x^{\vartheta - 1}\,dx = \frac{1}{\vartheta}$$ $$\mathbb{E}(y^2) = \int_0^1 \log^2{(x)}\,\vartheta\,x^{\vartheta - 1}\,dx = \frac{2}{\vartheta^2}$$ $$\sigma_y^2 = \mathbb{E}(y^2) - \left( \mathbb{E}(y) \right)^2 = \frac{1}{\vartheta^2}$$ yielding $$\sqrt{n}\left(\bar{Y}_n - \frac{1}{\vartheta}\right)\xrightarrow{d}\mathscr{N}\left(0,\frac{1}{\vartheta^2}\right)$$ Now, applying the Delta method with $g(x)=x^{-1}$, the following must also hold: $$\sqrt{n}\left(\left(\bar{Y}_n\right)^{-1} - \left(\frac{1}{\vartheta}\right)^{-1}\right)\xrightarrow{d}\mathscr{N}\left(0,\frac{1}{\vartheta^2}\cdot\left[\left.\frac{d x^{-1}}{dx}\right|_{x=\frac{1}{\vartheta}} \right]^2\right) = \mathscr{N}\left(0,\frac{1}{\vartheta^2}\cdot \vartheta^4\right)$$ which can be rewritten as $$\sqrt{n}\left(\hat{\vartheta}_n - \vartheta\right)\xrightarrow{d}\mathscr{N}\left(0,\vartheta^2\right)$$
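The moment computations above can also be verified numerically (a sketch; the parameter value and sample size are arbitrary choices): sample $y_i=-\log x_i$ and compare the empirical first and second moments with $1/\vartheta$ and $2/\vartheta^2$, and $(\bar{Y}_n)^{-1}$ with $\vartheta$:

```python
import math
import random

random.seed(1)

theta = 3.0       # arbitrary true parameter
n = 500_000

# y = -log(x) with x = U^(1/theta), i.e. y = -log(U)/theta
ys = [-math.log(random.random()) / theta for _ in range(n)]

m1 = sum(ys) / n                  # empirical E[y],   expected 1/theta
m2 = sum(y * y for y in ys) / n   # empirical E[y^2], expected 2/theta^2

theta_hat = 1.0 / m1              # the MLE written as (Y_bar)^(-1)
```

Both empirical moments should match the integrals computed above to within Monte Carlo error, and $1/\bar{Y}_n$ should land close to $\vartheta$.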

This is the same result already found by applying Cramér's theorem.


[many thanks to kimchi lover and even more to Ian for helping through the whole process]