[Math] Proof of asymptotic normality of Maximum Likelihood Estimator (MLE)

central limit theorem, maximum likelihood, normal distribution

I am a self-learner trying to understand the proof of asymptotic normality of the Maximum Likelihood Estimator (MLE) from the notes at http://www.konstantinkashin.com/notes/stat/Maximum_Likelihood_Estimation.pdf (p. 25).

I am having some difficulty understanding why the following expression converges in distribution to a normal distribution:

$\sqrt{n}(\hat\theta_{MLE}-\theta_0)=-\frac{\sqrt{n}\,\ell'(\theta_0)}{\ell''(\tilde \theta)}\xrightarrow{d}N\!\left(0,\ \frac{nI_n(\theta_0)}{(-I_n(\theta_0))^2}\right)=N\!\left(0,\ \frac{n}{I_n(\theta_0)}\right)$

given that it was stated that the numerator converges in distribution to a normal distribution and the denominator converges in probability to the (negative of the) Fisher information.

I would be grateful for as simple a (mathematical) explanation as possible.

Best Answer

In order to combine the two results \begin{align} \sqrt{n}\, \ell'(\theta_0) &\overset{d}{\to} N(0, n I_n(\theta_0)) \\ \ell''_n(\tilde{\theta}) &\overset{a.s.}{\to} -I_n(\theta_0) \end{align} to arrive at your claim, you just need to apply Slutsky's theorem, which appears in the appendix of your linked notes: if $X_n \overset{d}{\to} X$ and $Y_n \overset{p}{\to} c$ for a constant $c \neq 0$, then $X_n / Y_n \overset{d}{\to} X / c$. (Almost sure convergence implies convergence in probability, so the theorem applies here.) Since the denominator converges to the constant $-I_n(\theta_0)$, $$-\frac{\sqrt{n}\,\ell'(\theta_0)}{\ell''_n(\tilde\theta)} \overset{d}{\to} \frac{N(0, n I_n(\theta_0))}{I_n(\theta_0)} = N\!\left(0,\ \frac{n I_n(\theta_0)}{(-I_n(\theta_0))^2}\right) = N\!\left(0,\ \frac{n}{I_n(\theta_0)}\right),$$ where the variance is divided by the square of the constant factor because $\operatorname{Var}(X/c) = \operatorname{Var}(X)/c^2$.
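If it helps to see the result empirically, here is a minimal Monte Carlo sketch. The exponential model, the parameter value, and the sample sizes below are my own choices for illustration, not from the notes. For an i.i.d. exponential sample with rate $\theta_0$, the MLE is $\hat\theta = 1/\bar{X}$ and the per-observation Fisher information is $I_1(\theta_0) = 1/\theta_0^2$, so $\sqrt{n}(\hat\theta - \theta_0)$ should be approximately $N(0, \theta_0^2)$:

```python
import numpy as np

rng = np.random.default_rng(0)

theta0 = 2.0                 # true rate of the Exp(theta0) model (illustrative choice)
fisher_per_obs = 1.0 / theta0**2   # I_1(theta0) for the exponential model
n = 1000                     # sample size per replication
reps = 10000                 # Monte Carlo replications

# Draw reps independent samples of size n; for the exponential rate,
# the MLE is the reciprocal of the sample mean.
samples = rng.exponential(scale=1.0 / theta0, size=(reps, n))
mle = 1.0 / samples.mean(axis=1)

# Standardize: sqrt(n) * (theta_hat - theta0) should be approximately
# N(0, 1 / I_1(theta0)) = N(0, theta0^2) for large n.
z = np.sqrt(n) * (mle - theta0)

print("empirical variance:   ", z.var())
print("theoretical 1/I_1:    ", 1.0 / fisher_per_obs)  # = theta0**2 = 4.0
```

Increasing n tightens the agreement between the empirical variance of the standardized estimator and $1/I_1(\theta_0)$, which is exactly the statement being proved.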
