[Math] Is the MLE strongly consistent and asymptotically efficient for exponential families

convergence-divergence, probability, statistics

It is known that the Maximum Likelihood Estimator (MLE) is strongly consistent and asymptotically efficient under certain regularity conditions. By strongly consistent I mean that $\hat{\theta}_{MLE} \rightarrow \theta$ almost surely. By asymptotically efficient I mean that $\sqrt{n}(\hat{\theta}_{MLE}-\theta)\rightarrow N(0,I^{-1}(\theta))$ in distribution.
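Both properties are easy to see numerically. As a hedged illustration (not part of the original question): for the exponential distribution with rate $\lambda$, a one-parameter exponential family, the MLE is $\hat{\lambda} = 1/\bar{X}_n$ and the Fisher information is $I(\lambda) = 1/\lambda^2$, so asymptotic efficiency predicts that $\sqrt{n}(\hat{\lambda}-\lambda)$ has limiting variance $\lambda^2$. The parameter values below are arbitrary choices for the simulation.

```python
import numpy as np

# Illustrative sketch: Monte Carlo check of asymptotic efficiency for the
# MLE of the rate of an exponential distribution (an exponential family).
# The MLE is 1 / sample_mean; Fisher information is 1 / lambda**2, so the
# limiting variance of sqrt(n) * (mle - lambda) should be lambda**2.

rng = np.random.default_rng(0)
lam = 2.0    # true rate parameter (arbitrary choice)
n = 5000     # sample size per replication
reps = 2000  # Monte Carlo replications

samples = rng.exponential(scale=1.0 / lam, size=(reps, n))
mle = 1.0 / samples.mean(axis=1)   # MLE of the rate in each replication

z = np.sqrt(n) * (mle - lam)       # scaled estimation error
print(f"empirical variance of sqrt(n)*(mle - lam): {z.var():.3f}")
print(f"theoretical I(lam)^-1 = lam^2:             {lam**2:.3f}")
```

The empirical variance should land close to $\lambda^2 = 4$, and increasing `n` pulls the MLE toward the true rate, consistent with strong consistency.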

These regularity conditions are cumbersome to check, so I was wondering whether there is a general, easy-to-check case in which they hold. For example, do these regularity conditions always hold for exponential families?

I am not asking anyone to prove this; I am just wondering if someone knows the answer.

Regularity Conditions for Asymptotic Efficiency: http://en.wikipedia.org/wiki/Maximum_likelihood#Asymptotic_normality

Regularity Conditions for Strong Consistency: http://en.wikipedia.org/wiki/Maximum_likelihood#Consistency

Best Answer

In a paper I read a while ago, which can be found here, the authors propose a set of eigenvalue conditions that are weaker than the usual regularity conditions. They show that, under these conditions, the MLE is strongly consistent for (nonlinear) exponential families.