Solved – Central limit theorem and residuals

central-limit-theorem, normality-assumption, residuals

I have often read that, thanks to the CLT, the residuals of a model are asymptotically normal. This argument has always seemed odd to me, since the CLT states that

The sum of a number of independent and identically distributed random variables is asymptotically normally distributed.

Since a residual is generally a difference, and not a sum, of iid random variables, how can we claim that the CLT applies?

Best Answer

What I think you are referring to is that estimators are often asymptotically normal as the sample size increases. This is a mathematically correct statement that follows from a more general formulation of the CLT, Lyapunov's CLT, which does not require the variables to be identically distributed, only independent. For example, the OLS estimator can be written as $\hat{\beta}=\beta+P_{x}\epsilon$, where $P_{x}=(X^{T}X)^{-1}X^{T}$ and $\epsilon$ is the vector of errors. In other words, the difference from the true $\beta$ is a linear combination of the errors, which are independently distributed. Hence, by Lyapunov's CLT, this difference is asymptotically normally distributed.
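This is easy to check by simulation. The sketch below (my own illustration, not from the question) draws errors from a centred exponential distribution, which is clearly skewed and non-normal, and shows that the sampling distribution of the OLS slope $\hat{\beta}$ still looks normal:

```python
import numpy as np

rng = np.random.default_rng(0)

n, reps = 500, 2000
beta = 2.0
estimates = np.empty(reps)

for r in range(reps):
    x = rng.uniform(-1, 1, n)
    # Skewed, non-normal errors: centred exponential. They are
    # independent, which is all Lyapunov's CLT requires.
    eps = rng.exponential(1.0, n) - 1.0
    y = 1.0 + beta * x + eps
    X = np.column_stack([np.ones(n), x])
    # beta_hat = (X'X)^{-1} X' y, i.e. beta + P_x eps in the notation above
    bhat = np.linalg.solve(X.T @ X, X.T @ y)
    estimates[r] = bhat[1]

# If the sampling distribution is roughly normal, its mean is near the
# true beta and about 68% of draws fall within one std of the mean.
print(round(estimates.mean(), 2))
frac = np.mean(np.abs(estimates - estimates.mean()) < estimates.std())
print(round(frac, 2))
```

Despite the heavily skewed errors, the histogram of the 2000 slope estimates is centred on the true slope and matches the one-standard-deviation coverage of a normal distribution closely.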

In terms of what you literally wrote, i.e. "residuals of a model are asymptotically normal", it is not clear which asymptotic limit we are talking about. There is a sense in which the errors themselves could be approximately normal, namely when each error is modeled as the sum of a very large number of unknown factors. When that assumption is applicable, Lyapunov's CLT again suggests they are approximately normal. But I am not sure that is what you are asking, because strictly speaking there is no asymptotic equality here. I also suspect there is a separate confusion about squared residuals being normally distributed. It would be better to ask the question more specifically.
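To illustrate that "sum of many unknown factors" justification, here is a hypothetical construction where each error is the sum of 200 small independent factors, none of which is normal on its own (half uniform, half coin-flip shocks):

```python
import numpy as np

rng = np.random.default_rng(1)

# Each error = sum of many small, independent, non-identically
# distributed factors; Lyapunov's CLT applies to such sums.
n_factors, n_errors = 200, 10000
uniforms = rng.uniform(-0.5, 0.5, (n_errors, n_factors // 2))
coins = rng.choice([-0.5, 0.5], (n_errors, n_factors // 2))
errors = uniforms.sum(axis=1) + coins.sum(axis=1)

# Standardise and check two-sigma coverage against the normal
# benchmark of about 0.954.
z = (errors - errors.mean()) / errors.std()
frac_within_2sd = np.mean(np.abs(z) < 2)
print(round(frac_within_2sd, 3))
```

The two-sigma coverage comes out very close to the normal value of 0.954, even though no individual factor is normal. Note this is only approximate normality for finite errors, not an asymptotic statement about residuals.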