OLS vs Maximum Likelihood – Comparison in Linear Regression under Normal Distribution

least-squares, maximum-likelihood, normal-distribution, regression

I found that for a simple linear regression model, both OLS and the maximum likelihood method (assuming a Normal error distribution) give the same output (parameter estimates). Given this, can we say that OLS also makes an implicit assumption about the Normal distribution, or vice versa? I am not interested in why both produce the same values, but in which one makes the less stringent assumption about the data.

Best Answer

OLS does not make a normality assumption about the model errors. It can be used under a variety of distributional assumptions, and the estimator still makes sense as the minimum-variance linear unbiased estimator as long as the Gauss–Markov conditions on the errors hold, as sketched below.
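In symbols, a minimal sketch of the OLS objective and the Gauss–Markov conditions it relies on (standard notation, assumed here for concreteness):

$$
\hat{\beta}_{\text{OLS}} = \arg\min_{\beta}\sum_{i=1}^{n}\bigl(y_i - x_i^{\top}\beta\bigr)^{2},
\qquad
\mathbb{E}[\varepsilon_i] = 0, \quad
\operatorname{Var}(\varepsilon_i) = \sigma^{2}, \quad
\operatorname{Cov}(\varepsilon_i,\varepsilon_j) = 0 \ \ (i \neq j).
$$

None of these conditions singles out a particular distributional family such as the Normal.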

Maximum likelihood (ML) can also accommodate different error distributions, but the distribution has to be chosen in advance. If the actual distribution turns out to be different from the assumed one, the ML estimator loses its interpretation as the estimator that maximizes the joint probability density of the data.
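For instance, under the usual i.i.d. Normal assumption (assumed here only for illustration), the log-likelihood being maximized is

$$
\ell(\beta,\sigma^{2}) = -\frac{n}{2}\log\bigl(2\pi\sigma^{2}\bigr) - \frac{1}{2\sigma^{2}}\sum_{i=1}^{n}\bigl(y_i - x_i^{\top}\beta\bigr)^{2},
$$

so maximizing over $\beta$ amounts to minimizing the sum of squared residuals, which is why the estimates coincide with OLS in the question. If one assumed, say, Laplace errors instead, the same recipe would lead to least absolute deviations rather than least squares; the ML objective is dictated entirely by the assumed density.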

Thus we can say that in a particular application ML makes a more stringent assumption about the model errors than OLS does.
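If a numerical check is helpful, here is a minimal sketch using simulated data (the model, sample size, and parameter values are illustrative assumptions, not taken from the question). It fits the same simple regression by OLS and by maximizing the Normal log-likelihood, and the two sets of estimates should agree up to optimizer tolerance:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Simulate a simple linear model y = b0 + b1 * x + error (illustrative values)
n = 200
x = rng.uniform(0, 10, size=n)
y = 2.0 + 0.5 * x + rng.normal(scale=1.5, size=n)
X = np.column_stack([np.ones(n), x])          # design matrix with intercept

# OLS: closed-form least-squares solution, no distributional assumption needed
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# ML under an assumed Normal error distribution: minimize the negative log-likelihood
def neg_loglik(params):
    b0, b1, log_sigma = params
    sigma = np.exp(log_sigma)                 # parameterize sigma on the log scale to keep it positive
    resid = y - (b0 + b1 * x)
    return 0.5 * n * np.log(2 * np.pi * sigma**2) + np.sum(resid**2) / (2 * sigma**2)

res = minimize(neg_loglik, x0=[0.0, 0.0, 0.0], method="BFGS")
beta_ml = res.x[:2]

print("OLS estimates:", beta_ols)             # the two should agree to numerical precision
print("ML  estimates:", beta_ml)
```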
