Least Squares – Difference Between Least Squares Method and Mean Squared Method for Error Calculation

least-squares, linear-model, mse

I am a little confused about the difference between LSE (least squared error) and MSE (mean squared error). How do these two methods differ when calculating the error of a linear regression model?
If possible, please provide the mathematical equation for each.

Best Answer

You can think of the least squares method as the minimization of the mean squared error. For a linear model with $n$ observations $(x_i, y_i)$, the mean squared error at parameters $\beta$ is $$MSE(\beta) = \frac{1}{n}\sum_{i=1}^{n}\left(y_i - x_i^\top\beta\right)^2.$$ The least squares method finds the parameters that minimize this quantity: $$\hat\beta = \arg\min_\beta MSE(\beta).$$ In other words, MSE is the objective function, and least squares is the procedure of minimizing it with respect to $\beta$. Once you find the optimal parameters $\hat\beta$, the value $MSE(\hat\beta)$ is your least squared error. Note that minimizing the sum of squared errors $\sum_{i=1}^{n}(y_i - x_i^\top\beta)^2$ yields the same $\hat\beta$, since the factor $1/n$ does not depend on $\beta$.
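To make the relationship concrete, here is a minimal sketch in Python using NumPy, with hypothetical toy data: least squares produces the parameter estimate $\hat\beta$, and the MSE evaluated at $\hat\beta$ is no larger than the MSE at any perturbed parameter vector.

```python
import numpy as np

# Hypothetical toy data: y is roughly 2 + 3x plus Gaussian noise
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 + 3.0 * x + rng.normal(scale=0.5, size=x.size)

# Design matrix with an intercept column
X = np.column_stack([np.ones_like(x), x])

# Least squares: the beta that minimizes the (mean or sum of) squared errors
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

def mse(beta):
    """Mean squared error of the linear model at parameters beta."""
    residuals = y - X @ beta
    return np.mean(residuals ** 2)

print("beta_hat:", beta_hat)          # close to the true [2, 3]
print("MSE at beta_hat:", mse(beta_hat))
# The least-squares solution attains the minimum MSE:
print(mse(beta_hat) <= mse(beta_hat + np.array([0.1, -0.1])))
```

Evaluating `mse` at any other parameter vector gives a value at least as large as `mse(beta_hat)`, which is exactly the "least squares minimizes the MSE" statement above.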