Solved – Is least squares error related to mean squared error?

Tags: linear-regression, terminology

Linear regression:
Are the following two the same?

  1. (least squares error) divided by the number of data points
  2. MSE, the mean squared error

Here, 1 is used to determine the regression line.
2 is used as an evaluation metric for that same line. (Of course, RMSE is the better one.)

I know that the least squares error is used to BUILD the model,
and the MSE is used to EVALUATE the model.

To me, mean squared error = least squares error divided by N.
This is where I am confused: we use the same kind of calculation for BUILDING and EVALUATING. Of course, this should not be the case. I need an explanation of both these terms.
In short, can the mean squared error be calculated from the least squares error mathematically, or not?

Thanks

Best Answer

As you say yourself, we evaluate the performance of a linear regression with the (R)MSE. Naturally, when we start our regression analysis, we want to maximise that performance, so we set up an objective function that does exactly that.

Maximising performance in linear regression means minimising the (R)MSE. So we try to find regression coefficients $\beta$ that minimise the MSE:

\begin{eqnarray} \arg\min_{\beta} \text{MSE} = \arg\min_{\beta} \underbrace{\frac{1}{N}\sum_{i = 1}^{N}(y_i - x_i^\top\beta)^2}_{\text{MSE}} = \arg\min_{\beta} \underbrace{\sum_{i = 1}^{N}(y_i - x_i^\top\beta)^2}_{\text{sum of squared errors}} \end{eqnarray}

As you suspected, minimising the MSE and minimising the sum of squared errors yield the same solution: the two objectives differ only by the positive constant factor $1/N$, which does not change where the minimum lies. In that sense they may be seen as equivalent.
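A quick numerical sketch of this with NumPy (the data below is made up purely for illustration): we fit OLS coefficients, check that MSE is just the sum of squared errors divided by N, and verify that perturbing the coefficients increases both objectives together.

```python
import numpy as np

# Synthetic toy data (invented for illustration): intercept + one feature
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.normal(size=50)])
beta_true = np.array([2.0, 3.0])
y = X @ beta_true + rng.normal(scale=0.5, size=50)

# OLS coefficients: the least-squares solution minimises the sum of squared errors
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

residuals = y - X @ beta_hat
sse = np.sum(residuals**2)   # "least squares" objective (sum of squared errors)
mse = sse / len(y)           # MSE is simply SSE divided by N
rmse = np.sqrt(mse)          # RMSE, for evaluation

# Scaling the objective by the constant 1/N does not move the minimiser:
# any other beta makes the SSE and the MSE larger at the same time.
beta_other = beta_hat + 0.1
sse_other = np.sum((y - X @ beta_other)**2)
assert sse_other > sse
assert sse_other / len(y) > mse
```

So the same quantity (up to the factor $1/N$) serves both roles: minimised during fitting, reported (often as RMSE) during evaluation.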
