Solved – How to get the value of Mean squared error in a linear regression in R

error, r, regression

For a linear regression model obtained with the R function lm, I would like to know whether it is possible to obtain the mean squared error with a command.

I had the following output of an example:

> lm <- lm(MuscleMAss~Age,data)
> sm<-summary(lm)
> sm

Call:
lm(formula = MuscleMAss ~ Age, data = data)

Residuals:
     Min       1Q   Median       3Q      Max 
-16.1368  -6.1968  -0.5969   6.7607  23.4731 

Coefficients:
            Estimate Std. Error t value Pr(>|t|)    
(Intercept) 156.3466     5.5123   28.36   <2e-16 ***
Age          -1.1900     0.0902  -13.19   <2e-16 ***
---
Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Residual standard error: 8.173 on 58 degrees of freedom
Multiple R-squared:  0.7501,    Adjusted R-squared:  0.7458 
F-statistic: 174.1 on 1 and 58 DF,  p-value: < 2.2e-16

Is Multiple R-squared the same as the sum of squared errors? If not, could someone explain the meaning of Multiple R-squared?

Best Answer

The multiple R-squared that R reports is the coefficient of determination, which is given by the formula

$$ R^2 = 1 - \frac{SS_{\text{res}}}{SS_{\text{tot}}}, $$

where $SS_{\text{res}} = \sum_i (y_i - \hat{y}_i)^2$ is the residual sum of squares and $SS_{\text{tot}} = \sum_i (y_i - \bar{y})^2$ is the total sum of squares. It is the proportion of the variance in the response explained by the model, so it is not the same as the sum of squared errors.

The sum of squared errors is given (thanks to a previous answer) by sum(sm$residuals^2).
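Both quantities are easy to check by hand. A minimal sketch, assuming a fitted model and its summary like the ones above, but using the built-in cars data set since the original MuscleMAss data is not available here:

fit <- lm(dist ~ speed, data = cars)
sm  <- summary(fit)
ss_res <- sum(sm$residuals^2)                    # residual sum of squares
ss_tot <- sum((cars$dist - mean(cars$dist))^2)   # total sum of squares
1 - ss_res / ss_tot                              # equals sm$r.squared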

The mean squared error is given by mean(sm$residuals^2). You could write a function to calculate this, e.g.:

# mean squared error from a summary.lm object
mse <- function(sm) 
    mean(sm$residuals^2)
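
For example, applied to the summary object from the cars sketch above (or to your own sm):

mse(sm)   # mean of the squared residuals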