Regression Analysis – Multiple R-Squared and Adjusted R-Squared for One Variable

linear regression

I understand that adding variables lets a model fit random noise, which inflates $R^2$; this is why, in multiple regression, the adjusted $R^2$ penalizes the number of predictors.

In simple linear regression with one variable, there should not be a difference, should there? However, I get different values that are always quite close but not identical.

Is that difference meaningful? Which value is right? For example:

Multiple R-squared: 0.9118, Adjusted R-squared: 0.9109

Best Answer

To answer your first question: if you have even a single predictor, the $R^2$ and adj. $R^2$ would be different. Check the adj. $R^2$ formula below:

$$\bar R^2 = 1 - (1 - R^2)\frac{n-1}{n-p-1}$$

You can see that even a single predictor ($p = 1$) makes the fraction $\frac{n-1}{n-p-1} = \frac{n-1}{n-2} > 1$, so the adjusted $R^2$ is always slightly smaller than $R^2$.

The difference between the two should be very small for a model with a single predictor and a large sample size, since $\frac{n-1}{n-2} \to 1$ as $n$ grows. Neither value is right or wrong, and it is hard to tell whether the difference is meaningful without more information on the analysis. In any case, which one to report depends on your objectives.
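To see the relationship concretely, here is a minimal sketch in Python (the question's output is from R, but the arithmetic is the same). The simulated data, sample size, and coefficient values are illustrative assumptions, not taken from the question:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative simulated data: one predictor, n = 100 observations.
n = 100
x = rng.normal(size=n)
y = 2.0 * x + rng.normal(scale=0.6, size=n)

# Ordinary least squares fit via the design matrix [1, x].
X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

# Multiple R^2 from the residual and total sums of squares.
ss_res = np.sum(resid**2)
ss_tot = np.sum((y - y.mean())**2)
r2 = 1 - ss_res / ss_tot

# Adjusted R^2 with p = 1 predictor: 1 - (1 - R^2) * (n-1)/(n-p-1).
p = 1
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - p - 1)

print(f"Multiple R-squared: {r2:.4f}, Adjusted R-squared: {adj_r2:.4f}")
```

Because $\frac{n-1}{n-2} > 1$, the adjusted value printed is always slightly below the multiple $R^2$, and the gap shrinks as $n$ grows, matching the small difference in the question's output.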
