If adjusted R squared is superior to R squared, why do statistical software packages continue to report the latter? Are there situations in which a researcher might prefer R squared over adjusted R squared?
Solved – Why report R squared
Best Answer
Under certain conditions (a linear model with an intercept term, estimated by ordinary least squares), $R^2$ measures the proportion of the variance in the dependent variable explained by the regression, which is a natural and easily interpreted quantity. Adjusted $R^2$ does not have this interpretation, because its degrees-of-freedom correction means it is no longer a proportion of explained variance.
So while adjusted $R^2$ has the indisputable advantage of not automatically increasing when regressors are added, you pay a price in terms of how the measure can be interpreted.
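The trade-off can be illustrated numerically. The following is a minimal NumPy sketch (the simulated data, function names, and sample size are my own illustration, not from the original post): plain $R^2 = 1 - SS_{res}/SS_{tot}$ never decreases when a regressor is added, while adjusted $R^2 = 1 - (1 - R^2)\frac{n-1}{n-p-1}$ penalizes the extra parameter.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
x = rng.normal(size=(n, 1))
y = 2.0 * x[:, 0] + rng.normal(size=n)  # one genuinely informative regressor

def r_squared(X, y):
    # OLS fit with an intercept; R^2 = 1 - SS_res / SS_tot
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    ss_res = resid @ resid
    ss_tot = ((y - y.mean()) ** 2).sum()
    return 1 - ss_res / ss_tot

def adjusted_r_squared(r2, n, p):
    # degrees-of-freedom correction; p counts regressors excluding the intercept
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

r2_1 = r_squared(x, y)

# Add a pure-noise regressor: plain R^2 cannot go down, adjusted R^2 may.
x2 = np.column_stack([x, rng.normal(size=n)])
r2_2 = r_squared(x2, y)

adj_1 = adjusted_r_squared(r2_1, n, p=1)
adj_2 = adjusted_r_squared(r2_2, n, p=2)
```

Note that after the correction, `adj_1` and `adj_2` sit strictly below the corresponding $R^2$ values, which is precisely why they can no longer be read as "the proportion of variance explained."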
Note I am not advocating the use of one or the other, just giving a possible reason for why people still use the standard $R^2$.