Solved – Which is better: r-squared or adjusted r-squared

machine learningmultiple regressionr-squaredregression

I just started learning about two statistical measures, r-squared and adjusted r-squared, and I was wondering: why can't we use adjusted r-squared for every regression model, given that it penalizes the model for useless variables, unlike r-squared? Is there any advantage of r-squared over adjusted r-squared under some conditions?

Best Answer

Adjusted $R^2$ is the better measure when you compare models that have different numbers of variables.

The logic behind it is that $R^2$ never decreases when the number of variables increases. Even if you add a useless variable to your model, your $R^2$ will still stay the same or (in practice, almost always) go up slightly. To balance that out, you should always compare models with different numbers of independent variables using adjusted $R^2$.

Adjusted $R^2$ only increases if the new variable improves the model more than would be expected by chance. With $n$ observations and $p$ predictors, it is defined as
$$\bar{R}^2 = 1 - (1 - R^2)\,\frac{n-1}{n-p-1},$$
so each additional predictor raises the penalty factor and must earn its keep by reducing the residual error enough to offset it.
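You can see both effects in a small simulation. The sketch below (plain numpy, with made-up data) fits a regression by least squares, then adds a pure-noise predictor: $R^2$ cannot go down, while adjusted $R^2$ typically does.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
x = rng.normal(size=n)
y = 2 * x + rng.normal(size=n)   # y truly depends only on x
noise = rng.normal(size=n)       # a useless extra predictor

def r2_and_adjusted(X, y):
    """OLS fit with intercept; returns (R^2, adjusted R^2)."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    ss_res = resid @ resid
    ss_tot = (y - y.mean()) @ (y - y.mean())
    r2 = 1 - ss_res / ss_tot
    p = X.shape[1] - 1           # number of predictors (excl. intercept)
    adj = 1 - (1 - r2) * (len(y) - 1) / (len(y) - p - 1)
    return r2, adj

r2_small, adj_small = r2_and_adjusted(x.reshape(-1, 1), y)
r2_big, adj_big = r2_and_adjusted(np.column_stack([x, noise]), y)

print(f"x only:    R^2 = {r2_small:.4f}, adj R^2 = {adj_small:.4f}")
print(f"x + noise: R^2 = {r2_big:.4f}, adj R^2 = {adj_big:.4f}")
```

Adding a column can never increase the residual sum of squares, so `r2_big >= r2_small` always holds; adjusted $R^2$, by contrast, is free to drop when the extra predictor is junk.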