Solved – Standardizing predictors only, not the outcome

multiple regression, regression, regression coefficients, standardization

I understand the advantages of standardizing regression predictors to get standardized coefficients, so that the coefficients are easier to interpret. However, while reading various pages online, I noticed that some people standardize both the predictors AND the outcome to get standardized coefficients. That doesn't make sense to me. I am OK with standardizing the predictors, but when the outcome is standardized too, we are predicting a different value (not the actual Y). Is that right?

I am also fine with the regression results when the predictors are STANDARDIZED and the outcome is CENTRED, but not with the results when the outcome is STANDARDIZED as well.
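To make the concern concrete, here is a minimal sketch (assuming Python with NumPy and made-up data, not taken from the thread): standardizing the outcome only rescales it, so the fully standardized slope is just the raw slope times sd(x)/sd(y), and predictions on the standardized scale can be converted back to the actual Y.

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up data: one predictor and one outcome
x = rng.normal(70, 3, size=200)              # e.g. height in inches
y = 2.0 * x + rng.normal(0, 10, size=200)    # e.g. weight in pounds

# Raw-units regression: slope in pounds per inch
b1_raw, b0_raw = np.polyfit(x, y, 1)

# Standardize BOTH predictor and outcome, then refit
zx = (x - x.mean()) / x.std()
zy = (y - y.mean()) / y.std()
beta = np.polyfit(zx, zy, 1)[0]

# The fully standardized slope is just the raw slope rescaled:
#   beta = b1_raw * sd(x) / sd(y)
print(beta, b1_raw * x.std() / y.std())      # agree up to floating point

# Predictions on the standardized scale convert back to the actual Y
yhat_z = beta * zx                           # fitted values in SD units of y
yhat = yhat_z * y.std() + y.mean()           # fitted values in pounds again
```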

Is that correct?

Best Answer

No, it's not really correct.

The questions about (and advantages and disadvantages of) standardizing variables are very similar for dependent and independent variables, with one rather questionable exception: The idea that standardizing independent variables makes it easier to compare the effects of one variable to another. This advantage is, in my opinion, somewhat illusory, since it depends on the range of data in your sample.
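As a small illustration of that point, here is a sketch (assuming Python with NumPy and simulated data): two predictors with identical per-unit effects get equal standardized coefficients only when the sample spreads them equally; restrict the sampled range of one of them and its standardized coefficient shrinks, even though the underlying effect has not changed.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5_000

def standardized_betas(X, y):
    """Coefficients after standardizing every column of X and y."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    zy = (y - y.mean()) / y.std()
    design = np.column_stack([np.ones(len(zy)), Z])
    return np.linalg.lstsq(design, zy, rcond=None)[0][1:]   # drop the intercept

def simulate(sd1):
    x1 = rng.normal(0, sd1, n)    # the spread of x1 differs between samples
    x2 = rng.normal(0, 2.0, n)
    y = 3.0 * x1 + 3.0 * x2 + rng.normal(0, 5, n)  # identical per-unit effects
    return np.column_stack([x1, x2]), y

# When x1 and x2 happen to have the same spread, the standardized
# coefficients agree, mirroring the identical per-unit effects ...
print(standardized_betas(*simulate(sd1=2.0)))

# ... but in a sample that covers a narrower range of x1, its standardized
# coefficient shrinks relative to x2's, although nothing about the
# underlying effect of x1 has changed.
print(standardized_betas(*simulate(sd1=0.5)))
```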

Although it's a matter of some contention, I am generally against standardizing variables. Variables in their original units are, in my view, easier to interpret than standard deviations of those variables - we often have an intuitive sense of the original units.

For example, if we regress weight on height and leave the units in pounds and inches (or kg and cm, if you're metric), then the coefficient has a direct meaning: "A height difference of 1 inch is related to a weight difference of 2 pounds" (or whatever).

Further, inches and pounds stay the same from one sample to another; standard deviations do not.
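The same point can be seen numerically (again a sketch assuming Python with NumPy and simulated data): fit the weight-on-height regression in two samples whose heights have different spreads, and the raw slope in pounds per inch stays put while the standardized slope moves around.

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_both(height_sd, n=5_000):
    """Return (raw slope in lb per inch, fully standardized slope) for one sample."""
    height = rng.normal(70, height_sd, n)            # inches
    weight = 2.0 * height + rng.normal(0, 15, n)     # pounds, true effect 2 lb/inch
    raw_slope = np.polyfit(height, weight, 1)[0]
    z_h = (height - height.mean()) / height.std()
    z_w = (weight - weight.mean()) / weight.std()
    std_slope = np.polyfit(z_h, z_w, 1)[0]
    return raw_slope, std_slope

# One sample with a wide spread of heights, one with a narrow spread:
# the raw slope is about 2 lb/inch in both, the standardized slope is not.
print(fit_both(height_sd=4.0))
print(fit_both(height_sd=1.5))
```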