Solved – Robust standard errors in multiple regression

Tags: regression, robust, robust-standard-errors, spss

I use Andrew F. Hayes' macro for SPSS (HCREG at http://www.afhayes.com/spss-sas-and-mplus-macros-and-code.html) to perform multiple regression analyses with robust standard errors.

The information in its output is limited compared to what SPSS provides for an ordinary multiple regression. For example, there are no standardized coefficients (Beta) for the predictors, and there is no adjusted R-squared.

Can I assume that the standardized coefficients will be the same as in the model without robust standard errors? If not, does anyone know how I can go about computing them by hand?

I have the same question for the adjusted R-squared.

Any help would be greatly appreciated!
Thank you very much in advance.

Best Answer

The adjustments are only to the standard errors of the regression coefficients, not to the point estimates of the coefficients themselves. So you can gather the requested statistics from the traditional OLS output in SPSS. The Hayes and Cai (2007) paper elaborates on this as well.
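To see why this works, here is a minimal NumPy sketch (a hypothetical illustration, not the HCREG macro itself) showing that HC0 (Huber-White) robust standard errors are built from the very same coefficient estimates as the classical fit; only the standard errors change. The standardized betas and adjusted R-squared are then computed from that single OLS fit:

```python
import numpy as np

# Simulated data for illustration only
rng = np.random.default_rng(0)
n, p = 200, 2
X = rng.normal(size=(n, p))
y = 1.0 + X @ np.array([0.5, -0.3]) + rng.normal(size=n)

# Design matrix with intercept and the OLS point estimates
Xd = np.column_stack([np.ones(n), X])
XtX_inv = np.linalg.inv(Xd.T @ Xd)
beta = XtX_inv @ Xd.T @ y
resid = y - Xd @ beta

# Classical (homoskedastic) standard errors
sigma2 = resid @ resid / (n - p - 1)
se_classical = np.sqrt(np.diag(sigma2 * XtX_inv))

# HC0 robust standard errors: a different "sandwich" covariance,
# but the same beta -- the point estimates are untouched
meat = Xd.T @ (Xd * resid[:, None] ** 2)
se_hc0 = np.sqrt(np.diag(XtX_inv @ meat @ XtX_inv))

# Standardized coefficients from the same point estimates:
# beta_std = beta * sd(x) / sd(y)   (intercept excluded)
beta_std = beta[1:] * X.std(axis=0, ddof=1) / y.std(ddof=1)

# Adjusted R-squared, also from the ordinary OLS fit
r2 = 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - p - 1)
```

So if SPSS's standard REGRESSION output reports the Betas and adjusted R-squared, those values carry over unchanged to the robust-SE model; only the standard errors, t-statistics, and p-values differ.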

To note, perhaps it is a difference between fields, but I almost always see these standard errors referred to by their originators' names (Huber, White, and Eicker). There are other types of "robust" estimates and standard errors, though (e.g., estimated by the jackknife or bootstrapping). Sometimes these other estimators do yield different point estimates for the coefficients as well as different standard errors (though not always).
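As a sketch of the bootstrap alternative mentioned above (again hypothetical, not tied to any SPSS macro): resampling cases and refitting yields a distribution of coefficient estimates whose spread serves as a standard error, and summaries of the resampled fits need not coincide exactly with the single OLS point estimate:

```python
import numpy as np

# Simulated heteroskedastic data for illustration only
rng = np.random.default_rng(1)
n = 150
x = rng.normal(size=n)
y = 2.0 + 0.8 * x + rng.normal(size=n) * (1 + np.abs(x))

Xd = np.column_stack([np.ones(n), x])
beta_ols = np.linalg.lstsq(Xd, y, rcond=None)[0]

# Case-resampling bootstrap: refit the model on resampled rows
B = 2000
boot = np.empty((B, 2))
for b in range(B):
    idx = rng.integers(0, n, size=n)
    boot[b] = np.linalg.lstsq(Xd[idx], y[idx], rcond=None)[0]

se_boot = boot.std(axis=0, ddof=1)   # bootstrap standard errors
beta_boot_mean = boot.mean(axis=0)   # close to, but not identical to, beta_ols
```

Here `se_boot` plays the role of the robust standard errors, while the bootstrap average of the coefficients can drift slightly from the original OLS estimates, unlike the HC sandwich estimators, which reuse the OLS coefficients exactly.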