R – Why Not Use Orthogonal Polynomials When Fitting Regressions?

Tags: polynomial, r, regression

In general, I'm wondering whether it is ever better not to use orthogonal polynomials when fitting a regression with higher-order terms. In particular, I'm wondering about the use of R:

If poly() with raw = FALSE produces the same fitted values as poly() with raw = TRUE, and poly() with raw = FALSE avoids some of the problems associated with polynomial regressions (such as collinearity among the power terms), then should poly() with raw = FALSE always be used for fitting polynomial regressions? In what circumstances would it be better not to use poly()?
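A minimal sketch of that equivalence, using simulated data (the variable names and values here are purely illustrative): the two parameterizations give identical fitted values and differ only in how the coefficients are expressed.

```r
## Simulated example: same fitted values, different coefficient parameterization
set.seed(1)
x <- runif(100, 0, 10)
y <- 2 + 0.5 * x - 0.1 * x^2 + rnorm(100)

fit_orth <- lm(y ~ poly(x, 2))              # orthogonal polynomials (raw = FALSE is the default)
fit_raw  <- lm(y ~ poly(x, 2, raw = TRUE))  # raw powers of x

all.equal(fitted(fit_orth), fitted(fit_raw))  # TRUE
coef(fit_orth)  # coefficients of the orthogonal basis, not of x and x^2
coef(fit_raw)   # directly the intercept, x, and x^2 coefficients
```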

Best Answer

Ever a reason? Sure; likely several.

Consider, for example, a case where I am interested in the values of the raw coefficients (say, to compare them with hypothesized values) and collinearity isn't a particular problem. It's much the same reason why I often don't mean-center in ordinary linear regression (mean-centering being the degree-one analogue of the orthogonal polynomial).
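A sketch of that use case (the hypothesized value below is invented purely for illustration, and it reuses the simulated x and y from the block above): with raw = TRUE the coefficients are on the original scale of x, so they can be compared against hypothesized values directly, for example with an approximate t-statistic.

```r
## Comparing a raw coefficient with a hypothesized value (illustrative only)
fit_raw <- lm(y ~ poly(x, 2, raw = TRUE))
summary(fit_raw)$coefficients

## e.g. test H0: quadratic coefficient = -0.1 (an assumed hypothesized value)
b  <- coef(fit_raw)["poly(x, 2, raw = TRUE)2"]
se <- sqrt(diag(vcov(fit_raw)))["poly(x, 2, raw = TRUE)2"]
(b - (-0.1)) / se   # approximate t-statistic for the comparison
```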

These aren't things you can't deal with via orthogonal polynomials; it's more a matter of convenience, and convenience is a big reason why I do a lot of things.

That said, I lean toward orthogonal polynomials in many cases when fitting polynomials, since they do have some distinct benefits.
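One such benefit, as a small sketch with freshly simulated data: raw powers of x can be nearly collinear, while the columns returned by poly() with raw = FALSE are orthogonal by construction, which keeps the coefficient estimates and their standard errors numerically well behaved.

```r
## Collinearity of raw powers vs. orthogonal polynomial columns
x <- runif(100, 10, 20)   # predictor away from zero, where raw powers correlate strongly
cor(x, x^2)               # very close to 1 for raw powers
P <- poly(x, 2)           # orthogonal basis (raw = FALSE)
cor(P[, 1], P[, 2])       # essentially 0
```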