Regression – Difference Between Regression Coefficients and Partial Regression Coefficients

multiple regression, regression, regression coefficients, terminology

I've read in Abdi (2003) that

When the independent variables are pairwise orthogonal, the effect of
each of them in the regression is assessed by computing the slope of
the regression between this independent variable and the dependent
variable. In this case, (i.e., orthogonality of the IV’s), the partial
regression coefficients are equal to the regression coefficients. In all
other cases, the regression coefficient will differ from the partial
regression coefficients.

However, the document did not previously explain what the difference between these two types of regression coefficients is.

Abdi, H. (2003). Partial regression coefficients. In Lewis-Beck, M., Bryman, A., & Futing, T. (Eds.), Encyclopedia of Social Science Research Methods. Thousand Oaks, CA: SAGE Publications.

Best Answer

"Partial regression coefficients" are the slope coefficients ($\beta_j$s) in a multiple regression model. By "regression coefficients" (i.e., without the "partial") the author means the slope coefficient in a simple (only one variable) regression model. If you have multiple predictor / explanatory variables, and you run both a set of simple regressions, and a multiple regression with all of them, you will find that the coefficient for a particular variable, $X_j$, will always differ between its simple regression model and the multiple regression model, unless $X_j$ is pairwise orthogonal with all other variables in the set. In that case, $\hat\beta_{j\ {\rm simple}} = \hat\beta_{j\ {\rm multiple}}$. For a fuller understanding of this topic, it may help you to read my answer here: Is there a difference between 'controlling for' and 'ignoring' other variables in multiple regression?