Solved – the regression coefficient of a constant predictor in linear regression

regression

I'm performing a multiple linear regression with 9 predictors and 1 response variable. It so happens that 3 of the 9 predictors are constant, i.e. each of them takes the same value for every sample.

I would expect the resulting coefficients for these variables to be zero; however, this is only the case for 2 of the 3 constant predictors. For the third, the coefficient is non-zero even though the variable has zero variance (I double-checked this).
Should this be possible?

Best Answer

Sure – it just amounts to adding a constant to the model in a different way:

$$\beta^T X + c = \sum_i \beta_i x_i + c = \sum_{i \notin C} \beta_i x_i + \left( \sum_{i \in C} \beta_i x_i + c \right), $$

where $C$ is the set of constant predictors, so that the parenthesized term is a constant.

This makes the design matrix rank-deficient, so the fit suffers from perfect multicollinearity: under ordinary least squares, how the constant term is split between the intercept $c$ and the $\beta_i$ for $i \in C$ is arbitrary, since every split produces the same fitted values. It is still a well-defined model, though. If you apply any regularization to the coefficients (leaving the intercept unpenalized), these $\beta_i$ should all shrink to zero.
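A minimal numerical sketch of both effects, using made-up data (one informative predictor plus one constant predictor): `np.linalg.lstsq` returns the minimum-norm solution for a rank-deficient system, which splits the constant part of the model between the intercept column and the constant column, so the constant predictor gets a non-zero coefficient. A simple ridge penalty with an unpenalized intercept (implemented here by centering) sends that coefficient to zero.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
const = np.full(n, 3.0)      # constant predictor (zero variance)
y = 1.0 + 2.0 * x1           # noiseless, for a clean illustration

# OLS with an explicit intercept column. The ones column and the
# constant column are collinear, so lstsq returns the minimum-norm
# solution: the true intercept of 1 is split between them, and the
# constant predictor's coefficient comes out non-zero (here 0.3).
X = np.column_stack([np.ones(n), x1, const])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)  # ≈ [0.1, 2.0, 0.3]

# Ridge with an unpenalized intercept, via centering: the centered
# constant column is all zeros, so its coefficient is exactly zero
# and the intercept absorbs the whole constant.
P = np.column_stack([x1, const])
Pc = P - P.mean(axis=0)
yc = y - y.mean()
alpha = 1e-3
w = np.linalg.solve(Pc.T @ Pc + alpha * np.eye(2), Pc.T @ yc)
intercept = y.mean() - P.mean(axis=0) @ w
print(w, intercept)  # constant predictor's coefficient is 0
```

Which non-zero value you see in practice depends on the solver: a minimum-norm solver spreads the constant across the collinear columns, while other solvers (or pivoting strategies) may zero out some of them, which would explain seeing zeros for 2 of the 3 constant predictors but not the third.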