Solved – Ridge Regression in R where coefficients are penalized toward numbers other than zero

Tags: bayesian, r, ridge regression

Is it possible to penalize coefficients toward a number other than zero in a ridge regression in R?

For example, let's say I have dependent variable Y and independent variables X1, X2, X3, and X4. Because the independent variables are multicollinear, ridge regression is appropriate. But say I'm fairly certain that the coefficient of X1 is near 5, X2 near 1, X3 near -1, and X4 near -5.

Is there a ridge regression package and method in R that lets me penalize the coefficients toward those numbers instead of 0? I'd love to see an example in R with my example data, if possible. Thank you.

Best Answer

A simple way to do this is to subtract each coefficient's "centering value" times its associated variable from both sides of the regression equation, moving those terms to the left-hand side. To go with your example,

$Y = \beta_1X_1 + \beta_2X_2 + \beta_3X_3 + \beta_4X_4 + e$

Assume the coefficient values should be centered at (5,1,-1,-5) respectively. Then:

$Y - 5X_1 -X_2 +X_3 +5X_4 = (\beta_1-5)X_1 + (\beta_2-1)X_2 + (\beta_3+1)X_3 + (\beta_4+5)X_4 + e$

and, redefining terms, you have:

$Y^* = \beta_1^*X_1 + \beta_2^*X_2 + \beta_3^*X_3 + \beta_4^*X_4 + e$

A standard ridge regression shrinks the $\beta_i^*$ towards 0, which is equivalent to shrinking the original $\beta_i$ towards the specified centering values. To see this, consider a fully shrunk $\beta_4^* = 0$: then $\beta_4 + 5 = 0$, and therefore $\beta_4 = -5$. Shrinkage accomplished!
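Here is a minimal sketch of the idea in R using glmnet for the ridge fit (alpha = 0). The question provides no data, so the data below are simulated with correlated predictors to mimic the multicollinearity described; names like `centers` are purely illustrative.

```r
library(glmnet)
library(MASS)  # for mvrnorm, to simulate correlated predictors

set.seed(1)
n <- 100

# Simulated data: correlated X's to mimic multicollinearity (assumption)
Sigma <- matrix(0.8, 4, 4); diag(Sigma) <- 1
X <- mvrnorm(n, mu = rep(0, 4), Sigma = Sigma)
colnames(X) <- paste0("X", 1:4)
beta_true <- c(5, 1, -1, -5)
Y <- drop(X %*% beta_true + rnorm(n))

# Prior centers for the coefficients, as in the question
centers <- c(5, 1, -1, -5)

# Shift the response: Y* = Y - X %*% centers
Y_star <- Y - drop(X %*% centers)

# Standard ridge regression on the shifted response shrinks the
# beta* toward 0, i.e., the original beta toward `centers`
fit <- cv.glmnet(X, Y_star, alpha = 0)
beta_star <- as.vector(coef(fit, s = "lambda.min"))[-1]  # drop intercept

# Recover coefficients on the original scale
beta_hat <- beta_star + centers
beta_hat
```

With heavy penalization (large lambda), `beta_star` is driven toward 0 and `beta_hat` toward `centers`; with light penalization it approaches the ordinary least-squares fit, exactly as the algebra above predicts.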
