What are the benefits and disadvantages of Lasso, Ridge, Elastic Net, and Non-Negative Garrote regularization techniques?

linear model, machine learning, regression, regularization

I am implementing these four regularization techniques for linear regression of stock data in MATLAB, but I noticed that the elastic net penalty is just the sum of the Ridge and Lasso penalties, and I don't fully understand how exactly the Non-Negative Garrote works as a regularization technique.
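
To make the comparison concrete, these are the penalized least-squares objectives as I understand them (my notation, with tuning parameters $\lambda, \lambda_1, \lambda_2 \ge 0$):

$$\min_{\beta}\;\|y - X\beta\|_2^2 + \lambda\|\beta\|_1 \qquad \text{(lasso)}$$
$$\min_{\beta}\;\|y - X\beta\|_2^2 + \lambda\|\beta\|_2^2 \qquad \text{(ridge)}$$
$$\min_{\beta}\;\|y - X\beta\|_2^2 + \lambda_1\|\beta\|_1 + \lambda_2\|\beta\|_2^2 \qquad \text{(elastic net)}$$

so the elastic net penalty really is just the lasso and ridge penalties added together.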

How does the garrote work, and why wouldn't you just always use elastic net over lasso and ridge (aside from computational complexity)?

Best Answer

I don't know about the garrote, but LASSO is preferred over ridge regression when the solution is believed to be sparse, because the L1 penalty promotes sparsity while the L2 penalty does not. Elastic net is preferred over LASSO because it can handle situations where the number of features exceeds the number of samples, and it copes better with correlated features, where LASSO tends to behave erratically. The additional L2 term acts as a preconditioner or stabilizer by introducing strong convexity, though you'll have to read about convex optimization to appreciate that. I think the original elastic net paper by Zou and Hastie explains all of this clearly and is worth reading.
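
To see the sparsity difference concretely, here is a minimal MATLAB sketch (assuming the Statistics and Machine Learning Toolbox, whose lasso function also covers the elastic net via its 'Alpha' parameter; the data are synthetic and the penalty levels are arbitrary rather than tuned):

% Synthetic example: 50 samples, 20 predictors, only 3 truly nonzero coefficients
rng(0);
n = 50; p = 20;
X = randn(n, p);
betaTrue = zeros(p, 1);
betaTrue(1:3) = [3; -2; 1.5];
y = X * betaTrue + 0.5 * randn(n, 1);

lambda = 0.1;

% Lasso (pure L1 penalty): many coefficients are driven exactly to zero
bLasso = lasso(X, y, 'Lambda', lambda);

% Elastic net: 'Alpha' in (0,1) mixes the L1 and L2 penalties
bEnet = lasso(X, y, 'Lambda', lambda, 'Alpha', 0.5);

% Ridge (L2 penalty): coefficients shrink toward zero but stay nonzero.
% Note: ridge takes y first, its penalty parameter is on a different scale
% than lasso's Lambda (the value below is only illustrative), and the
% trailing 0 returns coefficients on the original scale with the
% intercept as the first element.
bRidge = ridge(y, X, lambda * n, 0);

fprintf('Nonzero coefficients - lasso: %d, elastic net: %d, ridge: %d\n', ...
    nnz(bLasso), nnz(bEnet), nnz(bRidge(2:end)));

On data like this, the L1-penalized fits usually set most of the irrelevant coefficients exactly to zero, while the ridge fit keeps all of them nonzero, only shrunk; that is the sparsity argument in practice.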