Linear Regression – Minimizing MAD in Linear Regression Using Sklearn


The standard sklearn linear regression class finds an approximate linear relationship between the variate and the covariates that minimises the mean squared error (MSE). Specifically, let $N$ be the number of observations and let us ignore the intercept for simplicity. Let $y_j$ be the variate value of the $j$-th observation and $x_{1,j}, \dots, x_{n,j}$ be the values of the $n$ covariates of the $j$-th observation. The linear relationship is of the form
$$ y = \beta_1 x_1 + \dots + \beta_n x_n,$$
where the coefficients $\beta_1, \dots, \beta_n$ are given by
$$\beta_1, \dots, \beta_n = \underset{\tilde\beta_1, \dots, \tilde\beta_n}{\mathrm{argmin}} \left( \sum_{j = 1}^N \left( y_j - \tilde\beta_1 x_{1, j} - \dots - \tilde\beta_n x_{n, j}\right)^2 \right).$$

I now wish to find the coefficients that minimise the mean absolute deviation (MAD) instead of the mean squared error. Namely, I want the coefficients given by
$$\beta_1, \dots, \beta_n = \underset{\tilde\beta_1, \dots, \tilde\beta_n}{\mathrm{argmin}} \left( \sum_{j = 1}^N \left| y_j - \tilde\beta_1 x_{1, j} - \dots - \tilde\beta_n x_{n, j}\right| \right).$$

I understand that, in sharp contrast to the MSE case, the lack of differentiability of the absolute value function at $0$ implies there is no analytic solution for the MAD case. But the latter is still a convex optimisation problem, and, according to this answer, it can be easily solved by means of linear programming.
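To make the linear-programming route concrete, here is a minimal sketch using `scipy.optimize.linprog` (assuming SciPy is available; the helper name `lad_fit` is mine). Each observation gets a slack variable $t_j \ge |y_j - x_j \cdot \beta|$, and the LP minimises $\sum_j t_j$:

```python
import numpy as np
from scipy.optimize import linprog

def lad_fit(X, y):
    """Least-absolute-deviations fit via linear programming.

    Minimises sum_j |y_j - x_j . beta| by introducing one slack
    variable t_j per observation with t_j >= |residual_j|.
    """
    N, n = X.shape
    # decision vector z = [beta (n entries, free), t (N entries, >= 0)]
    c = np.concatenate([np.zeros(n), np.ones(N)])
    # X beta - t <= y  and  -X beta - t <= -y  together encode t >= |y - X beta|
    A_ub = np.block([[X, -np.eye(N)],
                     [-X, -np.eye(N)]])
    b_ub = np.concatenate([y, -y])
    bounds = [(None, None)] * n + [(0, None)] * N
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:n]

# toy check: with noiseless linear data the LAD fit recovers the coefficients
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))
beta_true = np.array([2.0, -1.0])
y = X @ beta_true
beta_hat = lad_fit(X, y)
print(beta_hat)
```

This is a sketch, not production code: for large $N$ the dense constraint matrix should be replaced by a sparse one.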

Is it possible to implement this linear regression in sklearn? What about using other statistics toolkits?

Best Answer

The expected MAD is minimized by the median of the distribution (Hanley, 2001, The American Statistician; see also Why does minimizing the MAE lead to forecasting the median and not the mean?). Therefore, you are looking for a model that will yield the conditional median, instead of the conditional mean.

This is a special case of quantile regression, specifically for the 50% quantile. Roger Koenker is the main guru for quantile regression; see in particular his book Quantile Regression.

There are ways to do quantile regression in Python. This tutorial may be helpful. If you are open to using R, you can use the quantreg package.
