Solved – Linear regression with Laplace errors

laplace-distribution, regression

Consider a linear regression model:
$$
y_i = \mathbf x_i \cdot \boldsymbol \beta + \varepsilon _i, \, i=1,\ldots ,n,
$$
where the $\varepsilon _i \sim \mathcal L(0, b)$, that is, Laplace-distributed with mean $0$ and scale parameter $b$, are mutually independent. Consider maximum likelihood estimation of the unknown parameter $\boldsymbol \beta$; since the Laplace density is $p(\varepsilon) = \frac{1}{2b} e^{-|\varepsilon|/b}$, the negative log-likelihood is
$$
-\log p(\mathbf y \mid \mathbf X, \boldsymbol \beta, b) = n\log (2b) + \frac{1}{b}\sum _{i=1}^n |\mathbf x_i \cdot \boldsymbol \beta - y_i|,
$$
from which
$$
\hat{\boldsymbol \beta}_{\mathrm {ML}} = \arg\min_{\boldsymbol \beta \in \mathbb R^m} \sum _{i=1}^n |\mathbf x_i \cdot \boldsymbol \beta - y_i|.
$$

How can one find the distribution of the residuals $\mathbf y - \mathbf X\hat{\boldsymbol \beta}_{\mathrm {ML}}$ in this model?

Best Answer

The errors (of which the residuals are estimates) are assumed to follow a double-exponential (Laplace) distribution. If you are fitting actual x and y data points, do it numerically. First compute $\hat{\boldsymbol\beta}_{\mathrm{ML}}$ for the points as a whole using the formula you posted above; this determines a line through the points. Then subtract the line's y value at each x from that point's observed y value; that difference is the residual for that point. The residuals of all the points can be used to construct a histogram that gives you the empirical distribution of the residuals.
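For concreteness, here is a minimal Python sketch of that procedure (the simulated data, the scale $b = 0.5$, and the use of statsmodels' QuantReg are my own illustrative choices, not from the original post). Median regression with $q = 0.5$ minimizes exactly the sum of absolute deviations in the question, i.e. the least absolute deviations (LAD) criterion:

```python
import numpy as np
import statsmodels.api as sm
import matplotlib.pyplot as plt

# Simulate data from y_i = x_i . beta + eps_i with Laplace errors.
# (beta_true and b = 0.5 are arbitrary illustrative choices.)
rng = np.random.default_rng(0)
n = 200
x = rng.uniform(-3, 3, size=n)
X = sm.add_constant(x)                      # design matrix with intercept
beta_true = np.array([1.0, 2.0])
y = X @ beta_true + rng.laplace(loc=0.0, scale=0.5, size=n)

# Median (q = 0.5) regression minimizes sum_i |y_i - x_i . beta|,
# which is exactly the Laplace ML criterion from the question.
fit = sm.QuantReg(y, X).fit(q=0.5)
beta_hat = fit.params
print("beta_hat_ML:", beta_hat)

# Residuals y - X beta_hat_ML, then a histogram of their
# empirical distribution.
resid = y - X @ beta_hat
plt.hist(resid, bins=30, density=True)
plt.xlabel("residual")
plt.ylabel("density")
plt.show()
```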

There is a good mathematical article on it by Yang (2014).

--Lee
