Solved – Linear regression for time-series prediction

Tags: lasso · regression · ridge regression · time series

Say we have $N$ time series $X_t^i$ for $i=1,\dots,N$, and we want to predict a separate time series $Y_t$. Let's consider the following model: $Y_t = \sum_{i} \beta_i X_{t-1}^i$

I am just trying to figure out when such a model makes sense.

  • Does this model have a name? Is there an interpretation in terms of random processes? Are there any known applications?
  • Does it make sense to solve for the $\beta_i$ using regression approaches (ridge, lasso, etc.) on the training pairs $\left((x_{t-1}^1, \dots, x_{t-1}^N),\, y_t\right)$?
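As a concrete sketch of the second bullet, here is a minimal ridge fit on lagged features using numpy only. The data, weights, and regularization strength are all illustrative assumptions, chosen so the recovered coefficients can be compared against the ones used to generate $Y_t$:

```python
import numpy as np

# Illustrative setup: N = 3 predictor series driving Y_t through known
# weights, so we can check that the regression recovers them.
rng = np.random.default_rng(0)
T, N = 500, 3
X = rng.normal(size=(T, N))                     # X_t^i for t = 0..T-1
true_beta = np.array([0.5, -1.0, 2.0])          # assumed, for the demo

# Training pairs (x_{t-1}, y_t): each row of X_lag is the lagged vector
# (X_{t-1}^1, ..., X_{t-1}^N) paired with the next-step target y_t.
X_lag = X[:-1]
y = X_lag @ true_beta + 0.1 * rng.normal(size=T - 1)

# Ridge solution in closed form: beta = (X'X + lam I)^{-1} X'y
lam = 1e-2
beta_hat = np.linalg.solve(X_lag.T @ X_lag + lam * np.eye(N), X_lag.T @ y)
print(beta_hat)   # should be close to true_beta
```

Swapping in lasso would replace the closed-form solve with an iterative solver (e.g. coordinate descent), since the $\ell_1$ penalty has no closed form.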

Best Answer

I'd look at this as a least-squares minimization problem, so you're trying to minimize:

$$ \langle \epsilon^2 \rangle_t = \langle ( Y_t - \sum_i \beta_i X_{t-1}^{i})^2 \rangle_t = \langle (Y_t - \vec{\beta} \cdot \vec{X}_{t-1})^2 \rangle_t$$

I tend to interpret this type of problem as a Gaussian statistics problem, since the solution only involves first and second moments. The idea is that there is a joint Gaussian distribution $p(y, x_1, x_2, \dots)$ with an arbitrary correlation matrix; you estimate that correlation matrix from the data, and then compute the conditional distribution $p(y \mid X) = p(y, X)/p(X)$, whose mean is linear in $X$ with coefficients $\vec{\beta}$.

In some contexts, this type of problem may be referred to as a Wiener Filtering problem.
