Does stationarizing remove information from a time series?

Tags: arima, autocorrelation, data-transformations, stationarity, time-series

I understand that most time series models work on the assumption that the underlying series is stationary (i.e., independent samples with constant mean and variance). But won't stationarizing a series take away a lot of meaningful information/insights from it? How are the predictions valid then?

For example, if the sales value at x=5 is 500 before stationarizing, then after you stationarize, the value at x=5 no longer represents the actual sales amount, does it? And because we train a model on this stationarized data, even the predictions do not give the actual sales value, right? How do we interpret time series predictions that are not in the original scale?

I'm just trying to understand this very fundamental concept. Please help me understand.

Best Answer

Here are some points:

  1. A stationary series is not necessarily i.i.d. E.g. ARMA models are stationary for a wide range of parameter values and yield patterns far from i.i.d.
  2. The whole point with data transformations is that they can be done and then undone; absent that they would indeed be of limited use. E.g. instead of modelling an integrated process $y_t=\sum_{\tau=0}^{t}x_\tau$ directly, we can transform it into a stationary process $x_t$ by taking first differences, model it and then transform back by cumulatively summing $x_t$ as in the definition of $y_t$. If we want to predict $y_{t+1}$ given $y_t,y_{t-1},\dots,y_0$, we can work on transformed data $x_t,x_{t-1},\dots,x_0$, obtain the prediction $\hat x_{t+1}$ and then trivially obtain the prediction of interest as $\hat y_{t+1}=y_t+\hat x_{t+1}$.
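Point 1 can be illustrated with a short simulation (a sketch in Python with NumPy; the coefficient 0.8 is just an illustrative choice): an AR(1) process with $|\varphi|<1$ is stationary, yet consecutive values are strongly correlated, which an i.i.d. series would not allow.

```python
import numpy as np

rng = np.random.default_rng(0)
phi = 0.8          # AR(1) coefficient; |phi| < 1 implies stationarity
n = 10_000

# Simulate x_t = phi * x_{t-1} + e_t with standard normal innovations
x = np.zeros(n)
e = rng.standard_normal(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + e[t]

# Lag-1 sample autocorrelation: close to phi, far from the 0
# that an i.i.d. series would exhibit
lag1 = np.corrcoef(x[:-1], x[1:])[0, 1]
print(round(lag1, 2))
```

So the series has constant mean and variance (after the initial transient) and is stationary, but it is far from being a sequence of independent draws.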
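The transform-and-undo round trip in point 2 can be sketched in a few lines. Note the forecast of the differenced series here is a deliberately simple placeholder (the sample mean of past differences) standing in for a fitted ARMA model; the point is only that the prediction is mapped back to the original sales scale:

```python
import numpy as np

rng = np.random.default_rng(1)

# A nonstationary "sales" series: random walk with drift.
x = 2.0 + rng.standard_normal(200)  # stationary increments x_t
y = np.cumsum(x)                    # integrated process y_t = sum of x_tau

# Transform: first differences recover the stationary series.
x_recovered = np.diff(y)

# Model the stationary series; the sample mean stands in for
# a fitted ARMA forecast x_hat_{t+1}.
x_hat_next = x_recovered.mean()

# Undo the transform: y_hat_{t+1} = y_t + x_hat_{t+1},
# a prediction on the ORIGINAL sales scale.
y_hat_next = y[-1] + x_hat_next
print(y_hat_next)
```

Nothing is lost: differencing and cumulative summation are inverses of each other (up to the known starting value $y_0$), so any forecast of the stationary series translates directly into a forecast of the original one.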