I have univariate time series data and want to do multi-step prediction. I came across this question, which explains one-step prediction for time series, but I am interested in multi-step-ahead prediction. For example, typical univariate time series data looks like this:
time value
---- ------
t1 a1
t2 a2
..........
..........
t100 a100
Suppose I want a 3-step-ahead prediction. Can I frame my problem like this:
TrainX TrainY
[a1,a2,a3,a4,a5,a6] -> [a7,a8,a9]
[a2,a3,a4,a5,a6,a7] -> [a8,a9,a10]
[a3,a4,a5,a6,a7,a8] -> [a9,a10,a11]
.................. ...........
.................. ...........
(I am using Keras with TensorFlow as the backend.)
The first layer has 50 neurons and expects 6 inputs; the hidden layer has 30 neurons; and the output layer has 3 neurons (i.e., it outputs three time series values at once).
from keras import regularizers
from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(50, input_dim=6, activation='relu', kernel_regularizer=regularizers.l2(0.01)))
model.add(Dense(30, activation='relu', kernel_regularizer=regularizers.l2(0.01)))
model.add(Dense(3))
model.compile(loss='mean_squared_error', optimizer='adam')
model.fit(TrainX, TrainY, epochs=300, batch_size=16)
Is this a valid model? Am I missing something?
Best Answer
This seems reasonable: it is a rolling time window, similar to this question. As for predicting 3 time steps instead of 1, that is fine; you can simply output a vector, as you have done.
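To make the rolling-window framing concrete, here is a minimal sketch of how the (window -> horizon) pairs from the question's table can be built from a raw series. The helper name `make_windows` and the synthetic series are my own choices for illustration, not something from the question:

```python
import numpy as np

def make_windows(series, window=6, horizon=3):
    """Turn a 1-D series into overlapping (window -> horizon) pairs,
    e.g. [a1..a6] -> [a7, a8, a9], [a2..a7] -> [a8, a9, a10], ..."""
    X, y = [], []
    for i in range(len(series) - window - horizon + 1):
        X.append(series[i:i + window])              # 6 input values
        y.append(series[i + window:i + window + horizon])  # next 3 values
    return np.array(X), np.array(y)

series = np.arange(1, 101, dtype=float)  # stand-in for a1..a100
TrainX, TrainY = make_windows(series)
print(TrainX.shape, TrainY.shape)  # (92, 6) (92, 3)
```

With a 100-point series this yields 92 training pairs, each with a 6-value input window and a 3-value target vector, matching the shape the `Dense(3)` output layer expects.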
That said, a recurrent neural network (e.g., an LSTM) is better suited to this kind of sequence problem. Even so, the window approach sometimes performs better in practice: although a correctly fit LSTM should perform at least as well, the window model is easier for a human modeler to fit. CNNs are also sometimes used for this kind of analysis and give performance comparable to RNNs; like RNNs, they share parameters across the input.
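For comparison, here is a minimal LSTM sketch for the same 6-in, 3-out framing. The layer sizes and the placeholder random data are my own illustrative choices; note that LSTM layers expect 3-D input of shape (samples, timesteps, features), so the window of 6 values becomes 6 timesteps of 1 feature each:

```python
import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense

# Placeholder data with the same shapes as the windowed training set:
# 92 samples, each a window of 6 timesteps with 1 feature, and a 3-value target.
TrainX = np.random.rand(92, 6, 1)
TrainY = np.random.rand(92, 3)

model = Sequential()
model.add(LSTM(50, input_shape=(6, 1)))  # reads the window sequentially
model.add(Dense(3))                      # 3-step-ahead vector output
model.compile(loss='mean_squared_error', optimizer='adam')
model.fit(TrainX, TrainY, epochs=2, batch_size=16, verbose=0)
```

The only structural change from the feed-forward model is the extra feature dimension on the input and the recurrent first layer; the vector-output trick for multi-step prediction is the same.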