Solved – Which neural network architecture for time series classification

classification, neural networks, time series

I have time series consisting of 15 time points, each with around 15 values/features. Each time series is one sample, and I have thousands of samples. The possible output is either 0 or 1, so this is binary classification.

Currently I feed all 15×15 = 225 values (per time series) in simultaneously as 225 inputs, without distinguishing or weighting them by time or any other aspect, into a standard backprop net (3 hidden layers, though fewer also work; Matlab Patternnet with scaled conjugate gradient), and I get the best results this way (it easily reaches the desired performance and gradient). Whenever I feed them in as vectors instead, e.g. 15 input vectors with 15 values each, which would represent the time series much better, the results get much worse. This is the first problem I do not understand.
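For reference, a minimal sketch of this case-1 setup, assuming the MATLAB Neural Network Toolbox; the hidden-layer sizes and the random data below are placeholders, not the actual values used:

```matlab
% Minimal sketch of the case-1 setup (MATLAB Neural Network Toolbox).
% Hidden-layer sizes and the random data are placeholders.
X = rand(225, 30000);               % 225 flattened features per sample
T = double(rand(1, 30000) > 0.5);   % binary targets (0 or 1)

net = patternnet([10 10 10], 'trainscg');  % 3 hidden layers, scaled conjugate gradient
[net, tr] = train(net, X, T);
Y = net(X);                         % outputs in [0,1]; threshold at 0.5 for the class
```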

Another problem is that by feeding them in as 225 equal inputs, information about position within the time series is not really preserved. I was hoping to use this information to improve the performance of my neural net by reaching a higher level of abstraction and preventing overfitting.

With X as the input matrix for training and T as the target data, case 1 (all 225 values fed in equally) looks like this:

X: 225×30000 double, T: 1×30000 double

In case 2 it would look like this:

X: 15×30000 cell, with each cell containing a 15×1 double; T: 1×30000 cell, with each cell containing 1 double
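For concreteness, here is one hypothetical way to build the case-2 cell layout from the flat case-1 matrix. It assumes each 225×1 column concatenates the 15 time steps one after another (time-major ordering), which is an assumption about how the data was flattened:

```matlab
% Hypothetical conversion from the flat 225xN matrix (case 1) to the
% per-time-step cell layout (case 2). Assumes each column concatenates
% the 15 time steps in order; adjust the reshape if features come first.
nT = 15; nF = 15; nS = 30000;
Xflat = rand(nT * nF, nS);                % placeholder for the real data
Xcell = cell(nT, nS);
for q = 1:nS
    s = reshape(Xflat(:, q), nF, nT);     % column t = feature vector at time t
    for t = 1:nT
        Xcell{t, q} = s(:, t);            % 15x1 double per cell
    end
end
Tcell = num2cell(double(rand(1, nS) > 0.5));  % 1xnS cell, 1 double per cell
```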

Q1: Is case 2 the correct input format?
Q2: Shouldn't it give the same result as case 1?

Why does that make such a big difference in performance?

Best Answer

Recurrent neural networks are best suited for time-series analysis, but they can be somewhat cumbersome to train in practice. You could instead try a couple of layers of convolutions along the time direction (does it make sense to convolve along the other direction too? A frequency dimension, for example, would justify that). This should be quite a bit faster to train than a regular fully connected NN and would let you add a couple of layers, increasing model complexity.
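A rough sketch of that suggestion, assuming a recent MATLAB Deep Learning Toolbox (R2021b or later for convolution1dLayer); the filter sizes, filter counts, and training options are illustrative guesses, not tuned values:

```matlab
% 1-D convolutions along the time axis for sequence classification.
% All sizes below are illustrative, not tuned for this data.
layers = [
    sequenceInputLayer(15)                        % 15 features per time step
    convolution1dLayer(3, 32, 'Padding', 'same')  % convolve along time
    reluLayer
    convolution1dLayer(3, 32, 'Padding', 'same')
    reluLayer
    globalAveragePooling1dLayer                   % collapse the time axis
    fullyConnectedLayer(2)                        % two classes: 0 and 1
    softmaxLayer
    classificationLayer];

% Placeholder data: each cell is a 15x15 (features x time) sequence.
XTrain = arrayfun(@(k) rand(15, 15), (1:100)', 'UniformOutput', false);
YTrain = categorical(randi([0 1], 100, 1));

opts = trainingOptions('adam', 'MaxEpochs', 20);
net = trainNetwork(XTrain, YTrain, layers, opts);
```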
