Solved – Simple Neural Network for time series prediction

backpropagation, neural networks, prediction, time series

I am creating a simple multi-layered feed-forward neural network using the AForge.NET NN library. My NN is a 3-layered activation network trained with a supervised learning approach using the backpropagation learning algorithm.

Following are my initial settings:

//learning rate
learningRate=0.1;

//momentum value
momentum=0;

//alpha value for bipolar sigmoid activation function
sigmoidAlphaValue=2.0;

//number of inputs to network
inputSize=5;

//number of outputs from network
predictionSize=1;

//iterations
iterations=10000;


// create multi-layer neural network
ActivationNetwork network = new ActivationNetwork(
    new BipolarSigmoidFunction(sigmoidAlphaValue),
    5,      // 5 inputs
    5 + 1,  // 6 neurons in first hidden layer
    3,      // 3 neurons in second hidden layer
    1);     // 1 neuron in output layer

// create teacher
BackPropagationLearning teacher = new BackPropagationLearning(network);

// set learning rate and momentum
teacher.LearningRate = learningRate;
teacher.Momentum = momentum;

Now I have an input series that looks like this:
1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20

Using the sliding-window method (as described here) to present the series as inputs, my input and expected output arrays look something like this:

double[][] input = new double[15][];
double[][] output = new double[15][];

// sample #1
input[0] = new double[] { 1, 2, 3, 4, 5 };
output[0] = new double[] { 6 };

// sample #2
input[1] = new double[] { 2, 3, 4, 5, 6 };
output[1] = new double[] { 7 };

// sample #3
input[2] = new double[] { 3, 4, 5, 6, 7 };
output[2] = new double[] { 8 };
.
.
.
// sample #15
input[14] = new double[] { 15, 16, 17, 18, 19 };
output[14] = new double[] { 20 };
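In case it helps, the window construction above can be sketched generically. This is a minimal, library-agnostic sketch in Python (the function name `sliding_windows` is my own, not from AForge.NET): for a 20-element series and a window of 5, it yields the 15 input/output pairs shown above.

```python
def sliding_windows(series, window=5):
    """Split a series into (input window, next value) training pairs."""
    inputs, outputs = [], []
    for i in range(len(series) - window):
        inputs.append(series[i:i + window])     # e.g. [1, 2, 3, 4, 5]
        outputs.append([series[i + window]])    # e.g. [6]
    return inputs, outputs

series = list(range(1, 21))  # 1..20
X, y = sliding_windows(series)
# X[0] is [1, 2, 3, 4, 5] with target y[0] == [6]; 15 pairs in total
```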

After running 10,000 epochs of

teacher.RunEpoch(input, output);

my network is successfully trained on the given training set. So now, if I compute with the inputs 4,5,6,7,8 the network correctly answers 9. Fantastic!

However, when the input is provided as 21,22,23,24,25 the NN fails to produce 26!

My Question:
How do I train my network to handle unseen inputs like these, so that it produces the correct continuation of the pattern it learned from the training set?

Best Answer

I'm going to take a stab at this and say it could be a problem with normalization boundaries.

I'm not familiar with the AForge.net NN library, but at some point your data should be normalized to fit between 0 and 1.

At some point, the normalization process detected 1 as the minimum value and 20 as the max value, and from those bounds, every value is converted to fit between 0 and 1. For example,

1  -> 1/20 = 0.05
...
19 -> 19/20 = 0.95
20 -> 20/20 = 1
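The scaling the answer describes (dividing by the maximum seen during training) can be sketched as follows. This is a hypothetical illustration of the idea, not AForge.NET's actual internals; the helper name `normalize` is my own.

```python
def normalize(x, hi):
    """Scale x by the maximum value seen in the training data."""
    return x / hi

train_max = 20  # detected from the training series 1..20
print(normalize(1, train_max))   # 0.05
print(normalize(19, train_max))  # 0.95
print(normalize(20, train_max))  # 1.0
print(normalize(25, train_max))  # 1.25 -- falls outside [0, 1]
```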

When you exceed these bounds later, your normalization no longer produces values between 0 and 1, and this really wreaks havoc on the network.

25 -> 25/20 = 1.25

What you could do is ensure your normalization factors in the true minimum and maximum bounds of the values you expect to see, not just the ones present in the training set.
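As a minimal sketch of that suggestion (Python, library-agnostic; the helper `make_scaler` and the bound 100 are illustrative choices, not from the answer): pick fixed bounds wide enough to cover the values you expect at prediction time, and use the same bounds to map the network's output back.

```python
def make_scaler(lo, hi):
    """Return normalize/denormalize functions for fixed, hand-picked bounds."""
    def norm(x):
        return (x - lo) / (hi - lo)
    def denorm(x):
        return x * (hi - lo) + lo
    return norm, denorm

# bounds chosen to cover future values, not just the training series 1..20
norm, denorm = make_scaler(0, 100)

window = [21, 22, 23, 24, 25]
scaled = [norm(v) for v in window]  # every value stays inside [0, 1]
# feed `scaled` to the network, then map its output back:
# prediction = denorm(network_output)
```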