Hi all!
I am using neural network models in MATLAB, and I am now facing a problem with the weights in NN training.
Basically, I have a multiple-input, multiple-output recurrent neural network, and the network is generated as
net = narxnet(inputDelays,feedbackDelays,hiddenLayerSize),
and the corresponding mathematical model is like
Y(k+1) = A*Y(k) + f(U),
where U is the input of the model and Y is the output (which is also fed back to the input), A is an unknown constant matrix, and f() is an unknown nonlinear function of U.
The whole training process goes fine, and I get a very small training error. However, when I use this network for prediction, I run into a problem: the influence of the feedback is far too strong.
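For reference, my open-loop training follows the standard narxnet workflow (delay ranges, layer size, and the variable names U and Y below are illustrative, not my exact settings):

```matlab
% Sketch of the standard open-loop NARX training workflow.
inputDelays     = 1:2;       % illustrative delay range
feedbackDelays  = 1:2;       % illustrative delay range
hiddenLayerSize = 10;        % illustrative layer size
net = narxnet(inputDelays, feedbackDelays, hiddenLayerSize);

u = tonndata(U, false, false);   % convert input series to cell-array time series
y = tonndata(Y, false, false);   % convert target series likewise
[Xs, Xi, Ai, Ts] = preparets(net, u, {}, y);   % shift series for the delays
net = train(net, Xs, Ts, Xi, Ai);              % open-loop (series-parallel) training
```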
For example, let's say the network is net, and I have two different inputs U1 = [0 1 0] and U2 = [1 1 1]. When I want to make a one-step prediction for these two inputs, I use

netc = closeloop(net);
u1 = tonndata(U1,false,false);
u2 = tonndata(U2,false,false);
[a1,b1,c1,d1] = preparets(netc,u1,{},targets);
[a2,b2,c2,d2] = preparets(netc,u2,{},targets);
outputs1 = netc(a1,b1,c1);
outputs2 = netc(a2,b2,c2);
The result is that outputs1 is exactly identical to outputs2. I have tried this many times with many different inputs: whenever the feedback Y(k) is the same, the predicted output Y(k+1) is always the same, regardless of U. I guessed this might be caused by noise in the training data, but my training data set is of good quality with very little noise.
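One rough sanity check I can think of is comparing the magnitudes of the trained weights attached to the external input U against those attached to the delayed feedback Y. In the default open-loop narxnet these typically sit in net.IW{1,1} and net.IW{1,2}, though the exact indices may differ if the architecture has been customized:

```matlab
% Rough diagnostic on the open-loop network (indices assume the default
% narxnet layout; adjust if the architecture differs).
wU = net.IW{1,1};   % weights from the external input U to the hidden layer
wY = net.IW{1,2};   % weights from the delayed feedback Y to the hidden layer
fprintf('max |weight on U| = %g\n', max(abs(wU(:))));
fprintf('max |weight on Y| = %g\n', max(abs(wY(:))));
% If the weights on U are near zero, the trained network is effectively
% ignoring U, which would explain identical outputs for different inputs.
```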
So now I am wondering whether there is any method to increase the influence of the input U and decrease the influence of the feedback Y, while still keeping a certain training accuracy.
Thank you very much for the help!
Best Answer