MATLAB: For my project work I have used an Elman neural network with the resilient backpropagation algorithm and the Nguyen-Widrow algorithm for generating initial layer weights. I observe a lot of difference between outputs in different trials

Deep Learning Toolbox, enn, rbp

For my project work I have used an Elman neural network (ENN) with the resilient backpropagation training algorithm and the Nguyen-Widrow algorithm for generating the initial layer weights. I observe a lot of difference between outputs across trials: the first time I trained the network it gave 94% accuracy, but the second time, with the same inputs and targets, I got only 64%. After training the first time I did not save the network. Please suggest ways to avoid the difference between consecutive trials. I am using MATLAB 2010; I created the ENN using nntool and then switched it to 'trainrp' in code, because creating the ENN with 'trainrp' directly gave me an error.
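For reference, a minimal sketch of the setup described above, assuming the Elman network is created programmatically with newelm (rather than through nntool) and that inputs and targets stand in for your own data matrices; the training function is switched to 'trainrp' after construction:

```matlab
% Example data (replace with your own inputs and targets)
inputs  = rand(3, 100);              % 3 features, 100 samples
targets = rand(1, 100);              % 1 output per sample

% Create an Elman network with 10 hidden neurons (illustrative size)
net = newelm(inputs, targets, 10);

% Switch to resilient backpropagation after creation, as described above
net.trainFcn = 'trainrp';

% Re-initialize the layer weights (Nguyen-Widrow is the default init function)
net = init(net);

% Train the network
[net, tr] = train(net, inputs, targets);
```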

Best Answer

Weights are initialized randomly for neural networks unless you initialize them manually. If you are getting as large a difference as you are seeing, that suggests your network is not robust.
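One way to take manual control, since the network from the good run was not saved, is to capture a network's weight/bias vector and restore it later instead of letting init() draw fresh random values. A minimal sketch using getwb/setwb (variable and file names are illustrative):

```matlab
% After a run you want to keep: capture the full weight/bias vector
goodWeights = getwb(net);
save('goodENN.mat', 'net', 'goodWeights');    % file name is illustrative

% Later: restore exactly the same weights instead of a fresh random init
loaded = load('goodENN.mat');
net = setwb(loaded.net, loaded.goodWeights);
```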
You can control the random number seed to reproduce particular networks. In R2010, see the documentation for the RandStream class; R2011a introduced rng() as a simpler way to set the random seed.
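A minimal sketch of both approaches, to be run before initializing and training the network (RandStream.setDefaultStream is the R2010-era call; later releases renamed it setGlobalStream):

```matlab
% R2010: fix the global random stream via the RandStream class
s = RandStream('mt19937ar', 'Seed', 0);
RandStream.setDefaultStream(s);

% R2011a and later: the simpler equivalent
% rng(0);

% With the seed fixed, re-initialization and training are repeatable
net = init(net);
[net, tr] = train(net, inputs, targets);
```

With the same seed, the Nguyen-Widrow initialization produces the same starting weights, so consecutive trials with the same inputs and targets should give the same result.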