% Peta on 2 Sep 2014 at 18:01
% Thanks for answering, I’m not sure what the “autocorrelation function” is
Suggestions, not necessarily in order:
1. Use the command lookfor, then use the help and doc commands on the functions it lists. For example, lookfor autocorrelation.
2. Search Wikipedia.
3. Search in the NEWSGROUP and ANSWERS. To reduce the results to a manageable size, combine it with other reasonable search words, e.g., some SUBSET of these:
autocorrelation neural narnet nncorr greg
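For instance, a minimal sketch at the MATLAB command line (nncorr is just one of the functions such a search might turn up):

```matlab
lookfor autocorrelation  % list functions whose summary line mentions it
help nncorr              % one-screen usage summary
doc nncorr               % open the full documentation page
```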
% But when I change the number of hidden layers in the GUI and plot the “Error Autocorrelation”, it basically looks exactly the same regardless of how many hidden layers I have. So should I set it to 1 in that case, then?
I only use 1 hidden layer and try to minimize the number of hidden nodes subject to an upper bound on the MSE.
For efficient predictions, FD should be a subset of the significant lags of the TARGET autocorrelation function. This calculation is independent of H. See some of my previous posts for examples.
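A minimal sketch of that calculation (assumptions: simplenar_dataset is just an example target; 1.96/sqrt(N) is the usual large-sample 95% significance level; the 'biased' normalization and the cutoff at 20 lags are illustrative choices, not prescriptions):

```matlab
T  = simplenar_dataset;                 % example target time series
t  = cell2mat(T);
N  = length(t);
zt = zscore(t, 1);                      % standardize the target first
autocorrt = nncorr(zt, zt, N-1, 'biased');  % lags -(N-1):(N-1)
autocorrt = autocorrt(N:end);           % keep nonnegative lags 0:N-1
sigthresh = 1.96/sqrt(N);               % approx. 95% significance level
siglags = find(abs(autocorrt(2:end)) >= sigthresh);  % significant lags >= 1
FD = siglags(siglags <= 20);            % e.g., keep a practical subset
```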
% I found the net.divideFcn in the script code and changed it like you said. But at points 4 and 5 I’m afraid I have pretty much no idea what you are talking about. I found nothing about Ntrails in the script; is this something I need to write myself? And if so, is there any kind of example somewhere on how to do it? I’m afraid it is new to me.
Ntrials, not Ntrails: the number of trials with random initial weights for a given number of hidden nodes.
I don't find the MATLAB help/doc script to be very helpful. For example, it says nothing about
1. Using capitals to indicate cell variables.
2. Using nncorr to find a reasonable subset of feedback delays
3. Using a for loop to find a reasonable value for H (e.g., Hmin:dH:Hmax)
4. Not using the default 'dividerand' because it destroys the correlations between the output and feedback signals
5. For each of numH candidate values for H, training success depends on starting with a good set of random initial weights. The best way to find one or more is to have an inner for loop over Ntrials random weight initializations created by the configure function.
6. Explicitly initializing the random number generator before the outer loop so that you can duplicate any of the numH*Ntrials designs
7. Often, closing the loop on an openloop design to obtain netc does not yield acceptable results when fed the original data. Therefore, the closedloop net should be trained, starting from the weights obtained in the openloop design, to obtain netc2.
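Items 3 through 6 can be sketched as a double loop like the one below (assumptions: FD has already been chosen from the significant autocorrelation lags; simplenar_dataset, Hmin, dH, Hmax and Ntrials are illustrative values, not prescriptions):

```matlab
T = simplenar_dataset;                   % example target series
FD = 1:2;                                % placeholder feedback delays
Hmin = 1; dH = 1; Hmax = 10;             % item 3: candidate values for H
Ntrials = 10;                            % item 5: weight inits per H
rng(0)                                   % item 6: reproducible designs
j = 0;
for h = Hmin:dH:Hmax                     % outer loop over numH candidates
    j = j + 1;
    for i = 1:Ntrials                    % inner loop over random weights
        neto = narnet(FD, h);
        neto.divideFcn = 'divideblock';  % item 4: not 'dividerand'
        [Xs, Xi, Ai, Ts] = preparets(neto, {}, {}, T);
        neto = configure(neto, Xs, Ts);  % new random initial weights
        [neto, tro, Ys, Es, Xf, Af] = train(neto, Xs, Ts, Xi, Ai);
        e = cell2mat(Es);                % tabulate normalized performance:
        R2o(i, j) = 1 - mean(e.^2)/var(cell2mat(Ts), 1);
    end
end
```

The R2o table then lets you pick the smallest H (and the weight trial) whose MSE meets the upper bound.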
% And I’m not sure what to do with the LHS syntax thing. I did have xs, ts, xi and ai in my workspace, so I tried adding the piece of code you wrote: [ net tr Ys Es Xf Af ] = train( net, Xs, Ts, Xi, Ai );
This single line replaces the 3-step script:
[ net tr ] = train( net, Xs, Ts, Xi, Ai );
[ Ys Xf Af ] = net( Xs, Xi, Ai );
Es = gsubtract( Ts, Ys );
Finally, to predict M timesteps into the future beyond the end of the target data:
Xic2 = Xf;
Aic2 = Af;
Ypred = netc2( cell(1,M), Xic2, Aic2);
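For completeness, a minimal end-to-end sketch of the multistep prediction (assumptions: simplenar_dataset, FD = 1:2, H = 10 and M = 5 are placeholders; Xf and Af are taken as the final delay states from running the trained closedloop net over the data, so the prediction continues from the end of the series):

```matlab
T = simplenar_dataset;                     % example target series
neto = narnet(1:2, 10);                    % placeholder FD and H
neto.divideFcn = 'divideblock';            % preserve temporal correlations
[Xs, Xi, Ai, Ts] = preparets(neto, {}, {}, T);
neto = train(neto, Xs, Ts, Xi, Ai);        % openloop design
netc = closeloop(neto);                    % item 7: close the loop
[Xcs, Xci, Aci, Tcs] = preparets(netc, {}, {}, T);
netc2 = train(netc, Xcs, Tcs, Xci, Aci);   % retrain from the openloop weights
[Ycs, Xf, Af] = netc2(Xcs, Xci, Aci);      % run over the data for final states
M = 5;                                     % predict 5 steps past the data
Xic2 = Xf;
Aic2 = Af;
Ypred = netc2(cell(1, M), Xic2, Aic2);
```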
Hope this helps.
Thank you for formally accepting my answer
Greg
Best Answer