Hi,
I am training a MIMO NARX network on a limited data set of 22 time steps for each input and output. To work around the prediction inaccuracy caused by the small data set, I plan to run a Monte Carlo-style experiment: I reinitialize and retrain the network many times and plot the resulting predictions to obtain a distribution. Each trained network has to satisfy criteria on its R and MSE values, which I enforce with a while loop. I have my doubts about this method, as I am not an efficient coder. Initial attempts had some success, but the while loop frequently gets stuck: for the first four iterations of the outer for loop I obtain networks that meet the criteria and the code exits the while loop, but after that the training while loop cannot exit because the criteria are never met. I hope someone can identify the flaws in my methodology and kindly correct them. Thanks!
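One likely cause of the stuck loop: TRAIN resumes from the network's current weights, so after the first pass the while loop keeps retraining an already-converged network whose performance no longer changes. A minimal sketch of a bounded retrain-until-criteria loop is below, assuming `net`, `x`, `t`, `xi`, `ai` already exist from narxnet/preparets; `maxAttempts` is an illustrative cap I am introducing so the loop cannot hang forever:

```matlab
% Sketch: retrain with fresh weights each attempt, capped at maxAttempts.
maxAttempts = 50;
ok = false;
for attempt = 1:maxAttempts
    net = init(net);                    % fresh random weights each attempt
    [net,tr] = train(net,x,t,xi,ai);
    y = net(x,xi,ai);
    performance = perform(net,t,y);     % MSE over all outputs
    R = regression(t,y);                % one R value per output series
    if all(R >= 0.9) && performance <= 0.01
        ok = true;
        break                           % criteria met; keep this network
    end
end
if ~ok
    warning('No network met the criteria in %d attempts.',maxAttempts);
end
```

With the cap, a run that never meets the criteria fails loudly instead of hanging, and the reinitialization makes each attempt an independent draw, which is what a Monte Carlo-style experiment needs.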
for i=1:10
trainFcn = 'trainlm'; % Levenberg-Marquardt backpropagation.
% Create a Nonlinear Autoregressive Network with External Input
inputDelays = 1:2;
feedbackDelays = 1:2;
hiddenLayerSize = 10;
net = narxnet(inputDelays,feedbackDelays,hiddenLayerSize,'open',trainFcn);
% Choose Input and Feedback Pre/Post-Processing Functions
% Settings for feedback input are automatically applied to feedback output
% For a list of all processing functions type: help nnprocess
% Customize input parameters at: net.inputs{i}.processParam
% Customize output parameters at: net.outputs{i}.processParam
net.inputs{1}.processFcns = {'removeconstantrows','mapminmax'};
net.inputs{2}.processFcns = {'removeconstantrows','mapminmax'};
% Prepare the Data for Training and Simulation
% The function PREPARETS prepares timeseries data for a particular network,
% shifting time by the minimum amount to fill input states and layer
% states. Using PREPARETS allows you to keep your original time series data
% unchanged, while easily customizing it for networks with differing
% numbers of delays, with open loop or closed loop feedback modes.
[x,xi,ai,t] = preparets(net,X,{},T);
% Setup Division of Data for Training, Validation, Testing
% For a list of all data division functions type: help nndivide
net.divideFcn = 'divideblock'; % Divide data into contiguous blocks
net.divideMode = 'time'; % Divide up every sample
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;
% Choose a Performance Function
% For a list of all performance functions type: help nnperformance
net.performFcn = 'mse'; % Mean Squared Error
% Choose Plot Functions
% For a list of all plot functions type: help nnplot
net.plotFcns = {'plotperform','plottrainstate','ploterrhist', ...
    'plotregression','plotresponse','ploterrcorr','plotinerrcorr'};
% Train and Test the Network
performance = 1;           % initial value so the while loop is entered
reg0 = [0.5;0.5;0.5;0.5];  % initial value so the while loop is entered
while any(reg0 < 0.9) || (performance > 0.01)
    % Reinitialize the weights before each attempt; otherwise TRAIN resumes
    % from the already-converged weights and the loop can never escape.
    net = init(net);
    % Train the Network
    [net,tr] = train(net,x,t,xi,ai);
    % Test the Network
    y = net(x,xi,ai);
    e = gsubtract(t,y);
    performance = perform(net,t,y);
    [reg,m,b] = regression(t,y);
    reg0 = reg;
end
% Recalculate Training, Validation and Test Performance
trainTargets = gmultiply(t,tr.trainMask);
valTargets = gmultiply(t,tr.valMask);
testTargets = gmultiply(t,tr.testMask);
trainPerformance = perform(net,trainTargets,y)
valPerformance = perform(net,valTargets,y)
testPerformance = perform(net,testTargets,y)
% View the Network
%view(net)
% Plots
% Uncomment these lines to enable various plots.
%figure, plotperform(tr)
%figure, plottrainstate(tr)
%figure, ploterrhist(e)
%figure, plotregression(t,y)
%figure, plotresponse(t,y)
%figure, ploterrcorr(e)
%figure, plotinerrcorr(x,e)
% Closed Loop Network
% Use this network to do multi-step prediction.
% The function CLOSELOOP replaces the feedback input with a direct
% connection from the output layer.
netc = closeloop(net);
netc.name = [net.name ' - Closed Loop'];
%view(netc)
[xc,xic,aic,tc] = preparets(netc,X,{},T);
yc = netc(xc,xic,aic);
closedLoopPerformance = perform(netc,tc,yc)
% Multi-step Prediction
% Sometimes it is useful to simulate a network in open-loop form for as
% long as there is known output data, and then switch to closed-loop form
% to perform multistep prediction while providing only the external input.
% Here all but 5 timesteps of the input series and target series are used
% to simulate the network in open-loop form, taking advantage of the higher
% accuracy that providing the target series produces:
%%numTimesteps = size(x,2);
%%knownOutputTimesteps = 1:(numTimesteps-5);
%%predictOutputTimesteps = (numTimesteps-4):numTimesteps;
%%X1 = X(:,knownOutputTimesteps);
%%T1 = T(:,knownOutputTimesteps);
%%[x1,xio,aio] = preparets(net,X1,{},T1);
%%[y1,xfo,afo] = net(x1,xio,aio);
% Next the network and its final states will be converted to
% closed-loop form to make five predictions with only the five inputs
% provided.
%%x2 = X(1,predictOutputTimesteps);
%%[netc,xic,aic] = closeloop(net,xfo,afo);
%%[y2,xfc,afc] = netc(x2,xic,aic);
%%multiStepPerformance = perform(net,T(1,predictOutputTimesteps),y2)
% Alternate predictions can be made for different values of x2, or further
% predictions can be made by continuing simulation with additional external
% inputs and the last closed-loop states xfc and afc.
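The continuation described above can be sketched as follows; `x3` is a hypothetical cell array of further external inputs (same format as `x2`) that I am introducing for illustration:

```matlab
% Sketch: continue the closed-loop multi-step prediction from the final
% states xfc and afc returned by the previous netc call. x3 is a
% hypothetical cell array of additional external inputs, same format as x2.
[y3,xfc2,afc2] = netc(x3,xfc,afc);
% y3 holds the next predictions; xfc2 and afc2 can seed the next call.
```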
% Step-Ahead Prediction Network
% For some applications it helps to get the prediction a timestep early.
% The original network returns predicted y(t+1) at the same time it is
% given y(t+1). For some applications such as decision making, it would
% help to have predicted y(t+1) once y(t) is available, but before the
% actual y(t+1) occurs. The network can be made to return its output a
% timestep early by removing one delay so that its minimal tap delay is now
% 0 instead of 1. The new network returns the same outputs as the original
% network, but outputs are shifted left one timestep.
nets = removedelay(net);
nets.name = [net.name ' - Predict One Step Ahead'];
%view(nets)
[xs,xis,ais,ts] = preparets(nets,X,{},T);
ys = nets(xs,xis,ais);
stepAheadPerformance = perform(nets,ts,ys);
ys = transpose(cell2mat(ys));
% Undo the output normalization (minB, maxB, meanB are defined elsewhere).
% Note: the loop variable must not be i, which is already the outer loop index.
for j = 1:size(ys,2)
    ypred(:,j) = (2.*ys(:,j)-1)*(minB(j)-maxB(j))+meanB(j);
end
%disp(ypred);
ypred0 = [ypred0 ypred];    % store predictions (initialize ypred0 = [] before the loop)
regstore = [regstore reg0]; % store performance (initialize regstore = [] before the loop)
clearvars net
end