MATLAB: How to simulate a NARX neural network after being trained and tested

narx matlab r2012b neural network

Hi,
Using the NN toolbox (MATLAB R2012b), I trained a NARX NN with data vectors X for input and Y for output. Now I would like to know whether the NARX NN developed is capable of predicting the output for a given input A, for example. Could someone tell me how I can do that? Is there a specific command to use? I tried sim(A) but it didn't work.
Thank you in advance for your response,
Amine

Best Answer

% How can I simulate a NARX neural network after being trained and tested?
1. You have not suggested a MATLAB nndataset to use as an example. Please choose one or post one of your own; a minimal loading sketch follows the list below.
help nndatasets
Input-Output Time-Series Prediction, Forecasting, Dynamic modelling
Nonlinear autoregression, System identification and Filtering
Input-output time series problems consist of predicting the next value
of one time-series given another time-series. Past values of both series
(for best accuracy), or only one of the series (for a simpler system)
may be used to predict the target series.
simpleseries_dataset - Simple time-series prediction dataset.
simplenarx_dataset - Simple time-series prediction dataset.
exchanger_dataset - Heat exchanger dataset.
maglev_dataset - Magnetic levitation dataset.
ph_dataset - Solution PH dataset.
pollution_dataset - Pollution mortality dataset.
refmodel_dataset - Reference model dataset
robotarm_dataset - Robot arm dataset
valve_dataset - Valve fluid flow dataset.
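For instance, a minimal loading sketch, assuming the simplenarx_dataset example is chosen (any of the sets above would do):
[X,T] = simplenarx_dataset; % 1x100 cell sequences of inputs and targets
x = cell2mat(X); % corresponding 1x100 matrices used below
t = cell2mat(T);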
inputSeries = tonndata(x,true,false); % Change notation to cell sequences X,
targetSeries = tonndata(t,true,false); % T and corresponding matrices x,t
% Create a Nonlinear Autoregressive Network with External Input
inputDelays = 1:2; feedbackDelays = 1:1;
2. Have the correlation values at the lags in the chosen row vectors ID and FD been investigated to make sure that they are significant for the target autocorrelation function and the target/input crosscorrelation function? (See the sketch below.)
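A plain-MATLAB sketch of that check, assuming single 1xN series x and t as above; the maximum lag of 20 and the 1.96/sqrt(N) significance threshold are my own choices:
N = length(t);
zt = (t - mean(t))/std(t); % standardized target
zx = (x - mean(x))/std(x); % standardized input
maxlag = 20; % assumed range of lags to inspect
autoT = zeros(1,maxlag); crossXT = zeros(1,maxlag);
for lag = 1:maxlag
    autoT(lag) = zt(1:N-lag)*zt(1+lag:N)'/(N-lag); % target autocorrelation
    crossXT(lag) = zx(1:N-lag)*zt(1+lag:N)'/(N-lag); % input/target crosscorrelation
end
sigthresh = 1.96/sqrt(N); % ~95 percent significance level
FDcand = find(abs(autoT) > sigthresh) % candidate feedback delays
IDcand = find(abs(crossXT) > sigthresh) % candidate input delays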
3. Given the number of outputs, and trn/val/tst data division ratio, how many training equations, Ntrneq, do you have?
hiddenLayerSize = 10;
4. Given ID, FD and H = 10, how many unknown weights, Nw, do you have to estimate? Do you have more unknowns than equations? What is the estimation number of degrees of freedom, Ndof = Ntrneq-Nw? Are you considering changing H or trying a range of values? (A counting sketch follows below.)
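A counting sketch for points 3 and 4 (the 70/15/15 split matches the division below; the points consumed by the initial delays are ignored):
[I,N] = size(x); [O,~] = size(t); % I inputs, O outputs, N timesteps
Ntrn = N - 2*round(0.15*N); % roughly 70 percent of points for training
Ntrneq = Ntrn*O; % number of training equations
NID = length(inputDelays); NFD = length(feedbackDelays); H = hiddenLayerSize;
Nw = (NID*I + NFD*O + 1)*H + (H + 1)*O; % number of unknown weights
Ndof = Ntrneq - Nw % estimation degrees of freedom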
net = narxnet(inputDelays,feedbackDelays,hiddenLayerSize);
[inputs,inputStates,layerStates,targets] = preparets(net,inputSeries,{},targetSeries);
% Setup Division of Data for Training, Validation, Testing
net.divideParam.trainRatio = 70/100; net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;
5. These are defaults. Why bother?
6. Why are you accepting the default random data division function 'dividerand', which destroys serial correlations? Override with 'divideblock' or 'divideind' to maintain order and spacing, as sketched below.
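A sketch of that override; the divideind indices are placeholders to be filled in with your own partition sizes, and note that changing divideFcn resets net.divideParam, so set it before the ratio lines above:
net.divideFcn = 'divideblock'; % contiguous trn/val/tst blocks in time order
% or, for complete control over the indices:
% net.divideFcn = 'divideind';
% net.divideParam.trainInd = 1:Ntrn;
% net.divideParam.valInd = Ntrn+1:Ntrn+Nval;
% net.divideParam.testInd = Ntrn+Nval+1:N;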
7. Separate t into ttrn/tval/ttst partitions (see the sketch below) and calculate the reference MSEs obtained from the naive constant output model estimate ytrn00 = mean(ttrn,2).
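For example, a block-division sketch consistent with the 0.70/0.15/0.15 ratios above (indices are approximate, since preparets removes the first few delayed timesteps):
N = size(t,2);
Ntst = round(0.15*N); Nval = round(0.15*N); Ntrn = N - Nval - Ntst;
ttrn = t(:,1:Ntrn); % training block
tval = t(:,Ntrn+1:Ntrn+Nval); % validation block
ttst = t(:,Ntrn+Nval+1:N); % test block
ytrn00 = mean(ttrn,2); % naive constant output estimate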
MSEtrn00 = mean(var(ttrn,1)); MSEtrn00a = Ntrn*MSEtrn00/(Ntrn-1)
MSEval00 = mse(tval-ytrn00); MSEtst00 = mse(ttst-ytrn00);
8. Set practical training goals
if Ndof > 0
net.trainParam.goal = 0.01*Ndof*MSEtrn00a/Ntrneq; % R2trna = 0.99
else
net.trainParam.goal = 0.01*MSEtrn00; % R2trn = 0.99
end
net.trainParam.min_grad = MSEtrn00/200;
% Train the Network
[net,tr] = train(net,inputs,targets,inputStates,layerStates);
% Test the Network
outputs = net(inputs,inputStates,layerStates);
errors = gsubtract(targets,outputs);
performance = perform(net,targets,outputs)
9. ALTERNATIVE: Forget this and get all of your information (training, validation, test and total) from the training structure tr:
tr = tr
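A sketch of the fields typically pulled from tr (the +1 accounts for epoch numbering starting at 0):
tr.stop % why training stopped
tr.best_epoch % epoch of best validation performance
MSEtrn = tr.perf(tr.best_epoch+1); % training MSE at that epoch
MSEval = tr.vperf(tr.best_epoch+1); % validation MSE at that epoch
MSEtst = tr.tperf(tr.best_epoch+1); % test MSE at that epoch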
10. The best evaluation criterion is the test coefficient of determination (R2tst) evaluated at the maximum of R2val
R2val = 1-tr.vperf(tr.best_epoch+1)/MSEval00 % +1 because epochs count from 0
R2tst = 1-tr.tperf(tr.best_epoch+1)/MSEtst00
% View the Network
view(net)
% Obtain the Closed Loop Design
netc = closeloop(net); netc.name = [net.name ' - Closed Loop'];
view(netc);
% Change notation upper/lower case for cell/matrix, 'r' for removedelay
[Xsc,Xic,Aic,Tsc] = preparets(netc,inputSeries,{},targetSeries);
Ysc = netc(Xsc,Xic,Aic);
% MSEc = perform(netc,Tsc,Ysc) % closedLoopPerformance
ysc = cell2mat(Ysc);
tsc = cell2mat(Tsc);
MSEc00 = mse(tsc-ytrn00)
MSEc = mse(tsc-ysc)
R2c = 1-MSEc/MSEc00
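To address the original question of simulating the trained design on a new input series: a hedged sketch along the following lines could be used, where anew (the new exogenous input matrix) and tknown (known past outputs of the same length, needed to fill the tapped delay lines) are hypothetical names, and the caveat at the end of this answer applies:
Anew = tonndata(anew,true,false); % new input as a cell sequence
Tknown = tonndata(tknown,true,false); % known past outputs, same length
[Xsn,Xin,Ain,Tsn] = preparets(net,Anew,{},Tknown);
Ynew = net(Xsn,Xin,Ain); % open-loop, one-step-ahead predictions
% For multi-step-ahead prediction use the closed-loop design instead:
% [Xscn,Xicn,Aicn] = preparets(netc,Anew,{},Tknown);
% Ysn = netc(Xscn,Xicn,Aicn);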
netr = removedelay(net);
netr.name = [netr.name ' - Predict One Step Ahead'];
view(netr)
[Xsr,Xir,Air,Tsr] = preparets(netr,inputSeries,{},targetSeries);
Ysr = netr(Xsr,Xir,Air);
% MSEr = perform(netr,Tsr,Ysr) % earlyPredictPerformance
ysr = cell2mat(Ysr);
tsr = cell2mat(Tsr);
MSEr00 = mse(tsr-ytrn00)
MSEr = mse(tsr-ysr)
R2r = 1-MSEr/MSEr00
The X and T dimensions are the same and equal to 100. Now, for example, I would like to test the network with an input defined by:
for i = 1:100, T(i) = (1+i)^-1; end
How can I do that? net(T)? sim(T)? netc(T)?
NO!
YOU CANNOT DO THAT! THE NET IS ONLY VALID FOR SERIES THAT HAVE, APPROXIMATELY, THE SAME MEANS, VARIANCES, AND SIGNIFICANT AUTO- AND CROSS-CORRELATION LAGS!
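A quick sanity check along these lines (my own sketch) shows why; compare the basic statistics of the proposed series with those of the training target before trusting any prediction:
tnew = 1./(2:101); % the proposed series, T(i) = (1+i)^-1
[ mean(t) var(t,1) ] % training target mean and variance
[ mean(tnew) var(tnew,1) ] % new series mean and variance
% If these (and the significant correlation lags from point 2) differ
% substantially, the trained net cannot be expected to generalize.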
HOPE THIS HELPS.
THANK YOU FOR FORMALLY ACCEPTING MY ANSWER
Greg