MATLAB: What is the use of delays in time series problems

Deep Learning Toolbox, neural network

Hello Everyone,
As a beginner I am trying to understand the use of neural networks in time series prediction. I am trying to develop a model that can produce a flood forecast, but I do not understand the purpose of the input and target (feedback) delays in the network, or how I should give multiple variables as inputs: I have 4 input parameters, but currently I am providing only two.
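For the four inputs, my guess is that I would have to put all four columns into one matrix (one row per variable) before con2seq, something like the lines below (assuming the spreadsheet had the inputs in columns 1 to 4 and the target in column 5; these column numbers are just placeholders), but please correct me if this is wrong:
Inputs4 = Data_Inputs(:,1:4)'; % 4 rows = 4 input variables (column numbers assumed)
Target = Data_Inputs(:,5)';    % target column (assumed)
X = con2seq(Inputs4);          % 1xN cell array, each cell a 4x1 input vector
T = con2seq(Target);           % 1xN cell array of scalar targets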
The full code I have so far is attached below; please let me know if it is correct.
Data_Inputs=xlsread('Book1.xlsx'); % Import file
% The training data rows are randomized using the function 'randperm'
Shuffling_Inputs = Data_Inputs(randperm(end),1:2); % shuffled rows (note: not actually used below)
Training_Set=Data_Inputs(1:end,1);%specific training set
Target_Set=Data_Inputs(1:end,2); %specific target set
Input=Training_Set'; %Convert to row
Target=Target_Set'; %Convert to row
X = con2seq(Input); %Convert to cell
T = con2seq(Target); %Convert to cell
% Create a Nonlinear Autoregressive Network with External Input
inputDelays = 1:4;
feedbackDelays = 1:4;
hiddenLayerSize = 10;
net = narxnet(inputDelays,feedbackDelays,hiddenLayerSize);
net.trainParam.epochs=1000;
% Prepare the Data for Training and Simulation
% The function PREPARETS prepares time series data
% for a particular network, shifting time by the minimum
% amount to fill input states and layer states.
% Using PREPARETS allows you to keep your original
% time series data unchanged, while easily customizing it
% for networks with differing numbers of delays, with
% open loop or closed loop feedback modes.
[inputs,inputStates,layerStates,targets] = ...
    preparets(net,X,{},T);
% Set up Division of Data for Training, Validation, Testing
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;
% Train the Network
[net,tr] = train(net,inputs,targets,inputStates,layerStates);
% Test the Network
outputs = net(inputs,inputStates,layerStates);
errors = gsubtract(targets,outputs);
rmse=sqrt(mse(errors));
performance = perform(net,targets,outputs);
% View the Network
view(net)
% Plots
figure, plotperform(tr)
figure, plottrainstate(tr)
figure, plotregression(targets,outputs)
figure, plotresponse(targets,outputs)
%figure, ploterrcorr(errors)
%figure, plotinerrcorr(inputs,errors)
% Closed Loop Network
% Use this network to do multi-step prediction.
% The function CLOSELOOP replaces the feedback input with a direct
% connection from the output layer.
netc = closeloop(net);
netc.name = [net.name ' - Closed Loop'];
view(netc)
[xc,xic,aic,tc] = preparets(netc,X,{},T);
yc = netc(xc,xic,aic);
closedLoopPerformance = perform(netc,tc,yc);
% Early Prediction Network
% For some applications it helps to get the prediction a
% timestep early.
% The original network returns predicted y(t+1) at the same
% time it is given y(t+1).
% For some applications such as decision making, it would
% help to have predicted y(t+1) once y(t) is available, but
% before the actual y(t+1) occurs.
% The network can be made to return its output a timestep early
% by removing one delay so that its minimal tap delay is now
% 0 instead of 1. The new network returns the same outputs as
% the original network, but outputs are shifted left one timestep.
nets = removedelay(net);
nets.name = [net.name ' - Predict One Step Ahead'];
view(nets)
[xs,xis,ais,ts] = preparets(nets,X,{},T);
ys = nets(xs,xis,ais);
earlyPredictPerformance = perform(nets,ts,ys);
%% 5. Multi-step ahead prediction
% (XVal, targetSeriesVal, targetSeries and N are assumed to be a held-out
% validation portion of the input/target series and its length; they are
% not defined earlier in this script.)
inputSeriesPred = [X(end-1:end),XVal];
targetSeriesPred = [T(end-1:end), con2seq(nan(1,N))];
[Xs,Xi,Ai,Ts] = preparets(netc,inputSeriesPred,{},targetSeriesPred);
yPred = netc(Xs,Xi,Ai);
perf = perform(netc,targetSeriesVal,yPred);
figure;
plot([cell2mat(targetSeries),nan(1,N);
nan(1,length(targetSeries)),cell2mat(yPred);
nan(1,length(targetSeries)),cell2mat(targetSeriesVal)]')
legend('Original Targets','Network Predictions','Expected Outputs');

Best Answer

Go to
help nndatasets
doc nndatasets
and choose one of the narxnet examples.
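Each of those datasets loads in a single line, for example:
[X,T] = simpleseries_dataset; % cell arrays of input and target sequences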
I will compare your results with mine.
I used the simpleseries_dataset. Results were not good. Comments:
1. You did not initialize the RNG ==> results cannot be reproduced (see the sketch after this list)
2. Shuffling the data ruined the auto and cross correlations
3. Did not optimize the input and feedback delays (ID, FD) using the significant lags of the auto and cross correlation functions
4. Did not use multiple random weight initializations to optimize the number of hidden nodes H given ID and FD
5. Ruined the auto and cross correlations by using the default dividerand instead of divideblock or divideind
6. Did not use tr to obtain separate trn/val/tst results
7. Did not obtain the final delay states Xsf and Asf from train for multistep-ahead prediction
8. No NMSE (MSE normalized by the average target variance), R2 or R2a results
9. No testing and retraining of netc after closing the loop
10. No comparison of target, y and yc
11. The final plot figure is messed up.
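A rough sketch of what points 1, 3, 5, 7, 8 and 9 could look like in code, using the 2-column data from the question (variable names and the placeholder delays are mine; replace ID, FD and H after inspecting the correlation lags and trying multiple initializations):
rng(0)                                    % 1. initialize the RNG for reproducibility
Data_Inputs = xlsread('Book1.xlsx');      % 2. do NOT shuffle rows of a time series
x = Data_Inputs(:,1)';  t = Data_Inputs(:,2)';
N = length(t);
X = con2seq(x);  T = con2seq(t);
% 3. choose ID and FD from the significant lags of the correlation functions
zx = (x-mean(x))/std(x);  zt = (t-mean(t))/std(t);
autocorrT   = nncorr(zt,zt,N-1,'biased'); % target autocorrelation
crosscorrXT = nncorr(zx,zt,N-1,'biased'); % input/target cross correlation
% inspect the positive lags and keep the significant ones as FD and ID
ID = 1:4;  FD = 1:4;  H = 10;             % placeholders until the lags are chosen
net = narxnet(ID,FD,H);
net.divideFcn = 'divideblock';            % 5. keep the series contiguous
net.divideParam.trainRatio = 0.70;
net.divideParam.valRatio   = 0.15;
net.divideParam.testRatio  = 0.15;
[Xs,Xi,Ai,Ts] = preparets(net,X,{},T);
[net,tr,Ys,Es,Xsf,Asf] = train(net,Xs,Ts,Xi,Ai);  % 7. keep final states Xsf, Asf
% 8. normalized MSE and R^2 (fraction of target variance accounted for)
y    = net(Xs,Xi,Ai);
e    = cell2mat(gsubtract(Ts,y));
NMSE = mean(e.^2)/var(cell2mat(Ts),1);
R2   = 1 - NMSE;
% 9. close the loop, check it, and retrain it directly if performance drops
netc = closeloop(net);
[Xc,Xic,Aic,Tc] = preparets(netc,X,{},T);
yc    = netc(Xc,Xic,Aic);
ec    = cell2mat(gsubtract(Tc,yc));
NMSEc = mean(ec.^2)/var(cell2mat(Tc),1);
% if NMSEc is much worse than NMSE, retrain the closed-loop net:
% [netc,trc] = train(netc,Xc,Tc,Xic,Aic);
% Xsf and Asf from train above are the states to seed true multistep
% prediction beyond the end of the data (point 7).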
See some of my posted designs. Search on
greg nncorr narxnet
Hope this helps.
Thank you for formally accepting my answer
Greg