MATLAB: NARnet Closed loop prediction results not good

closed loop, forecasting, narnet, neural network, prediction

I had issues with the dimensions of my input data, so I decided to stick with a NARnet using only the volume of oil produced as my input (a 1050×1 double).
NMSEs =
0.0572
NMSEc =
6.0875
My questions are:
1. What do the NMSE values mean for my network?
2. The error autocorrelation and input-error autocorrelation look good (I guess!). What do they mean for my network?
3. The open loop results look right, but the closed loop network just looks awful. Please help!
Thank You
Here is my code:
clc
plt=0;
% Autoregression Time-Series Problem with a NAR Neural Network
% Created Sat May 16 23:01:03 WAT 2015
%


% This script assumes this variable is defined:
%
% PInput - feedback time series. 1050x1double
T = tonndata(PInput,false,false);
N = length(T);
% Choose a Training Function
%
trainFcn = 'trainlm'; % Levenberg-Marquardt
% Create a Nonlinear Autoregressive Network
feedbackDelays = 1:4;
hiddenLayerSize = 6;
net = narnet(feedbackDelays,hiddenLayerSize,'open',trainFcn);
rng ('default')
% Choose Feedback Pre/Post-Processing Functions
% Settings for feedback input are automatically applied to feedback output
% For a list of all processing functions type: help nnprocess
net.input.processFcns = {'removeconstantrows','mapminmax'};
% Prepare the Data for Training and Simulation
% The function PREPARETS prepares timeseries data for a particular network,
% shifting time by the minimum amount to fill input states and layer states.
% Using PREPARETS allows you to keep your original time series data unchanged, while
% easily customizing it for networks with differing numbers of delays, with
% open loop or closed loop feedback modes.
[Xs,Xsi,Asi,Ts] = preparets(net,{},{},T);
ts1 = cell2mat( Ts );
plt = plt+1; figure(plt), hold on
plot( 5:N, ts1, 'LineWidth', 2 )
% Setup Division of Data for Training, Validation, Testing
% For a list of all data division functions type: help nndivide
net.divideFcn = 'divideblock'; % Divide data into contiguous blocks
net.divideMode = 'time'; % Divide up every value
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;
% Choose a Performance Function
% For a list of all performance functions type: help nnperformance
net.performFcn = 'mse'; % Mean squared error
% Choose Plot Functions
% For a list of all plot functions type: help nnplot
net.plotFcns = {'plotperform','plottrainstate','plotresponse', ...
'ploterrcorr', 'plotinerrcorr'};
% Train the Network
[net,tr,Ys,Es,Af,Xf] = train(net,Xs,Ts,Xsi,Asi);
ys1=cell2mat(Ys);
plot(5:N, ys1, 'ro', 'LineWidth', 2 )
legend( 'TARGET', 'OUTPUT' )
title( 'OPENLOOP NARNET RESULTS' )
%
Es = gsubtract( Ts, Ys );
%view( net )
NMSEs = mse( Es ) /var( ts1,1 )
% Test the Network
y = net(Xs,Xsi,Asi);
e = gsubtract(Ts,y);
performance = perform(net,Ts,y);
% Recalculate Training, Validation and Test Performance
trainTargets = gmultiply(Ts,tr.trainMask);
valTargets = gmultiply(Ts,tr.valMask);
testTargets = gmultiply(Ts,tr.testMask);
trainPerformance = perform(net,trainTargets,y);
valPerformance = perform(net,valTargets,y);
testPerformance = perform(net,testTargets,y);
% Closed Loop Network
% For multi-step prediction.
% The function CLOSELOOP replaces the feedback input with a direct
% connection from the output layer.
%netc = closeloop(net);
%[xc,xic,aic,tc] = preparets(netc,{},{},T);
%yc = netc(xc,xic,aic);
%perfc = perform(net,tc,yc);
[netc,Xci,Aci] = closeloop(net,Xsi,Asi);
%view(netc)
[Xc,Xci,Aci,Tc] = preparets(netc,{},{},Ts);
[Yc,Xcf,Acf] = netc(Xc,Xci,Aci);
Ec = gsubtract(Tc,Yc);
yc1 = cell2mat(Yc);
tc = ts1;
NMSEc = mse(Ec) /var(tc,1)
% Multi-step Prediction
Xc2 = cell(1,N);
[Yc2,Xcf2,Acf2] = netc( Xc2, Xcf, Acf );
yc2 = cell2mat(Yc2);
plt = plt+1; figure(plt), hold on
plot( 5:N, tc, 'LineWidth', 2 )
plot( 9:N, yc1, 'ro', 'LineWidth', 2 )
plot( N+1:2*N, yc2, 'o', 'LineWidth', 2 )
plot( N+1:2*N, yc2, 'r', 'LineWidth', 2 )
%axis( [ 0 2*N+2 0 1.3 ] )
legend( 'TARGET', 'OUTPUT' , 'TARGETLESS PREDICTION')
title( 'CLOSED LOOP NARNET RESULTS' )

Best Answer

I haven't looked at your code because you haven't looked at my previous posts regarding the transition from openloop to closeloop.
In more than one of them I clearly state that if the closed loop performance is significantly worse than the openloop performance, it usually means that small open loop output errors are being fed back to the input and accumulating.
If the openloop errors are not too large, just train the closeloop configuration initialized with the weights obtained via the openloop training.
Typically, training the closeloop configuration from random initial weights takes a VERY LONG TIME. Thus the use of openloop + closeloop technique.
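A minimal sketch of the openloop-then-closeloop retraining described above, assuming the trained open loop net `net` and the original series `T` from the question's code are still in the workspace:

```matlab
% Close the loop: the weights from the openloop training are retained,
% so closeloop training starts from a good initialization rather than
% from random weights.
netc = closeloop(net);
% Re-prepare the ORIGINAL series for the closed-loop topology.
[Xc,Xci,Aci,Tc] = preparets(netc,{},{},T);
% Retrain in closed-loop mode, starting from the openloop weights.
[netc,trc] = train(netc,Xc,Tc,Xci,Aci);
% Evaluate the retrained closed-loop network.
Yc = netc(Xc,Xci,Aci);
NMSEc = mse(gsubtract(Tc,Yc)) / var(cell2mat(Tc),1)
```

Note that `preparets` is called on `T`, not on the already-shifted `Ts`, so the closed-loop data is aligned with the original series.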
However, sometimes the closeloop followup does not improve performance enough.
Then there are two obvious choices.
1. Design the closeloop configuration from scratch.
2. Redo the openloop+closeloop design with different initial weights, and/or different lags, and/or a different number of hidden nodes, and/or a different number of epochs, and/or different ...
Guess which one I would choose?
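Choice 2 can be automated as a small search loop. A sketch, assuming `PInput` is in the workspace; the candidate hidden-layer sizes, the number of trials, and ranking by closed-loop NMSE are illustrative choices, not settings from the answer:

```matlab
% Search over hidden-layer sizes and random weight initializations,
% keeping the design with the best closed-loop NMSE.
T = tonndata(PInput,false,false);
bestNMSE = Inf;
for H = [4 6 8 10]                      % candidate hidden-layer sizes
    for trial = 1:5                     % different random initial weights
        rng(trial)
        net = narnet(1:4,H,'open','trainlm');
        net.divideFcn = 'divideblock';
        [Xs,Xsi,Asi,Ts] = preparets(net,{},{},T);
        net = train(net,Xs,Ts,Xsi,Asi); % openloop training
        netc = closeloop(net);          % judge each candidate in closed loop
        [Xc,Xci,Aci,Tc] = preparets(netc,{},{},T);
        Yc = netc(Xc,Xci,Aci);
        NMSE = mse(gsubtract(Tc,Yc)) / var(cell2mat(Tc),1);
        if NMSE < bestNMSE
            bestNMSE = NMSE;
            bestNet  = netc;
        end
    end
end
bestNMSE
```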
Hope this helps.
Thank you for formally accepting my answer
Greg