MATLAB: Poor performance on closed-loop NARNET

Tags: Deep Learning Toolbox, MATLAB, narnet, neural network

Hi there, I am using a NARNET for multi-step-ahead prediction of a non-linear time series. I read some posts on the newsgroup and wrote a script to train the neural network.
1. About the problem:
* The length of the series is about 1.3M points (i.e. 1 month);
* The sampling time of the series is 2 seconds;
* The prediction horizon is 30 steps (i.e. 1 minute);
* Only the first 2 hours (3600 points) are used to train the networks;
* d = 300 (feedback delays) and H = 3 (hidden neurons) were chosen after checking the autocorrelation function.
2. MSE:
* The normalized MSE of the open-loop net, NMSEo, is 2.8211e-04;
* the normalized MSE of the closed-loop net, NMSEc, is 1.4814;
* the normalized MSE of the re-trained closed-loop net, NMSEcr, is 0.8838.
3. My questions are:
* Are there any errors or misuses in the code or the toolbox functions?
* Is it normal to have such a huge degradation (>3000 times) in network performance after closing the loop?
* Is a closed-loop network the only way to make a multi-step-ahead prediction with NARNET?
* I am thinking of re-sampling the series at a range of sampling rates, i.e. 4, 6, 8, ..., 60 seconds, and then using parallel open-loop predictions to solve the problem. Do you think this is feasible?
* Since the distant past does not have much influence on the current value, should we re-train the networks on every time interval with the newly arriving data to keep them up to date?
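To make the resampling idea concrete, here is a minimal sketch of what I have in mind (assuming the raw 2-second training series is in the row vector y, that a shorter delay line is adequate in the coarser time bases, and using naive decimation with no anti-aliasing filter):

```matlab
% Sketch only: one open-loop NARNET per resampled series.
rates = 4:2:60;                    % candidate sampling periods in seconds
nets  = cell(size(rates));
for k = 1:numel(rates)
    step = rates(k) / 2;           % decimation factor relative to 2 s
    yk   = y(1:step:end);          % naive decimation of the raw series
    Yk   = num2cell(yk);
    nets{k} = narnet(1:10, 3);     % shorter delay line in the coarser time base
    nets{k}.divideFcn = 'divideblock';
    [Xk, Xki, Aki, Tk] = preparets(nets{k}, {}, {}, Yk);
    nets{k} = train(nets{k}, Xk, Tk, Xki, Aki);
end
% A one-step open-loop prediction from the 60 s network then spans the same
% 1-minute horizon as 30 closed-loop steps of the original 2 s network.
```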
4. code:
filename=sprintf('data/%d %d.xlsx', month, year);
[num,txt,~] = xlsread(filename,sheetname);
nStep=size(num,1)-1;
nDay=size(txt,2);
% use 1 day only
nDay=1;
timestamp=num(:,1);
data=num(:,(1:nDay)+1);
date=txt(1:nDay);
signal.date = gadd(timestamp(1:end-1,1),datenum(date,'dd/mm/yyyy')');
signal.date = reshape(signal.date,[nDay*nStep,1]);
signal.data = reshape(data(1:end-1,1:end),[nDay*nStep,1]);
% use first 2 hours only
y = signal.data(1:2*length(signal.data)/24)';
Y=num2cell(y);
N = length(y);
% net param
d = 300; FD = 1:d; H = 3;  % feedback delays and hidden layer size
Horizon = 30;
neto = narnet( FD, H );
neto.divideFcn = 'divideblock';
neto.trainParam.showWindow = false;
%view(neto)
% preparation
[ Xo, Xoi, Aoi, To ] = preparets( neto, {}, {}, Y );
to = cell2mat( To ); zto = zscore(to,1);
% varto1 = mean(var(to',1));
% minmaxto = minmax([ to ; zto ]);
rng( 'default' )
[ neto, tro, Yo, Eo, Xof, Aof ] = train( neto, Xo, To, Xoi, Aoi );
[ Yo, Xof, Aof ] = neto( Xo, Xoi, Aoi );
% evaluate open-loop network
Eo = gsubtract( To, Yo );
perform(neto, To, Yo);
%view( neto );
% NMSEo = mse( Eo ) /varto1
NMSEo = mse( Eo ) / var(to,1)
yo = cell2mat( Yo );
% plot
plt = 0;  % figure counter
plt = plt+1; figure(plt), hold on
plot( d+1:N, to);
plot( d+1:N, yo, 'r--');
axis( [ 0 N+d -1 1 ] );
legend( 'TARGET', 'OUTPUT' );
title( 'OPENLOOP NARNET RESULTS' );
%%close the loop
[ netc, Xci, Aci ] = closeloop( neto, Xoi, Aoi );
% view( netc )
[ Xc, Xci, Aci, Tc ] = preparets( netc, {}, {}, Y );
if isequal( Tc, To )
tc = to ; % 1
else
tc = cell2mat( Tc );
end
[ Yc, Xcf, Acf ] = netc( Xc, Xci, Aci );
% evaluate closed-loop network
Ec = gsubtract( Tc, Yc );
% perform(netc, Tc, Yc);
yc = cell2mat( Yc );
NMSEc = mse(Ec) /var(tc,1)
plt = plt+1; figure(plt), hold on
plot( d+1:N, tc )
plot( d+1:N, yc, 'r--' )
legend('TARGET', 'OUTPUT')
title('CLOSELOOP NARNET RESULTS');
%%Re-train
[netc, trc, Ycr, Ecr, Xcrf, Acrf] = train(netc, Xc, Tc, Xci, Aci);
[Ycr, Xcrf, Acrf] = netc(Xc, Xci, Aci);
Ecr = gsubtract( Tc, Ycr );
perform(netc, Tc, Ycr);
ycr = cell2mat(Ycr);
NMSEcr = mse(Ecr) /var(tc,1)
Xc2 = cell(1,Horizon);
[ Yc2, Xcrf, Acrf2 ] = netc( Xc2, Xcrf, Acrf );
yc2 = cell2mat(Yc2);
plt = plt + 1;
figure(plt), hold on
plot(d+1:N, tc, 'b')
plot(d+1:N, ycr, 'g--')
plot( N+1:N+Horizon, yc2, 'ro-', 'LineWidth', 2 )
legend('TARGET', 'OUTPUT', 'TARGETLESS PREDICTION')
title('TRAINED CLOSELOOP NARNET RESULTS');
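Regarding question 3, the closed loop is not the only option: a "direct" multi-step predictor can learn to map past samples straight to the value Horizon steps ahead, so no prediction is fed back at run time. Here is a sketch using timedelaynet (this is a suggested alternative, not part of the script above; it assumes y, d, H and Horizon as already defined):

```matlab
% Direct Horizon-step-ahead prediction: input is y(t), target is y(t+Horizon).
Xd = num2cell( y(1:end-Horizon) );
Td = num2cell( y(1+Horizon:end) );
netd = timedelaynet( 1:d, H );       % input delays instead of feedback delays
netd.divideFcn = 'divideblock';
[ Xs, Xdi, Adi, Ts ] = preparets( netd, Xd, Td );
netd = train( netd, Xs, Ts, Xdi, Adi );
Yd = netd( Xs, Xdi, Adi );
NMSEd = mse( gsubtract( Ts, Yd ) ) / var( cell2mat(Ts), 1 )
```

Since the network never consumes its own output, its error does not compound over the horizon the way a closed-loop NARNET's does, at the cost of a separate network per horizon of interest.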

Best Answer

From
help nndatasets
and
doc nndatasets
you will find a list of MATLAB data sets you can use to practice with.
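For example, one of the simplest NAR series in that list can be loaded and fitted like this (an illustrative sketch, not from the original answer):

```matlab
% simplenar_dataset is a small benchmark NAR time series from the toolbox.
T = simplenar_dataset;               % 1x100 cell array of targets
net = narnet( 1:2, 10 );             % 2 feedback delays, 10 hidden neurons
[ Xs, Xi, Ai, Ts ] = preparets( net, {}, {}, T );
net = train( net, Xs, Ts, Xi, Ai );
```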
I have chosen a few for tutorials. Months ago I encountered the same problem:
1. Good & Great OL performance
2. Poor, Good and Great CL performance after closing the loop
3. Both Better and Worse performance after training the CL
net initialized with the final OL weights
4. I have not had time to pursue this further. However, it
definitely is high on my TO DO LIST.
5. Right now my priorities are
a. CELLAR FLOODING
b. HOSTING MY VACATIONING GRANDDAUGHTER
c. MATLAB INSTALLATION
d. NEWSGROUP & ANSWERS
Hope this helps.
Greg