MATLAB: How can I use the sim function on a trained neural network?

narx, neural network, simulink

I have 7*2601 input data and a 1*2601 target. I trained a NARX network and got some plots. I want to do forecasting with new data, so I created a new 7*1 matrix and put some data in it. But when I call "pre = net(testdata')';" it always gives the same errors:
"Error using network/sim (line 271) Number of inputs does not match net.numInputs.
Error in network/subsref (line 16) otherwise, v = sim(vin,subs{:});"
I have read all about NARX, multistep predictions, and open/closed-loop predictions, but I can't solve it. Any ideas? Thank you.
code:
Data = xlsread('data.xlsx');
Target = xlsread('Target.xlsx');
testdata = xlsread('testdata.xlsx');
X = tonndata(Data,false,false);
T = tonndata(Target,false,false);
trainFcn = 'trainlm';
inputDelays = 1:2;
feedbackDelays = 1:2;
hiddenLayerSize = 100;
net = narxnet(inputDelays,feedbackDelays,hiddenLayerSize,'open',trainFcn);
%net.inputs{1}.processFcns = {'removeconstantrows','mapminmax'};
%net.inputs{2}.processFcns = {'removeconstantrows','mapminmax'};
[x,xi,ai,t] = preparets(net,X,{},T);
net.divideFcn = 'dividerand';          % Divide data randomly
net.divideMode = 'time';               % Divide up every sample
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;
net.performFcn = 'mse';
[net,tr] = train(net,x,t,xi,ai);
y = net(x,xi,ai);
e = gsubtract(t,y);
performance = perform(net,t,y);
trainTargets = gmultiply(t,tr.trainMask);
valTargets = gmultiply(t,tr.valMask);
testTargets = gmultiply(t,tr.testMask);
trainPerformance = perform(net,trainTargets,y);
valPerformance = perform(net,valTargets,y);
testPerformance = perform(net,testTargets,y);
view(net)
pre = net(testdata')';

Best Answer

Data = xlsread('data.xlsx');
Target = xlsread('Target.xlsx');
testdata = xlsread('testdata.xlsx');
X = tonndata(Data,false,false);
T = tonndata(Target,false,false);
whos % 1. Check class and size
% 2. How many training equations?
% What is the average target variance to be modeled?
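% Re points 1-2, a sketch (assumes Data is 7-by-2601 and Target is
% 1-by-2601, as stated in the question):
[I, N] = size(Data)          % I = 7 inputs, N = 2601 timesteps
[O, ~] = size(Target)        % O = 1 output
Neq  = N*O                   % number of training equations
vart = mean(var(Target',1))  % average target variance (reference for NMSE)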
trainFcn = 'trainlm';
inputDelays = 1:2;
feedbackDelays = 1:2;
% 3. What makes you think these delay values are any good? Which delays correspond to significant auto- and cross-correlations?
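% Re point 3, a sketch using base MATLAB only: estimate which target
% autocorrelation lags are significant (the same idea applies to the
% input/target crosscorrelations). maxlag = 20 is an arbitrary choice.
t1 = Target(:);  t1 = (t1 - mean(t1))/std(t1);
maxlag = 20;
acf = zeros(1,maxlag);
for k = 1:maxlag
    c = corrcoef(t1(1:end-k), t1(1+k:end));
    acf(k) = c(1,2);
end
sigthresh = 1.96/sqrt(numel(t1));     % rough 95% significance level
siglags = find(abs(acf) > sigthresh)  % candidate feedback delays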
hiddenLayerSize = 100;
% 4. 100 hidden units probably generate too many unknown weights. How many equations vs. unknowns?
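% Re point 4, a sketch: compare unknown weights with training equations.
% For an open-loop NARX net with I inputs, O outputs, H hidden units and
% the delay ranges above (I, O, Neq from the sketch after point 2):
H  = hiddenLayerSize;
Nw = H*(numel(inputDelays)*I + numel(feedbackDelays)*O + 1) + O*(H + 1)
% With H = 100, I = 7, O = 1 this gives Nw = 1801 unknowns versus
% Neq = 2601 equations, i.e. fewer than two equations per unknown weight.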
net = narxnet(inputDelays,feedbackDelays,hiddenLayerSize,'open',trainFcn);
%net.inputs{1}.processFcns = {'removeconstantrows','mapminmax'};
%net.inputs{2}.processFcns = {'removeconstantrows','mapminmax'};
[x,xi,ai,t] = preparets(net,X,{},T);
% 5. Add an additional subscript 'o' to denote 'openloop'.
% 6. Preferable to use uppercase for cells, lowercase for doubles.
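% Re points 5-6, a sketch of the suggested notation (illustration only;
% the rest of this listing keeps the original variable names):
[Xo,Xoi,Aoi,To] = preparets(net,X,{},T);  % 'o' = openloop, uppercase = cell arrays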
net.divideFcn = 'dividerand'; % Divide data randomly
% 7. UH...OH! If you do this, the spacing within the trn/val/tst subsets will be nonuniform (see the divideblock sketch below).
net.divideMode = 'time'; % Divide up every sample
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;
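% Re point 7, a sketch: contiguous blocks keep the spacing within each of
% the trn/val/tst subsets uniform ('divideblock' defaults to the same
% 0.70/0.15/0.15 ratios set above):
net.divideFcn = 'divideblock';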
net.performFcn = 'mse';
% 8. Why waste space explicitly specifying default values???
[net,tr] = train(net,x,t,xi,ai);
% 9. Forgot the final states on the LHS
y = net(x,xi,ai);
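% Re point 9, a sketch: the simulation call can also return the final delay
% states, which are needed to seed later open- or closed-loop simulations:
[y,xf,af] = net(x,xi,ai);   % xf, af = final input and layer delay states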
e = gsubtract(t,y);
performance = perform(net,t,y);
% 10. Performance is not normalized w.r.t. the average target variance, so how can you tell whether it is good or not?
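% Re point 10, a sketch: normalize by the average target variance (vart from
% the sketch after point 2) so the result is scale-free; NMSE near 0 is good,
% NMSE near 1 is no better than predicting the target mean:
NMSE = performance/vart
Rsq  = 1 - NMSE             % fraction of target variance modeled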
trainTargets = gmultiply(t,tr.trainMask);
valTargets = gmultiply(t,tr.valMask);
testTargets = gmultiply(t,tr.testMask);
trainPerformance = perform(net,trainTargets,y);
valPerformance = perform(net,valTargets,y);
testPerformance = perform(net,testTargets,y);
% 11. Same comment as 10
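% Re point 11, a sketch: the same normalization for the subset performances:
trainNMSE = trainPerformance/vart
valNMSE   = valPerformance/vart
testNMSE  = testPerformance/vart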
view(net)
% 12. Better to call view(net) immediately after train.
pre = net(testdata')';
% 13. No input/layer states are supplied here; they should be obtained from the final states of the run on the known data.
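% Re point 13, a sketch of one way to forecast with new data (assumes
% testdata is 7-by-M with M >= 1): the open-loop net has TWO inputs (the
% external input plus the delayed feedback), so net(testdata')' cannot
% work. Close the loop, run the known data through it to obtain the final
% delay states, then feed the new external inputs with those states:
netc = closeloop(net);                     % one external input, internal feedback
[xc,xci,aci,tc] = preparets(netc,X,{},T);  % known data fills the delay lines
[yc,xcf,acf] = netc(xc,xci,aci);           % xcf, acf = states at end of record
Xnew = tonndata(testdata,false,false);     % new exogenous inputs as a cell array
ypred = netc(Xnew,xcf,acf)                 % multistep-ahead forecast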
% 14. Why haven't you reviewed my posts in the NEWSGROUP and ANSWERS? All of the above points are covered.
Hope this helps.
Thank you for formally accepting my answers
Greg