MATLAB: Neural network programming error.

Tags: Deep Learning Toolbox, neural network, train

clear all;clc
a=rand(1,1000);
b=rand(1,1000);
c=rand(1,1000);
y=a*2+b*3+c*5
I=[a; b; c;]
T=y
net = newff([0 1;0 1 ;0 1],[10 1]);
net=train(net,I,T);
J1=sim(net,I);
Op=sim(net,[1 1 1]');
When I test the system with the inputs [1 1 1], the output should be near 10, but it gives 1. What is wrong with this code?

Best Answer

clear all; clc
rng('default') % Initialize the RNG for reproducibility
a = rand(1,1000);
b = rand(1,1000);
c = rand(1,1000);
y = a*2+b*3+c*5; % Added semicolon


I = [a; b; c;]; % Added semicolon
T = y; % Added semicolon
net = newff([0 1;0 1 ;0 1],[10 1]);
net=train(net,I,T);
J1=sim(net,I);
MSE = mse(J1-T)
Op=sim(net,[1 1 1]')
% Warning: NEWFF used in an obsolete
% way.
% > In obs_use at 18
% In newff>create_network at 127
% In newff at 102
% In Untitledgh at 9
% See help for NEWFF to update calls to
% the new argument list.
% MSE = 19.6749
% Op = 1
net = newff( I ,T ,10 ); % Current version
net=train(net,I,T);
J1=sim(net,I);
MSE = mse(J1-T)
Op=sim(net,[1 1 1]')
% MSE = 4.4650e-009
% Op = 9.9969
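As a side note, newff itself has since been superseded; if your release provides fitnet (the current function-fitting wrapper in the Deep Learning Toolbox), the same experiment would look roughly like this sketch:

net = fitnet(10);        % 10 hidden neurons; tansig hidden layer, purelin output
net = train(net, I, T);
J1 = net(I);             % equivalent to sim(net, I)
MSE = mse(J1 - T)
Op = net([1 1 1]')       % should again come out near 10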
To confirm that the obsolete syntax itself is the problem, repeat the training over several random weight initializations:
Ntrials = 15
for i=1:Ntrials
net = newff([0 1;0 1 ;0 1],[10 1]);
net=train(net,I,T);
J1=sim(net,I);
MSE(i) = mse(J1-T);
Op(i)=sim(net,[1 1 1]');
end
result = [ MSE' Op' ]
% result =
%
% 19.6749 1.0000
% 19.6749 1.0000
% 19.6749 1.0000
% 19.6749 1.0000
% 19.7560 1.0000
% 19.6749 1.0000
% 19.6749 1.0000
% 19.6749 1.0000
% 19.6749 1.0000
% 19.6749 1.0000
% 19.6749 1.0000
% 19.6749 1.0000
% 22.1988 1.0000
% 19.6749 1.0000
% 19.6749 1.0000
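The likely cause is the default transfer functions: with the obsolete argument list, newff appears to use tansig in every layer, including the output, so the simulated output can never leave (-1,1), whereas the current call uses a purelin output layer. A quick check (the expected values in the comments are an assumption, not verified output):

netOld = newff([0 1;0 1;0 1],[10 1]); % obsolete syntax
netNew = newff(I,T,10);               % current syntax
netOld.layers{2}.transferFcn          % expected 'tansig'  -> output bounded in (-1,1)
netNew.layers{2}.transferFcn          % expected 'purelin' -> unbounded output
% With the output stuck near 1, mse(ones(size(T))-T) should reproduce the ~19.67 figure above.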
Hope this helps
Thank you for formally accepting my answer
Greg