Dear all,
I use initzero to set all the weights and biases of a neural network to 0 before training. Training stops at the first or second iteration, and all of the resulting weights are unexpectedly still 0, which cannot be correct. I am using MATLAB R2014a and have tried several training functions (trainlm, trainbr, trainscg, ...). I hope you can give me a hint on how to deal with this problem. Thanks for reading. My code follows:
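For what it's worth, the same stall shows up with a much smaller network, so it does not seem specific to narxnet or to my delay settings. A minimal sketch (simplefit_dataset and feedforwardnet here are just stand-ins for any toy problem):

```matlab
% Minimal reproduction sketch: force all weights and biases to zero,
% then train. With tansig hidden units, zero hidden activations and zero
% layer weights zero out the backpropagated gradient, so the hidden
% weights never move; only the output bias can change.
[x,t] = simplefit_dataset;
net = feedforwardnet(3);
net = configure(net,x,t);
net = setwb(net, zeros(size(getwb(net))));  % all weights/biases = 0
[net,tr] = train(net,x,t);
disp(tr.stop)     % training stops almost immediately
disp(getwb(net))  % weights are still 0 after training
```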
[X,T] = maglev_dataset;
inputDelays  = 1:3;       % input delay vector
outputDelays = 1:3;       % output delay vector
hiddenSizes  = [3 2 2];   % network structure (number of neurons per hidden layer)

% training algorithm
trainFcn = 'trainlm';     % Levenberg-Marquardt
% trainFcn = 'trainbr';   % Bayesian regularization

net = narxnet(inputDelays, outputDelays, hiddenSizes, 'open', trainFcn);

%% define transfer functions
net.layers{1}.transferFcn = 'tansig';
net.layers{2}.transferFcn = 'tansig';
net.layers{3}.transferFcn = 'tansig';
net.layers{4}.transferFcn = 'purelin';

%% prepare the time series for training the network
[x,xi,ai,t] = preparets(net,X,{},T);

%% division scheme of the data for training, validation and test
net.divideFcn = 'divideblock';
net.divideMode = 'value';          % divide up every value
net.divideParam.trainRatio = 0.8;
net.divideParam.valRatio   = 0.1;
net.divideParam.testRatio  = 0.1;

%% training parameters
net.trainParam.showWindow = 1;

%% initialize all weights and biases before training
INITWEIGHTS = 1;
if INITWEIGHTS
    net.initFcn = 'initlay';
    for i = 1:size(net.layers,1)
        net.layers{i}.initFcn = 'initwb';
    end
    initialWeightsFunction = 'initzero';
    % initialWeightsFunction = 'midpoint';
    % initialWeightsFunction = 'rands';
    initialBiasesFunction = 'initzero';
    % initialBiasesFunction = 'rands';
    for i = 1:size(net.inputWeights,1)
        for j = 1:size(net.inputWeights,2)
            if ~isempty(net.inputWeights{i,j})
                net.inputWeights{i,j}.initFcn = initialWeightsFunction;
                % if strcmp(initialWeightsFunction,'midpoint')
                %     net.inputWeights{i,j}.weightFcn = '';
                % end
            end
        end
    end
    for i = 1:size(net.layerWeights,1)
        for j = 1:size(net.layerWeights,2)
            if ~isempty(net.layerWeights{i,j})
                net.layerWeights{i,j}.initFcn = initialWeightsFunction;
            end
        end
    end
    for i = 1:size(net.biases,1)
        if ~isempty(net.biases{i})
            net.biases{i}.initFcn = initialBiasesFunction;
        end
    end
    % initialize the weights and the biases of the network
    net = init(net);
end

%% train the network
[net, tr] = train(net,x,t,xi,ai);
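After training, I inspect the training record to see why the run stopped and to confirm that the weights really are still zero (a small diagnostic sketch using the tr record returned by train):

```matlab
% Why did training stop, and how many parameters actually moved?
disp(tr.stop)                  % stopping reason, e.g. 'Minimum gradient reached.'
wb = getwb(net);               % all weights and biases as one vector
fprintf('nonzero parameters: %d of %d\n', nnz(wb), numel(wb));
```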
Best Answer