I'm trying to break down the MATLAB neural network GUI by working out what each feature does. I'm keeping it simple by using the default training method (scg) and the MATLAB wine dataset for training/testing. For the time being, and for experimentation, I've removed the validation dataset and set the network up with 50 hidden nodes.
What I can't work out is why the results it produces are exactly the same each time. It takes exactly the same number of epochs to reach the minimum gradient, the performance and gradient values are identical, and the confusion matrix is identical too. The only explanation I can think of is that the data splitting and weight initialisation are not randomised, but everywhere I look online suggests that (by default) MATLAB does indeed randomise those.
What am I missing? Are the weights and datasets not randomised after all? Code being used is below.
% Load MATLAB default wine dataset.
[x1,t1] = wine_dataset;

% Create net, 50 hidden nodes.
net = patternnet(50);

% Split the data into a 75% training and 25% testing group.
% Validation removed.
net.divideParam.trainRatio = 3/4;
net.divideParam.valRatio = 0;
net.divideParam.testRatio = 1/4;

% Train the network (assign the output, otherwise the trained net is discarded).
net = train(net,x1,t1);
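One way to test whether weight initialisation actually draws from the random number generator is to reseed the RNG and reinitialise the network twice, then compare a weight. This is a sketch using the standard `rng`, `configure`, and `init` functions; the weight inspected (`net.IW{1}(1,1)`) is just an arbitrary choice for illustration:

```matlab
% Load the same dataset as above.
[x1,t1] = wine_dataset;

rng('shuffle');              % reseed the global RNG (e.g. from the clock)
net = patternnet(50);
net = configure(net,x1,t1);  % size the weights for this input/target data
net = init(net);             % initialise weights using the current RNG state
w1 = net.IW{1}(1,1);         % record one input weight

net = init(net);             % reinitialise with the RNG in a new state
w2 = net.IW{1}(1,1);

% If initialisation draws from the RNG, w1 and w2 should usually differ.
disp(abs(w1 - w2));
```

Note that the default data division function (`dividerand`) also consumes the same global RNG stream, so seeding with `rng` affects both the train/test split and the initial weights.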