MATLAB: MLP classification: what is the problem in the code

Deep Learning Toolbox, pattern recognition

I would like to understand why the MLP neural network I built performs poorly. The network should be a universal classifier because it has two hidden layers but, with the data set I use, it does not train well. My network is built using
  1. a sigmoidal transfer function in the first and second layers
  2. a softmax function in the output layer
The "inputs" file is a 3×120 matrix (3 features, 120 observations); the "targets" file is a 3×120 matrix (representing 3 different classes).
In the example code I used a network with 40 neurons in the first layer and 20 in the second layer.
I notice two anomalies
  1. if I run view(net), I see that the number of outputs of the output layer is 2 while the output box shows 3: what does this mean, and why does MATLAB do this?
  2. if I compute sum(net([i1; i2; i3])) I get a value different from 1, but it should be 1 because the last layer uses a softmax function.
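For example, this is the quick check I ran (here x stands for my 3×120 inputs matrix):
% each column of a true softmax output should sum to 1
colsums = sum(net(x), 1);   % 1-by-120 row vector of column sums
max(abs(colsums - 1))       % noticeably different from 0 in my case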
Below is my code; the inputs and targets files are attached.
% Choose a Training Function
% For a list of all training functions type: help nntrain
% 'trainlm' is usually fastest.
% 'trainbr' takes longer but may be better for challenging problems.
% 'trainscg' uses less memory. Suitable in low memory situations.
trainFcn = 'trainbr'; % Bayesian regularization backpropagation.
% Create a Pattern Recognition Network
hiddenLayerSize = [40 20];
net = feedforwardnet(hiddenLayerSize, trainFcn); % create feedforward network
% set the transfer functions
net.layers{1}.transferFcn = 'tansig';
net.layers{2}.transferFcn = 'tansig';
net.layers{3}.transferFcn = 'softmax';
% Setup Division of Data for Training, Validation, Testing
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;
% Train the Network
[net,tr] = train(net,x,t);
% Test the Network
y = net(x);
e = gsubtract(t,y);
performance = perform(net,t,y)
tind = vec2ind(t);
yind = vec2ind(y);
percentErrors = sum(tind ~= yind)/numel(tind);
% View the Network
view(net)
% Plots
% Uncomment these lines to enable various plots.
%figure, plotperform(tr)
%figure, plottrainstate(tr)
%figure, ploterrhist(e)
figure, plotconfusion(t,y)
%figure, plotroc(t,y)

Best Answer

There are too many basic concepts of which you are unaware. Unfortunately, I only have time to list some of them.
1. One hidden layer is sufficient unless you have SPECIFIC operations you wish to perform (e.g., specific image feature extraction).
2. Using a validation subset helps prevent poor performance on nontraining (validation, testing and unseen) data.
3. Too many weights can cause instability and, in particular, poor performance on nontraining data (NN vendors MUST be crucially aware of this ($$$!!!)).
4. It is best to stay as close as possible to the MATLAB example code and default parameter values found in the help and doc documentation.
5. A quick look at my QUICKIES posts in the NEWSREADER may help you understand what few changes are really necessary.
6. Think in terms of the example code: use multiple initial random weight trials to minimize the number of hidden nodes subject to obtaining good performance on all of the data, e.g.
mse(t-y) <= 0.01*mse(t-mean(t,2))
(i.e., compare your design against the naive model that always outputs the best constant value). A rough sketch follows this list.
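For instance, here is a rough sketch of what points 1, 4 and 6 amount to (the bounds Hmax = 10 and Ntrials = 10 are just illustrative choices, not recommendations; x and t stand for your attached 3x120 inputs and targets):
% Sketch: find the smallest number of hidden nodes H that meets the
% normalized-MSE goal of point 6, using multiple random-weight trials.
MSE00 = mse(t - mean(t,2));       % reference: naive constant-output model
goal  = 0.01*MSE00;               % i.e. mse(t-y) <= 0.01*mse(t-mean(t,2))
Hmax = 10; Ntrials = 10;          % illustrative search bounds
found = false;
for h = 1:Hmax
    for trial = 1:Ntrials
        net = patternnet(h);        % ONE hidden layer, all defaults kept
        [net,tr] = train(net,x,t);  % new random initial weights each trial
        y = net(x);
        if mse(t-y) <= goal
            found = true;
            break                   % smallest successful H found
        end
    end
    if found, break, end
end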
I have quite a few tutorials in the NEWSREADER which may help.
Hope this helps.
Thank you for formally accepting my answer
Greg