MATLAB: 2-Class Problem with patternnet

Tags: Deep Learning Toolbox, neural network, patternnet

Hi, I have a problem using the NN toolbox: a neural network shall be trained to recognize a two-class problem. I used the default settings (dividerand, 10 hidden neurons, divide ratio 0.7/0.15/0.15). My input is a 9xn matrix and my target is a 2xn matrix ([1; 0] for class one and [0; 1] for class two for each sample), where n = 1012. The ratio of the classes is about 50:50. This is the confusion matrix I get.
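For reference, a 2xn target matrix encoded this way can be built from a 1xn vector of class labels with ind2vec; a minimal sketch (classLabels here is an illustrative placeholder, not one of my variables):
classLabels = randi(2, 1, 1012);              % placeholder labels: 1 or 2 for each sample
patientTargets = full(ind2vec(classLabels));  % 2x1012 matrix of [1;0] / [0;1] columns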
This is the code that I used:
rng('default');
x = patientInputs;
t = patientTargets ;
inputs = mapminmax(x); % note: mapminmax is also applied automatically by the input processFcns set below
targets = t;
trainFcn = 'trainscg'; % Scaled conjugate gradient backpropagation.
% Create a Pattern Recognition Network
hiddenLayerSize =10;
net = patternnet(hiddenLayerSize);
net.inputs{1}.processFcns = {'removeconstantrows','mapminmax'};
net.outputs{2}.processFcns = {'removeconstantrows','mapminmax'};
net.divideFcn = 'dividerand'; % Divide data randomly
net.divideMode = 'sample'; % Divide up every sample
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;
net.performFcn = 'mse'; % Mean squared error (note: patternnet's default is 'crossentropy')
% Choose Plot Functions
% For a list of all plot functions type: help nnplot
net.plotFcns = {'plotperform','plottrainstate','ploterrhist', ...
'plotconfusion', 'plotroc'};
net.trainParam.max_fail =55;
net.trainParam.min_grad=1e-10;
net.trainParam.show=10;
% (no learning rate is set here: trainscg does not use an lr parameter)
net.trainParam.epochs=90;
net.trainParam.goal=0.001;
% Train the Network
[net,tr] = train(net,inputs,targets);
y = net(inputs);
e = gsubtract(targets,y);
performance = perform(net,targets,y)
tind = vec2ind(targets);
yind = vec2ind(y);
percentErrors = sum(tind ~= yind)/numel(tind);
% Recalculate Training, Validation and Test Performance
trainTargets = targets .* tr.trainMask{1}; % the masks set samples outside each subset to NaN
valTargets = targets .* tr.valMask{1};
testTargets = targets .* tr.testMask{1};
trainPerformance = perform(net,trainTargets,y)
valPerformance = perform(net,valTargets,y)
testPerformance = perform(net,testTargets,y)
% View the Network
view(net)
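The confusion matrix mentioned above can also be computed directly from the trained network's outputs; a minimal sketch using the toolbox confusion and plotconfusion functions on the same targets and y as above:
[c, cm] = confusion(targets, y);      % c = overall misclassification rate, cm = 2x2 confusion matrix
fprintf('Overall accuracy: %.2f%%\n', 100*(1 - c));
plotconfusion(targets, y);            % same confusion plot that nprtool produces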
Can anyone tell me how to solve this problem? Please go easy on me because I am a newbie in MATLAB and neural networks.
Thanks

Best Answer

In order to ensure stability with respect to changes in operating conditions, I recommend
MINIMIZING THE NUMBER OF HIDDEN NODES
subject to the normalized mean-square-error constraint
NMSE = mse(error)/mean(var(target',1)) < 0.01 % i.e., Rsquare >= 0.99
Although I derived this for regression, it works extremely well for classification.
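A minimal sketch of that search, assuming the same inputs and targets as in the question (the candidate range 1:10 and the number of random-weight trials are arbitrary choices):
% Search for the smallest number of hidden nodes H whose normalized MSE
% stays below 0.01 (i.e., Rsquare >= 0.99).
MSE00 = mean(var(targets', 1));          % reference MSE of the naive constant model
Ntrials = 10;                            % random weight initializations per candidate H
for H = 1:10
    for trial = 1:Ntrials
        net = patternnet(H);
        net.divideParam.trainRatio = 0.70;
        net.divideParam.valRatio   = 0.15;
        net.divideParam.testRatio  = 0.15;
        [net, tr] = train(net, inputs, targets);
        y = net(inputs);
        NMSE = mse(targets - y) / MSE00; % normalized mean-square error
        if NMSE < 0.01                   % constraint satisfied: this H is a candidate
            fprintf('H = %d, trial = %d, NMSE = %.4f\n', H, trial, NMSE);
        end
    end
end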
I have zillions of posted examples in both the NEWSGROUP and ANSWERS. Try searching using
greg patternnet
Hope this helps.
Thank you for formally accepting my answer
Greg