MATLAB: How to get best test error/accuracy with neural networks pattern recognition

Tags: Deep Learning Toolbox, MATLAB GUI, neural network

This is Neural Network Pattern Recognition. I have a 1×54149 input dataset (vec) and a 1×54149 target, and I'm trying to train my neural network to do binary classification (1 and 0). How can I get the best test error/accuracy? Can someone please help me? Thank you in advance.
clear all;
clc;
load vec; load target;
inputs = double(vec);
targets = double(target);
% Create a Pattern Recognition Network
hiddenLayerSize = 1;
net = patternnet(hiddenLayerSize);
% Choose Input and Output Pre/Post-Processing Functions
% For a list of all processing functions type: help nnprocess
net.inputs{1}.processFcns = {'removeconstantrows','mapstd'};
net.outputs{2}.processFcns = {'removeconstantrows','mapstd'};
% Setup Division of Data for Training, Validation, Testing
% For a list of all data division functions type: help nndivide
net.divideFcn = 'dividerand';
net.divideMode = 'sample'; % Divide up every sample
net.divideParam.trainRatio = 50/100;
net.divideParam.valRatio = 25/100;
net.divideParam.testRatio = 25/100;
% For a list of all training functions type: help nntrain
net.trainFcn = 'trainrp';
% Choose a Performance Function
% For a list of all performance functions type: help nnperformance
net.performFcn = 'mse';
% Choose Plot Functions
% For a list of all plot functions type: help nnplot
net.plotFcns = {'plotperform','plottrainstate','ploterrhist', ...
'plotregression', 'plotfit'};
% Train the Network
[net,tr] = train(net,inputs,targets);
% Test the Network
outputs = net(inputs);
errors = gsubtract(targets,outputs);
performance = perform(net,targets,outputs);
[tpr,fpr,thresholds] = roc(targets,outputs);
% Recalculate Training, Validation and Test Performance
trainTargets = targets .* tr.trainMask{1};
valTargets = targets .* tr.valMask{1};
testTargets = targets .* tr.testMask{1};
trainPerformance = perform(net,trainTargets,outputs);
valPerformance = perform(net,valTargets,outputs);
testPerformance = perform(net,testTargets,outputs);
% View the Network
view(net)
%Plots
% Uncomment these lines to enable various plots.
figure, plotperform(tr)
figure, plottrainstate(tr)
figure, plotconfusion(targets,outputs)
figure, ploterrhist(errors)
figure, plotregression(targets,outputs)
figure, plotroc(targets,outputs)
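
To report actual classification accuracy (not just MSE) for each data split, here is a minimal sketch, assuming the script above has just run so that outputs, targets, and tr exist, and that the targets are coded 0/1 as stated; the 0.5 threshold is an illustrative choice:

predicted = double(outputs >= 0.5);   % threshold continuous outputs to hard 0/1 labels
% tr.trainInd / tr.valInd / tr.testInd hold the sample indices of each split
trainAcc = mean(predicted(tr.trainInd) == targets(tr.trainInd));
valAcc   = mean(predicted(tr.valInd)   == targets(tr.valInd));
testAcc  = mean(predicted(tr.testInd)  == targets(tr.testInd));
fprintf('Train/Val/Test accuracy: %.3f / %.3f / %.3f\n', trainAcc, valAcc, testAcc)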

Best Answer

Obvious:
1. plot(x,t,'.') to estimate how much training data is really needed to adequately characterize the classes, AND to identify and remove or modify outliers.
2. Then the short answer is to increase the number of hidden nodes, H, AND, for each value of H, loop over multiple (10?) designs with different random initial weights (see the sketch below). For examples, search using
greg Hmax Ntrials
( where Hmax << Hub = -1 + ceil( (Ntrn-1)/3 ) ~ round(Ntrn/3); Hub is the largest H for which the number of unknown weights, Nw = (I+1)*H + (H+1)*O, does not exceed the number of training equations Ntrn*O, which reduces to ~Ntrn/3 here because I = O = 1 )
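A minimal sketch of that double loop, assuming the question's setup (patternnet, with inputs and targets already in the workspace); the candidate H range, Ntrials = 10, and auto-selecting the winner by validation performance are illustrative choices, not Greg's exact recipe:

Hvec    = 1:10;      % candidate hidden-layer sizes (illustrative; keep Hmax << Hub)
Ntrials = 10;        % random-weight restarts per value of H
bestPerf = Inf;
rng(0)               % reproducible random initializations
for H = Hvec
    for trial = 1:Ntrials
        net = patternnet(H);            % fresh net => new random weights at train time
        [net,tr] = train(net,inputs,targets);
        y = net(inputs);
        % rank designs by validation-subset performance
        % (the NaN mask excludes the other samples from the calculation)
        valPerf = perform(net, targets .* tr.valMask{1}, y);
        if valPerf < bestPerf
            bestPerf = valPerf;
            bestNet  = net;             % keep the best design so far
            bestTest = perform(net, targets .* tr.testMask{1}, y);
        end
    end
end
fprintf('Best val perf %.4g, its test perf %.4g\n', bestPerf, bestTest)

Greg's posts typically tabulate the results for every H/trial pair and inspect them rather than auto-selecting a single winner; the if-block above is just a compact stand-in for that.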
ADDITIONAL COMMENTS:
3. TRAINSCG is preferred for classification unless the necessary minimum value of Ntrn is huge; then TRAINRP is preferred.
4. It may be worthwhile (OR JUST INTERESTING) to
a. Compare the default training parameters of TRAINSCG and TRAINRP (see the snippet after this list).
b. See how large Ntrn can be before TRAINRP has to be used.
5. Delete or comment out all statements that specify values that are already defaults (these differ for SCG and RP).
6. If Ntrn is sufficiently large, Nval and Ntst will probably not add any new information.
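For 4a, one quick way to display each algorithm's default training parameters, assuming a current Deep Learning Toolbox (assigning trainFcn resets trainParam to that function's defaults):

net = patternnet;          % default pattern-recognition network
net.trainFcn = 'trainscg';
net.trainParam             % no semicolon: displays TRAINSCG defaults
net.trainFcn = 'trainrp';
net.trainParam             % no semicolon: displays TRAINRP defaults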
Hope this helps.
Thank you for formally accepting my answer
Greg