MATLAB: MLP Neural network and k-fold cross validation

MLP neural network

I want to train and test an MLP neural network using k-fold cross validation, and train the network with a differential evolution algorithm ('traindiffevol'). The network should predict breast cancer.
First, is my code correct regarding training and testing with k-fold cross validation?
Second, I couldn't figure out how to set NET.trainFcn to 'traindiffevol'. Could anyone help me?
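Regarding the second question: 'traindiffevol' is not one of the built-in training functions (trainlm, trainscg, trainbr, ...). Setting net.trainFcn to a custom name only works if a correspondingly named M-file implementing the toolbox's training-function interface is on the MATLAB path. A hedged sketch, assuming such a file traindiffevol.m exists (it is not part of MATLAB):

```matlab
% Hypothetical: requires a file traindiffevol.m on the MATLAB path that
% implements the Neural Network Toolbox training-function interface.
net = patternnet(10);
net.trainFcn = 'traindiffevol';   % M-file name, without the .m extension
% If no such file exists, train() will error. A built-in fallback:
% net.trainFcn = 'trainscg';      % scaled conjugate gradient (patternnet default)
```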
close all, clear all, clc, plt=0;
tic
k = 10 % k-fold
%%%%%%%%%%%%%%%%%%%%%%%%%%

[x,t] = iris_dataset;
if iscell(x) % test the loaded data, not the loader function itself
x = cell2mat(x); % convert cell array to numeric array
t = cell2mat(t);
end
[ I N ] = size(x) % [4 150] for iris_dataset (the [1 94] in the original comments came from a different, 1-D dataset)
[O N ] = size(t) % [3 150]
MSE00 = mean(var(t',1)) % Biased reference MSE
MSE00a = mean(var(t',0)) % Unbiased reference: MSE adjusted for the loss in estimation degrees of freedom caused by evaluating the MSE on the same data used to build the model
whos
%%%%%%%%%%%%%%%%%%%%%%%%%%%
rng('default') % Or substitute your lucky number
ind0 = randperm(N);
% ind0 = 1:N; % For debugging
M = floor(N/k) %

Ntrn = N-2*M % length(trnind)
Ntrneq = Ntrn*O % No. of training equations
H = 10;
Nw = (I+1)*H+(H+1)*O % No. of unknown weights
Ndof = Ntrneq-Nw % No. of estimation degrees of freedom
MSEgoal = 0.01*MSE00 %
MinGrad = MSEgoal/100 %
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%



% Create a Pattern Recognition Network
net = patternnet(H);
net.trainParam.goal = MSEgoal;
net.trainParam.min_grad = MinGrad;
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Input and Output Pre/Post-Processing Functions
%net.inputs{1}.processFcns = {'removeconstantrows','mapminmax'};
% net.outputs{2}.processFcns = {'removeconstantrows','mapminmax'};
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
net.divideFcn = 'divideind';
cvFolds = crossvalind('Kfold', size(t,2), k); %# get indices of 10-fold CV (crossvalind requires the Bioinformatics Toolbox)
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
for i = 1:k %# for each fold
rngstate(i) = rng;
net = configure(net,x,t);
testIdx = (cvFolds == i); %# logical mask of test instances for this fold
trainIdx = ~testIdx; %# remaining instances are used for training
trInd = find(trainIdx);
tstInd = find(testIdx);
net.trainParam.epochs = 100;
net.divideParam.trainInd = trInd;
net.divideParam.valInd = []; %# no validation fold is assigned
net.divideParam.testInd = tstInd;
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Choose a Performance Function
net.performFcn = 'mse'; % Mean squared error
% Train the Network
[net,tr] = train( net, x, t );
%# test using test instances
outputs = net(x);
errors = gsubtract(t,outputs);
performance = perform(net,t,outputs)
trainTargets = t .* tr.trainMask{1};
testTargets = t .* tr.testMask{1};
trainPerformance = perform(net,trainTargets,outputs)
testPerformance = perform(net,testTargets,outputs)
test(i)=testPerformance; % index by fold i; indexing by k overwrote a single element every iteration
%%%%%%%%%%%%%%%%%%%%%%%%%%
save net % note: this overwrites the saved network on every fold
stopcrit{i,1} = tr.stop;
bestepoch(i,1) = tr.best_epoch;
R2trn(i,1) = 1 - tr.best_perf/MSE00;
R2trna(i,1) = 1 - (Ntrneq/Ndof)* tr.best_perf/MSE00a;
R2val(i,1) = 1 - tr.best_vperf/MSE00; % only meaningful if a validation set was assigned
R2tst(i,1) = 1 - tr.best_tperf/MSE00;
end
figure, plotconfusion(t,outputs) % confusion matrix for the last fold's network only
accuracy=mean(test); % mean test-set MSE over the folds (not classification accuracy)
% View the Network
view(net)
stopcrit = stopcrit
result = [ bestepoch R2trn R2trna R2val R2tst]
minresult = min(result)
meanresult = mean(result)
medresult = median(result)
stdresult = std(result)
maxresult = max(result)
Elapsedtime = toc %3.87 sec
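For reference, the fold loop above can be restructured so that each fold's test performance is stored correctly. A sketch, assuming the Statistics and Machine Learning Toolbox is available for cvpartition (none of the variable names below are from the original code):

```matlab
% k-fold CV over the iris data using cvpartition instead of crossvalind
[x, t] = iris_dataset;
k = 10;
cvp = cvpartition(size(t,2), 'KFold', k);
testPerf = zeros(k,1);
for i = 1:k
    tstInd = find(test(cvp, i));       % indices of the i-th test fold
    trnInd = find(training(cvp, i));   % remaining indices for training
    net = patternnet(10);
    net.divideFcn = 'divideind';
    net.divideParam.trainInd = trnInd;
    net.divideParam.valInd   = [];     % no validation fold
    net.divideParam.testInd  = tstInd;
    [net, tr] = train(net, x, t);
    testPerf(i) = tr.best_tperf;       % store THIS fold's test MSE
end
meanTestPerf = mean(testPerf)
```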

Best Answer

No. Your code is not correct. You have chosen a HIGH-DIMENSIONAL CLASSIFICATION DATASET for which that version of my code is inappropriate.
Go to
help nndatasets
doc nndatasets
and choose a regression/curve-fitting dataset with much lower dimensions.
After that you might want to consider a low-dimensional classification/pattern-recognition dataset.
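A minimal sketch of that suggestion, using simplefit_dataset (one of the 1-D curve-fitting sets listed in nndatasets; this example is mine, not part of the answer above):

```matlab
% Load a low-dimensional regression dataset and fit it
[x, t] = simplefit_dataset;   % x and t are each 1-by-94
net = fitnet(10);             % fitnet (not patternnet) for regression/curve fitting
[net, tr] = train(net, x, t);
```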
Hope this helps.
Thank you for formally accepting my answer
Greg