In case anybody else is looking for a solution: I used the crossval function to wrap the training of the decision tree. This way, implementing other loss functions is straightforward.

function [trainedClassifier, qualityMeasures] = trainDTwCrossVal(data, predictorNames, MaxNumSplits)
numberOfFolds = 5;
cp = cvpartition(data.typeBehavior, 'k', numberOfFolds);  % stratified k-fold partition
vals = crossval(@trainDT2, data, 'partition', cp);        % one row of testval per fold
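crossval calls the nested function once per fold and stacks the returned row vectors, so vals ends up as a numberOfFolds-by-3 matrix. The cross-validated estimate can then be taken as the column mean; a minimal sketch, following the variable names above:

```matlab
qualityMeasures = mean(vals, 1);  % [accuracy F1score MCC], averaged over the folds
```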
function testval = trainDT2(trainingData, testingData)
testval holds the quality measures of the prediction, derived from the confusion matrix C, which is computed inside the nested function after training the decision tree.
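The snippet below assumes a 2-by-2 confusion matrix C is already available. A minimal sketch of how it might be computed inside trainDT2 (fitctree and confusionmat are my assumption, not from the original post; typeBehavior is the response name from the question, and note that with confusionmat the off-diagonal labels depend on the class order):

```matlab
tree = fitctree(trainingData, 'typeBehavior');          % train on the training fold
predicted = predict(tree, testingData);                 % classify the held-out fold
C = confusionmat(testingData.typeBehavior, predicted);  % 2x2 for a binary problem
```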
TP = C(1,1);  FP = C(1,2);  FN = C(2,1);  TN = C(2,2);
% Guard against a zero denominator in the MCC (e.g. a fold with only one class)
if (TP+FP)*(TP+FN)*(TN+FP)*(TN+FN) == 0
    MCC = 0;
else
    MCC = (TP*TN - FP*FN) / ...
        sqrt( (TP+FP)*(TP+FN)*(TN+FP)*(TN+FN) );
end
accuracy = (TP+TN)/(TP+TN+FP+FN);
F1score  = 2*TP/(2*TP+FP+FN);
testval  = [accuracy F1score MCC];
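A call might then look like the following (the table name, file name, and predictor names are hypothetical, chosen only for illustration):

```matlab
T = readtable('behavior.csv');  % hypothetical data file with a typeBehavior column
[trainedClassifier, qualityMeasures] = trainDTwCrossVal(T, {'speed', 'duration'}, 10);
```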