Hello,
since there is no hyperparameter tuning function for neural networks, I wanted to try the bayesopt function. I tried to recreate the example here: https://de.mathworks.com/help/stats/bayesian-optimization-case-study.html, but it does not work. Is there a way to tune the number of hidden neurons? My code fails:
[m,n] = size(Daten);
P = 0.7;
Training = Daten(1:round(P*m),:);
Testing  = Daten(round(P*m)+1:end,:);
XTrain = Training(:,1:n-1);
YTrain = Training(:,n);
XTest  = Testing(:,1:n-1);
YTest  = Testing(:,n);
c = cvpartition(YTrain,'KFold',10);
hiddenLayerSize = optimizableVariable('hiddenLayerSize',[0,20]);
minfn = @(z)kfoldLoss(fitnet(XTrain,YTrain,'CVPartition',c, ...
    'hiddenLayerSize',z.hiddenLayerSize));
results = bayesopt(minfn,hiddenLayerSize,'IsObjectiveDeterministic',true, ...
    'AcquisitionFunctionName','expected-improvement-plus');
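For reference, the likely problems with the call above: fitnet belongs to the Deep Learning Toolbox and its signature is fitnet(hiddenSizes,trainFcn), so it does not accept name-value pairs such as 'CVPartition' the way fitcsvm does, and kfoldLoss only works on the cross-validated model objects returned by the Statistics Toolbox fitc*/fitr* functions. One way around this is to cross-validate the network manually inside the objective function. The sketch below assumes the variables from the question (XTrain, YTrain, c); the helper name crossValNet is made up for illustration, and 'IsObjectiveDeterministic' is set to false because network training starts from random weights. A minimal, untested sketch:

```matlab
% Hidden layer size must be a positive integer, so declare it as such.
hiddenLayerSize = optimizableVariable('hiddenLayerSize',[1,20],'Type','integer');

% Objective: mean cross-validation MSE for a given hidden layer size.
minfn = @(z)crossValNet(z.hiddenLayerSize, XTrain, YTrain, c);

results = bayesopt(minfn, hiddenLayerSize, ...
    'IsObjectiveDeterministic', false, ...   % training is randomly initialized
    'AcquisitionFunctionName', 'expected-improvement-plus');

function loss = crossValNet(hiddenLayerSize, X, Y, c)
% Average test-fold MSE of a fitnet over the folds of cvpartition c.
% fitnet/train expect samples in columns, hence the transposes.
loss = 0;
for k = 1:c.NumTestSets
    trIdx = training(c,k);
    teIdx = test(c,k);
    net = fitnet(hiddenLayerSize);
    net.trainParam.showWindow = false;       % suppress the training GUI
    net = train(net, X(trIdx,:)', Y(trIdx,:)');
    pred = net(X(teIdx,:)');
    loss = loss + mean((pred - Y(teIdx,:)').^2);
end
loss = loss / c.NumTestSets;
end
```

After the run, results.XAtMinObjective.hiddenLayerSize holds the best size found. Note also that for a regression target, cvpartition(YTrain,'KFold',10) treats YTrain as class labels; cvpartition(size(YTrain,1),'KFold',10) may be what is intended.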