I'm trying to use Bayesian optimization for my custom neural network, but the tutorials don't make it clear how to use BayesOpt with my own network.
My current understanding is that I need to use the validation and training losses as inputs to the objective function, but I'm at a bit of a loss as to how to do that.
What I'm Trying to Tune:
- Training hyperparameters: max epochs (just to get in the neighborhood), mini-batch size, initial learning rate, etc.
- Number of hidden layers
- The size of my fully connected layers
Right now I'm iterating over these values by hand, but I'm looking for a more systematic approach (hence BayesOpt).
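My rough understanding of the first step is that bayesopt wants the search space declared as a vector of optimizableVariable objects, something like the following (the variable names and ranges here are just my guesses, not settled choices):

```matlab
% Sketch of the search space I think bayesopt expects; ranges are guesses.
optVars = [
    optimizableVariable('maxE',[10 200],'Type','integer')       % max epochs
    optimizableVariable('NHL',[1 10],'Type','integer')          % number of hidden layers
    optimizableVariable('fcls',[16 512],'Type','integer')       % fully connected layer size
    optimizableVariable('miniBatchSize',[32 256],'Type','integer')
    optimizableVariable('initialLearnRate',[1e-4 1e-1],'Transform','log')];
```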
This is what I have set up currently to iterate over different epoch counts, numbers of hidden layers, and fully connected layer sizes. I know I can add the other hyperparameters to trainingOptions, but these are all I'm iterating over at the moment; I'm leaving the rest at their default values.
Thoughts?
function [net,tr] = betNet(X,y,X_test,y_test,X_cv,y_cv,maxE,NHL,fcls)
% maxE = maximum epochs
% NHL  = number of hidden layers
% fcls = fully connected layer size

%% ===== Setting up DNN =====
ip   = sequenceInputLayer(size(X,1));
fcl1 = fullyConnectedLayer(fcls,'BiasInitializer','narrow-normal'); % hidden layers
fcl2 = fullyConnectedLayer(2,'BiasInitializer','ones');             % output layer

options = trainingOptions('adam', ...
    'MaxEpochs',maxE, ...
    'ExecutionEnvironment','gpu', ...
    'Shuffle','every-epoch', ...
    'MiniBatchSize',64, ...
    'ValidationFrequency',50, ...
    'ValidationData',{X_cv,y_cv});
    % 'Plots','training-progress', ...

% layers = [repmat(fcl,1,10) sigmoidLayer classificationLayer]
layers = [ip repmat(fcl1,1,NHL) fcl2 softmaxLayer classificationLayer];

%% ===== Training NN =====
[net,tr] = trainNetwork(X,y,layers,options);
end
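For what it's worth, this is roughly how I imagine wiring betNet into bayesopt, minimizing the last validation loss recorded in the training info struct tr (an untested sketch; the ranges and the helper name valLossFor are mine):

```matlab
% Untested sketch: minimize betNet's final validation loss with bayesopt.
vars = [
    optimizableVariable('maxE',[10 200],'Type','integer')
    optimizableVariable('NHL',[1 10],'Type','integer')
    optimizableVariable('fcls',[16 512],'Type','integer')];

% Capture the data sets in an anonymous function so the objective
% only takes the table of variable values that bayesopt passes in.
objFcn = @(v) valLossFor(v,X,y,X_test,y_test,X_cv,y_cv);

results = bayesopt(objFcn,vars,'MaxObjectiveEvaluations',30);

function loss = valLossFor(v,X,y,X_test,y_test,X_cv,y_cv)
    [~,tr] = betNet(X,y,X_test,y_test,X_cv,y_cv,v.maxE,v.NHL,v.fcls);
    % trainNetwork records validation loss only at validation points,
    % with NaNs elsewhere; take the last finite value as the objective.
    vl = tr.ValidationLoss(~isnan(tr.ValidationLoss));
    loss = vl(end);
end
```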
Best Answer