There are two ways you could do this; the second one uses the "bayesopt" function.
(1) The easiest way is to use the "fitrgp" function. There is an example of how to optimize a fitrgp hyperparameter (Sigma) on the "fitrgp" documentation page.
To optimize 'KernelFunction', using default values for all other hyperparameters, you could do the following (using data from the documentation example):
load(fullfile(matlabroot,'examples','stats','gprdata2.mat'))
model1 = fitrgp(x,y,'OptimizeHyperparameters',{'KernelFunction'});
Because fitrgp tends to be sensitive to the Sigma parameter, you will probably get a better result if you optimize KernelFunction and Sigma simultaneously. Also, searching over categorical variables inherently requires more function evaluations, so you should probably run the optimization longer:
model2 = fitrgp(x,y,'OptimizeHyperparameters',{'KernelFunction','Sigma'},...
'HyperparameterOptimizationOptions', struct('MaxObjectiveEvaluations',100));
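To check whether the joint search actually helped, you can compare the cross-validated loss of the two fitted models. This is just a quick sketch using the standard "crossval" and "kfoldLoss" methods on the models fitted above:

```matlab
% Cross-validate both fitted GP models (default 10-fold partition)
cv1 = crossval(model1);
cv2 = crossval(model2);
loss1 = kfoldLoss(cv1)   % CV loss with KernelFunction optimized alone
loss2 = kfoldLoss(cv2)   % CV loss with KernelFunction and Sigma optimized jointly
```

The model with the lower cross-validated loss is the one you would expect to generalize better.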
The "fitrgp" function documentation page lists the hyperparameters that are eligible for optimization and the values that are searched.
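You can also query those eligible hyperparameters programmatically with the "hyperparameters" function, which returns an array of optimizableVariable objects describing the search ranges. A sketch using the same x and y as above:

```matlab
% List the searchable hyperparameters and their default search ranges
vars = hyperparameters('fitrgp', x, y);
for k = 1:numel(vars)
    fprintf('%s (optimized by default: %d)\n', vars(k).Name, vars(k).Optimize)
end
```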
You can use this model to make predictions on unseen data directly with the "predict" function:
xNew = 0.2
ypred = predict(model2, xNew)
(2) A second approach is to use the "bayesopt" function directly, defining your own objective function. Here, the objective function is the 5-fold cross-validation loss of the model.
load(fullfile(matlabroot,'examples','stats','gprdata2.mat'))
kernel = optimizableVariable('KernelFunction',{'exponential','squaredexponential','matern32','matern52',...
'rationalquadratic','ardexponential','ardsquaredexponential','ardmatern32','ardmatern52','ardrationalquadratic'},...
'Type','categorical')
bo = bayesopt(@(T)objFcn(T,x,y), kernel)
function Loss = objFcn(Vars, x, y)
m = fitrgp(x, y, 'KernelFunction', char(Vars.KernelFunction), 'KFold', 5);
Loss = kfoldLoss(m);
end
(2.1) To optimize both KernelFunction and Sigma using "bayesopt" function, you can do the following:
kernel = optimizableVariable('KernelFunction',{'exponential','squaredexponential','matern32','matern52',...
'rationalquadratic','ardexponential','ardsquaredexponential','ardmatern32','ardmatern52','ardrationalquadratic'},...
'Type','categorical')
sigma = optimizableVariable('Sigma',[1e-4,10],'Transform','log')
bo = bayesopt(@(T)objFcn(T,x,y), [sigma, kernel], 'MaxObjectiveEvaluations', 100)
function Loss = objFcn(Vars, x, y)
m = fitrgp(x, y, 'KernelFunction', char(Vars.KernelFunction), ...
'Sigma', Vars.Sigma, 'ConstantSigma', true,...
'KFold', 5);
Loss = kfoldLoss(m);
end
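After the optimization finishes, you can inspect the result. The properties and the "bestPoint" function below are part of the standard BayesianOptimization results interface:

```matlab
bo.MinObjective                    % best observed cross-validation loss
bo.XAtMinObjective                 % table with the best Sigma and KernelFunction
[best, critVal] = bestPoint(bo)    % model-based estimate of the best point
```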
To get a model that can make predictions, you need to fit a model without passing the 'KFold' argument, but keeping the optimal hyperparameters obtained from the "bayesopt" function:
kernel = bo.XAtMinObjective.KernelFunction
sigma = bo.XAtMinObjective.Sigma
model = fitrgp(x, y, 'KernelFunction', char(kernel), 'Sigma', sigma, 'ConstantSigma', true);
xNew = 0.2
ypred = predict(model, xNew)