Objective function in the Bayesian optimization algorithm used by fitrsvm and fitrgp
What is the mathematical objective function in the Bayesian optimization algorithm? The documentation says that functions like fitrsvm try to minimize log(1 + cross-validation loss), but what is the actual mathematical formula?
Is it possible to change the objective function to just the MSE?
Don Mathis on 13 May 2019
This page says that the loss defaults to MSE, so that is the loss used in the log(1 + cvloss) formula. The cross-validated loss is the loss summed over all the held-out validation sets; the default when using hyperparameter optimization is 5-fold cross-validation.
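Putting that together, under the defaults (5-fold cross-validation, MSE regression loss) the objective minimized over the hyperparameters λ can be written as follows. This is a sketch of my reading of the description above, not notation from the MATLAB documentation; note also that kfoldLoss may weight folds rather than take a plain sum:

```latex
\mathrm{objective}(\lambda) \;=\; \log\!\bigl(1 + L_{\mathrm{CV}}(\lambda)\bigr),
\qquad
L_{\mathrm{CV}}(\lambda) \;=\; \sum_{k=1}^{5} \frac{1}{|V_k|} \sum_{i \in V_k}
\bigl(y_i - \hat{f}_{\lambda}^{(-k)}(x_i)\bigr)^2
```

Here \(V_k\) is the k-th held-out validation fold and \(\hat{f}_{\lambda}^{(-k)}\) is the model trained with hyperparameters λ on the other four folds.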
There is no option to change the hyperparameter optimization objective function away from log(1 + cvloss); you would need to edit the source code to do that. The source file is matlab\toolbox\stats\classreg\+classreg\+learning\+paramoptim\createObjFcn.m. Look for the call to the log1p function.
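As a rough illustration of the transform (written in Python rather than MATLAB, and not the actual createObjFcn code), the wrapping applied to the cross-validated loss amounts to a log1p call. The fold values below are made-up numbers for demonstration:

```python
import math

def hyperopt_objective(cv_loss):
    """Sketch of the log(1 + cvloss) transform described above,
    computed with log1p for numerical stability near zero."""
    return math.log1p(cv_loss)

# Hypothetical 5-fold cross-validated MSE values, summed over folds
fold_mse = [0.12, 0.15, 0.10, 0.13, 0.11]
cv_loss = sum(fold_mse)

obj = hyperopt_objective(cv_loss)
```

Because log1p is strictly increasing, minimizing log(1 + cvloss) selects the same hyperparameters as minimizing the cross-validated MSE itself; the transform only compresses large loss values, which can make the Gaussian-process surrogate in Bayesian optimization better behaved.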