
BayesianOptimization

Bayesian optimization results

Description

A BayesianOptimization object contains the results of a Bayesian optimization. It is the output of bayesopt, or of a fit function that accepts the OptimizeHyperparameters name-value pair, such as fitcdiscr. In addition, a BayesianOptimization object contains data for each iteration of bayesopt that can be accessed by a plot function or an output function.

Creation

Create a BayesianOptimization object using the bayesopt function or a fit function with the OptimizeHyperparameters name-value pair.
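
A minimal sketch of the two creation routes (the variable names, the sine objective, and the use of the ionosphere data set here are illustrative, not prescribed):

```matlab
% Route 1: direct call to bayesopt with one optimizable variable
x = optimizableVariable('x',[-5,5]);
results = bayesopt(@(t)sin(t.x),x,'Verbose',0,'PlotFcn',[]);

% Route 2: a fit function with hyperparameter optimization; the object
% is stored in the HyperparameterOptimizationResults property of the model
load ionosphere
Mdl = fitcknn(X,Y,'OptimizeHyperparameters','auto');
results2 = Mdl.HyperparameterOptimizationResults;
```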

Properties

expand all

Problem Definition Properties

This property is read-only.

ObjectiveFcn argument that bayesopt used, returned as a function handle.

  • If you called bayesopt directly, ObjectiveFcn is the bayesopt objective function argument.

  • If you called a fit function with the OptimizeHyperparameters name-value pair, ObjectiveFcn is the logarithm of one plus the cross-validation loss.

Data Types: function_handle
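
The stored handle follows the bayesopt objective function signature: it accepts a 1-row table of variable values and returns a scalar objective. A minimal illustrative handle (the variable names a and b are assumptions):

```matlab
% x is a 1-row table whose columns are the optimizable variables
fun = @(x) (x.a - 1)^2 + x.b^2;
```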

This property is read-only.

VariableDescriptions argument that bayesopt used, returned as a vector of optimizableVariable objects.

  • If you called bayesopt directly, VariableDescriptions is the bayesopt variable description argument.

  • If you called a fit function with the OptimizeHyperparameters name-value pair, VariableDescriptions is the vector of hyperparameters.
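
For a direct call, the vector of optimizableVariable objects might be constructed as follows (the names, ranges, and log transform are illustrative):

```matlab
a = optimizableVariable('a',[-2,2]);
b = optimizableVariable('b',[1,100],'Transform','log');
vars = [a,b];   % this vector becomes VariableDescriptions
```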

This property is read-only.

Options that bayesopt used, returned as a structure.

  • If you called bayesopt directly, Options is the set of options that bayesopt used, which are the name-value pairs described in bayesopt Input Arguments.

  • If you called a fit function with the OptimizeHyperparameters name-value pair, Options are the default bayesopt options, modified by the HyperparameterOptimizationOptions name-value pair.

Options is a read-only structure containing the following fields.

AcquisitionFunctionName: Acquisition function name. See Acquisition Function Types.
IsObjectiveDeterministic: true means the objective function is deterministic; false otherwise.
ExplorationRatio: Used only when AcquisitionFunctionName is 'expected-improvement-plus' or 'expected-improvement-per-second-plus'. See Plus.

MaxObjectiveEvaluations: Objective function evaluation limit.
MaxTime: Time limit.

XConstraintFcn: Deterministic constraints on variables. See Deterministic Constraints — XConstraintFcn.
ConditionalVariableFcn: Conditional variable constraints. See Conditional Constraints — ConditionalVariableFcn.
NumCoupledConstraints: Number of coupled constraints. See Coupled Constraints.
CoupledConstraintTolerances: Coupled constraint tolerances. See Coupled Constraints.
AreCoupledConstraintsDeterministic: Logical vector specifying whether each coupled constraint is deterministic.

Verbose: Command-line display level.
OutputFcn: Function called after each iteration. See Bayesian Optimization Output Functions.
SaveVariableName: Variable name for the @assignInBase output function.
SaveFileName: File name for the @saveToFile output function.
PlotFcn: Plot function called after each iteration. See Bayesian Optimization Plot Functions.

InitialX: Points where bayesopt evaluated the objective function.
InitialObjective: Objective function values at InitialX.
InitialConstraintViolations: Coupled constraint function values at InitialX.
InitialErrorValues: Error values at InitialX.
InitialObjectiveEvaluationTimes: Objective function evaluation times at InitialX.
InitialIterationTimes: Time for each iteration, including objective function evaluation and other computations.

Data Types: struct
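
The fields can be read directly from the returned object; for example (assuming results is a BayesianOptimization object):

```matlab
opts = results.Options;
opts.AcquisitionFunctionName    % name of the acquisition function used
opts.MaxObjectiveEvaluations    % evaluation limit that was in effect
```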

Solution Properties

This property is read-only.

MinObjective: Minimum observed value of the objective function, returned as a real scalar. When there are coupled constraints or evaluation errors, this value is the minimum over all observed points that are feasible according to the final constraint and error models.

Data Types: double

This property is read-only.

XAtMinObjective: Observed point with the minimum objective function value, returned as a 1-by-D table, where D is the number of variables.

Data Types: table

This property is read-only.

MinEstimatedObjective: Minimum estimated value of the objective function, returned as a real scalar. MinEstimatedObjective uses the final objective model.

MinEstimatedObjective is the same as the CriterionValue result of bestPoint with the default criterion.

Data Types: double
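
The relationship to bestPoint can be checked directly (assuming results is a BayesianOptimization object):

```matlab
[x,CriterionValue] = bestPoint(results);
% With the default criterion, CriterionValue equals results.MinEstimatedObjective
```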

This property is read-only.

XAtMinEstimatedObjective: Point with the minimum estimated objective function value, returned as a 1-by-D table, where D is the number of variables. XAtMinEstimatedObjective uses the final objective model.

Data Types: table

This property is read-only.

NumObjectiveEvaluations: Number of objective function evaluations, returned as a positive integer. This count includes the initial evaluations used to form a posterior model as well as the evaluations during the optimization iterations.

Data Types: double

This property is read-only.

TotalElapsedTime: Total elapsed time of the optimization in seconds, returned as a positive scalar.

Data Types: double

This property is read-only.

NextPoint: Next point to evaluate if the optimization continues, returned as a 1-by-D table, where D is the number of variables.

Data Types: table
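
NextPoint is where resume would continue the optimization; a sketch (the evaluation count of 10 is illustrative; see resume for how the limit is interpreted):

```matlab
results.NextPoint                                   % point the next iteration would evaluate
more = resume(results,'MaxObjectiveEvaluations',10); % continue the optimization from NextPoint
```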

Trace Properties

This property is read-only.

XTrace: Points where the objective function was evaluated, returned as a T-by-D table, where T is the number of evaluation points and D is the number of variables.

Data Types: table

This property is read-only.

ObjectiveTrace: Objective function values, returned as a column vector of length T, where T is the number of evaluation points. ObjectiveTrace contains the history of objective function evaluations.

Data Types: double

This property is read-only.

ObjectiveEvaluationTimeTrace: Objective function evaluation times, returned as a column vector of length T, where T is the number of evaluation points. ObjectiveEvaluationTimeTrace includes the time spent evaluating coupled constraints, because the objective function computes these constraints.

Data Types: double

This property is read-only.

IterationTimeTrace: Iteration times, returned as a column vector of length T, where T is the number of evaluation points. IterationTimeTrace includes both objective function evaluation time and other overhead.

Data Types: double

This property is read-only.

ConstraintsTrace: Coupled constraint values, returned as a T-by-K array, where T is the number of evaluation points and K is the number of coupled constraints.

Data Types: double

This property is read-only.

ErrorTrace: Error indications, returned as a column vector of length T with entries of -1 or 1, where T is the number of evaluation points. Each 1 entry indicates that the objective function errored or returned NaN at the corresponding point in XTrace. Each -1 entry indicates that the objective function value was computed successfully.

Data Types: double

This property is read-only.

FeasibilityTrace: Feasibility indications, returned as a logical column vector of length T, where T is the number of evaluation points. Each true entry indicates that the final constraint model predicts feasibility at the corresponding point in XTrace.

Data Types: logical

This property is read-only.

FeasibilityProbabilityTrace: Probability that each evaluation point is feasible, returned as a column vector of length T, where T is the number of evaluation points. The probabilities come from the final constraint model, including the error constraint model, evaluated at the corresponding points in XTrace.

Data Types: double

This property is read-only.

IndexOfMinimumTrace: Index of the evaluation that gave the minimum feasible objective at each iteration, returned as a column vector of integer indices of length T, where T is the number of evaluation points. Feasibility is determined with respect to the constraint models that existed at each iteration, including the error constraint model.

Data Types: double

This property is read-only.

ObjectiveMinimumTrace: Minimum observed objective value at each iteration, returned as a column vector of length T, where T is the number of evaluation points.

Data Types: double

This property is read-only.

EstimatedObjectiveMinimumTrace: Minimum estimated objective value at each iteration, returned as a column vector of length T, where T is the number of evaluation points. The estimated objective at each iteration is determined with respect to the objective model that existed at that iteration.

Data Types: double

This property is read-only.

UserDataTrace: Auxiliary data from the objective function, returned as a cell array of length T, where T is the number of evaluation points. Each entry in the cell array is the UserData value returned in the third output of the objective function.

Data Types: cell
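
UserData originates from an objective function that returns three outputs; a sketch (the variable name a and the struct field are illustrative):

```matlab
function [objective,constraints,userData] = objfun(x)
% Objective function for bayesopt that returns auxiliary data
objective = x.a^2;                     % x is a 1-row table of variable values
constraints = [];                      % no coupled constraints
userData = struct('evaluatedAt',x.a);  % this value appears in UserDataTrace
end
```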

Object Functions

bestPoint: Best point in a Bayesian optimization according to a criterion
plot: Plot Bayesian optimization results
predictConstraints: Predict coupled constraint violations at a set of points
predictError: Predict error value at a set of points
predictObjective: Predict objective function at a set of points
predictObjectiveEvaluationTime: Predict objective function run times at a set of points
resume: Resume a Bayesian optimization
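
The prediction functions all take the object and a table of points. For example (assuming results comes from an optimization over two variables named a and b):

```matlab
XTable = table(0.5,2,'VariableNames',{'a','b'});
objPred = predictObjective(results,XTable);              % model-predicted objective
timePred = predictObjectiveEvaluationTime(results,XTable); % model-predicted run time
```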

Examples


This example shows how to create a BayesianOptimization object by using bayesopt to minimize cross-validation loss.

Optimize hyperparameters of a KNN classifier for the ionosphere data, that is, find KNN hyperparameters that minimize the cross-validation loss. Have bayesopt minimize over the following hyperparameters:

  • Nearest-neighborhood sizes from 1 to 30

  • Distance functions 'chebychev', 'euclidean', and 'minkowski'.

For reproducibility, set the random seed, set the partition, and set the AcquisitionFunctionName option to 'expected-improvement-plus'. To suppress iterative display, set Verbose to 0.

load ionosphere
rng default
num = optimizableVariable('n',[1,30],'Type','integer');
dst = optimizableVariable('dst',{'chebychev','euclidean','minkowski'},'Type','categorical');
c = cvpartition(351,'KFold',5);
fun = @(x)kfoldLoss(fitcknn(X,Y,'CVPartition',c,'NumNeighbors',x.n,...
    'Distance',char(x.dst),'NSMethod','exhaustive'));
results = bayesopt(fun,[num,dst],'Verbose',0,...
    'AcquisitionFunctionName','expected-improvement-plus')

results = 
  BayesianOptimization with properties:

                      ObjectiveFcn: [function_handle]
              VariableDescriptions: [1x2 optimizableVariable]
                           Options: [1x1 struct]
                      MinObjective: 0.1197
                   XAtMinObjective: [1x2 table]
             MinEstimatedObjective: 0.1213
          XAtMinEstimatedObjective: [1x2 table]
           NumObjectiveEvaluations: 30
                  TotalElapsedTime: 122.6786
                         NextPoint: [1x2 table]
                            XTrace: [30x2 table]
                    ObjectiveTrace: [30x1 double]
                  ConstraintsTrace: []
                     UserDataTrace: {30x1 cell}
      ObjectiveEvaluationTimeTrace: [30x1 double]
                IterationTimeTrace: [30x1 double]
                        ErrorTrace: [30x1 double]
                  FeasibilityTrace: [30x1 logical]
       FeasibilityProbabilityTrace: [30x1 double]
               IndexOfMinimumTrace: [30x1 double]
             ObjectiveMinimumTrace: [30x1 double]
    EstimatedObjectiveMinimumTrace: [30x1 double]
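
You can query the returned object for the best point found; for example:

```matlab
results.XAtMinObjective            % 1-by-2 table with variables n and dst
[x,CriterionValue] = bestPoint(results);
```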

This example shows how to minimize the cross-validation loss in the ionosphere data using Bayesian optimization of an SVM classifier.

Load the data.

load ionosphere

Optimize the classification using the 'auto' parameters.

rng default % For reproducibility
Mdl = fitcsvm(X,Y,'OptimizeHyperparameters','auto')

|=====================================================================================================|
| Iter | Eval   | Objective   | Objective   | BestSoFar   | BestSoFar   | BoxConstrain-|  KernelScale |
|      | result |             | runtime     | (observed)  | (estim.)    | t            |              |
|=====================================================================================================|
|    1 | Best   |     0.23362 |      59.037 |     0.23362 |     0.23362 |       64.836 |    0.0015729 |
|    2 | Accept |     0.35897 |     0.49245 |     0.23362 |     0.24142 |     0.036335 |       5.5755 |
|    3 | Best   |     0.13105 |       24.94 |     0.13105 |     0.14119 |    0.0022147 |    0.0023957 |
|    4 | Accept |     0.35897 |     0.28181 |     0.13105 |     0.13108 |       5.1259 |        98.62 |
|    5 | Accept |     0.35897 |     0.41605 |     0.13105 |     0.13108 |    0.0010046 |       994.21 |
|    6 | Best   |     0.12536 |     0.43468 |     0.12536 |     0.12539 |    0.0010108 |     0.020742 |
|    7 | Accept |     0.13105 |      4.2278 |     0.12536 |     0.12534 |     0.063079 |     0.023776 |
|    8 | Accept |     0.13105 |      1.0739 |     0.12536 |     0.12666 |    0.0030347 |     0.010003 |
|    9 | Accept |      0.1339 |     0.47299 |     0.12536 |     0.12677 |    0.0010473 |     0.012003 |
|   10 | Best   |     0.12536 |     0.20367 |     0.12536 |     0.12537 |    0.0020091 |     0.057191 |
|   11 | Accept |     0.12536 |     0.40847 |     0.12536 |     0.12402 |    0.0027812 |     0.034303 |
|   12 | Best   |     0.12251 |     0.31775 |     0.12251 |     0.12304 |    0.0010123 |     0.043059 |
|   13 | Accept |     0.12821 |     0.20358 |     0.12251 |     0.12405 |     0.001008 |      0.04533 |
|   14 | Accept |     0.12536 |      0.3849 |     0.12251 |     0.12444 |     0.003481 |     0.039997 |
|   15 | Accept |     0.12536 |     0.20608 |     0.12251 |     0.12451 |    0.0010197 |     0.037525 |
|   16 | Accept |     0.12536 |     0.37084 |     0.12251 |     0.12471 |    0.0024626 |     0.035317 |
|   17 | Accept |     0.12821 |     0.19632 |     0.12251 |     0.12492 |    0.0010223 |     0.038797 |
|   18 | Accept |     0.12536 |     0.38662 |     0.12251 |      0.1249 |     0.004966 |     0.048415 |
|   19 | Accept |     0.17094 |     0.18767 |     0.12251 |     0.12504 |    0.0010018 |      0.34195 |
|   20 | Accept |     0.35897 |     0.20972 |     0.12251 |     0.12507 |        108.3 |       997.68 |
|=====================================================================================================|
| Iter | Eval   | Objective   | Objective   | BestSoFar   | BestSoFar   | BoxConstrain-|  KernelScale |
|      | result |             | runtime     | (observed)  | (estim.)    | t            |              |
|=====================================================================================================|
|   21 | Accept |     0.12821 |     0.25725 |     0.12251 |       0.125 |    0.0010021 |     0.066273 |
|   22 | Accept |     0.12536 |     0.27293 |     0.12251 |     0.12493 |       998.99 |       37.817 |
|   23 | Accept |     0.13105 |      2.2909 |     0.12251 |     0.12493 |       999.95 |       4.6001 |
|   24 | Accept |     0.12251 |     0.32079 |     0.12251 |     0.12203 |       992.31 |       16.075 |
|   25 | Accept |     0.12536 |     0.43439 |     0.12251 |     0.12363 |        976.8 |       20.628 |
|   26 | Accept |     0.12536 |     0.49664 |     0.12251 |     0.12392 |       996.56 |       13.208 |
|   27 | Accept |     0.12821 |      0.2473 |     0.12251 |     0.12422 |       994.76 |        29.43 |
|   28 | Accept |     0.12536 |     0.48935 |     0.12251 |     0.12424 |    0.0054643 |     0.030792 |
|   29 | Accept |     0.12536 |     0.39315 |     0.12251 |     0.12444 |       997.35 |       15.802 |
|   30 | Accept |     0.12536 |     0.37616 |     0.12251 |     0.12459 |       999.99 |       21.296 |

__________________________________________________________
Optimization completed.
MaxObjectiveEvaluations of 30 reached.
Total function evaluations: 30
Total elapsed time: 162.2789 seconds.
Total objective function evaluation time: 100.0319

Best observed feasible point:
    BoxConstraint    KernelScale
    _____________    ___________

      0.0010123       0.043059  

Observed objective function value = 0.12251
Estimated objective function value = 0.12459
Function evaluation time = 0.31775

Best estimated feasible point (according to models):
    BoxConstraint    KernelScale
    _____________    ___________

       999.99          21.296   

Estimated objective function value = 0.12459
Estimated function evaluation time = 0.33706
Mdl = 
  ClassificationSVM
                         ResponseName: 'Y'
                CategoricalPredictors: []
                           ClassNames: {'b'  'g'}
                       ScoreTransform: 'none'
                      NumObservations: 351
    HyperparameterOptimizationResults: [1x1 BayesianOptimization]
                                Alpha: [93x1 double]
                                 Bias: -5.2947
                     KernelParameters: [1x1 struct]
                       BoxConstraints: [351x1 double]
                      ConvergenceInfo: [1x1 struct]
                      IsSupportVector: [351x1 logical]
                               Solver: 'SMO'



The fit achieved about 12% loss for the default 5-fold cross validation.

Examine the BayesianOptimization object that is returned in the HyperparameterOptimizationResults property of the returned model.

disp(Mdl.HyperparameterOptimizationResults)
  BayesianOptimization with properties:

                      ObjectiveFcn: @createObjFcn/theObjFcn
              VariableDescriptions: [5x1 optimizableVariable]
                           Options: [1x1 struct]
                      MinObjective: 0.1225
                   XAtMinObjective: [1x2 table]
             MinEstimatedObjective: 0.1246
          XAtMinEstimatedObjective: [1x2 table]
           NumObjectiveEvaluations: 30
                  TotalElapsedTime: 162.2789
                         NextPoint: [1x2 table]
                            XTrace: [30x2 table]
                    ObjectiveTrace: [30x1 double]
                  ConstraintsTrace: []
                     UserDataTrace: {30x1 cell}
      ObjectiveEvaluationTimeTrace: [30x1 double]
                IterationTimeTrace: [30x1 double]
                        ErrorTrace: [30x1 double]
                  FeasibilityTrace: [30x1 logical]
       FeasibilityProbabilityTrace: [30x1 double]
               IndexOfMinimumTrace: [30x1 double]
             ObjectiveMinimumTrace: [30x1 double]
    EstimatedObjectiveMinimumTrace: [30x1 double]
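
The stored object supports the same operations as one returned by bayesopt directly; for example:

```matlab
plot(Mdl.HyperparameterOptimizationResults,@plotMinObjective)   % minimum objective trace
[x,CriterionValue] = bestPoint(Mdl.HyperparameterOptimizationResults);
```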

Introduced in R2016b
