
Bayesian optimization results

A `BayesianOptimization` object contains the results of a Bayesian optimization. It is the output of `bayesopt` or a fit function that accepts the `OptimizeHyperparameters` name-value pair, such as `fitcdiscr`. In addition, a `BayesianOptimization` object contains data for each iteration of `bayesopt` that can be accessed by a plot function or an output function.

Create a `BayesianOptimization` object using the `bayesopt` function or a fit function with the `OptimizeHyperparameters` name-value pair.

`ObjectiveFcn` — `ObjectiveFcn` argument that `bayesopt` used, returned as a function handle.

- If you called `bayesopt` directly, `ObjectiveFcn` is the `bayesopt` objective function argument.
- If you called a fit function with the `OptimizeHyperparameters` name-value pair, `ObjectiveFcn` is the logarithm of one plus the cross-validation loss.

**Data Types:** `function_handle`
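When the object comes from a fit function, the stored objective values can be converted back to cross-validation losses by inverting this transform. A minimal sketch, assuming `results` is such a `BayesianOptimization` object:

```matlab
% Objective is log(1 + cvLoss), so invert it elementwise:
cvLoss = exp(results.ObjectiveTrace) - 1;
```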

`VariableDescriptions` — `VariableDescriptions` argument that `bayesopt` used, returned as a vector of `optimizableVariable` objects.

- If you called `bayesopt` directly, `VariableDescriptions` is the `bayesopt` variable description argument.
- If you called a fit function with the `OptimizeHyperparameters` name-value pair, `VariableDescriptions` is the vector of hyperparameters.

`Options` — Options that `bayesopt` used, returned as a structure.

- If you called `bayesopt` directly, `Options` is the set of options used in `bayesopt`, which are its name-value pairs. See `bayesopt` Input Arguments.
- If you called a fit function with the `OptimizeHyperparameters` name-value pair, `Options` is the default `bayesopt` options, modified by the `HyperparameterOptimizationOptions` name-value pair.

`Options` is a read-only structure containing the following fields.

Option Name | Meaning
---|---
`AcquisitionFunctionName` | Acquisition function name. See Acquisition Function Types.
`IsObjectiveDeterministic` | `true` means the objective function is deterministic, `false` otherwise.
`ExplorationRatio` | Used only when `AcquisitionFunctionName` is `'expected-improvement-plus'` or `'expected-improvement-per-second-plus'`. See Plus.
`MaxObjectiveEvaluations` | Objective function evaluation limit.
`MaxTime` | Time limit.
`XConstraintFcn` | Deterministic constraints on variables. See Deterministic Constraints — XConstraintFcn.
`ConditionalVariableFcn` | Conditional variable constraints. See Conditional Constraints — ConditionalVariableFcn.
`NumCoupledConstraints` | Number of coupled constraints. See Coupled Constraints.
`CoupledConstraintTolerances` | Coupled constraint tolerances. See Coupled Constraints.
`AreCoupledConstraintsDeterministic` | Logical vector specifying whether each coupled constraint is deterministic.
`Verbose` | Command-line display level.
`OutputFcn` | Function called after each iteration. See Bayesian Optimization Output Functions.
`SaveVariableName` | Variable name for the `@assignInBase` output function.
`SaveFileName` | File name for the `@saveToFile` output function.
`PlotFcn` | Plot function called after each iteration. See Bayesian Optimization Plot Functions.
`InitialX` | Points where `bayesopt` evaluated the objective function.
`InitialObjective` | Objective function values at `InitialX`.
`InitialConstraintViolations` | Coupled constraint function values at `InitialX`.
`InitialErrorValues` | Error values at `InitialX`.
`InitialObjectiveEvaluationTimes` | Objective function evaluation times at `InitialX`.
`InitialIterationTimes` | Time for each iteration, including objective function evaluation and other computations.

**Data Types:** `struct`

`MinObjective` — Minimum observed value of the objective function, returned as a real scalar. When there are coupled constraints or evaluation errors, this value is the minimum over all observed points that are feasible according to the final constraint and error models.

**Data Types:** `double`

`XAtMinObjective` — Observed point with the minimum objective function value, returned as a `1`-by-`D` table, where `D` is the number of variables.

**Data Types:** `table`

`MinEstimatedObjective` — Minimum estimated value of the objective function, returned as a real scalar. `MinEstimatedObjective` uses the final objective model.

`MinEstimatedObjective` is the same as the `CriterionValue` result of `bestPoint` with the default criterion.

**Data Types:** `double`
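This equivalence can be checked directly. A sketch, assuming `results` is a `BayesianOptimization` object returned by `bayesopt`:

```matlab
% bestPoint with its default criterion returns the same value
% stored in results.MinEstimatedObjective
[x,CriterionValue] = bestPoint(results);
disp(CriterionValue - results.MinEstimatedObjective)  % difference should be essentially zero
```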

`XAtMinEstimatedObjective` — Point with the minimum estimated objective function value, returned as a `1`-by-`D` table, where `D` is the number of variables. `XAtMinEstimatedObjective` uses the final objective model.

**Data Types:** `table`

`NumObjectiveEvaluations` — Number of objective function evaluations, returned as a positive integer. This count includes the initial evaluations that form a posterior model as well as the evaluations during the optimization iterations.

**Data Types:** `double`

`TotalElapsedTime` — Total elapsed time of the optimization in seconds, returned as a positive scalar.

**Data Types:** `double`

`NextPoint` — Next point to evaluate if the optimization continues, returned as a `1`-by-`D` table, where `D` is the number of variables.

**Data Types:** `table`

`XTrace` — Points where the objective function was evaluated, returned as a `T`-by-`D` table, where `T` is the number of evaluation points and `D` is the number of variables.

**Data Types:** `table`

`ObjectiveTrace` — Objective function values, returned as a column vector of length `T`, where `T` is the number of evaluation points. `ObjectiveTrace` contains the history of objective function evaluations.

**Data Types:** `double`

`ObjectiveEvaluationTimeTrace` — Objective function evaluation times, returned as a column vector of length `T`, where `T` is the number of evaluation points. `ObjectiveEvaluationTimeTrace` includes the time spent evaluating coupled constraints, because the objective function computes these constraints.

**Data Types:** `double`

`IterationTimeTrace` — Iteration times, returned as a column vector of length `T`, where `T` is the number of evaluation points. `IterationTimeTrace` includes both objective function evaluation time and other overhead.

**Data Types:** `double`

`ConstraintsTrace` — Coupled constraint values, returned as a `T`-by-`K` array, where `T` is the number of evaluation points and `K` is the number of coupled constraints.

**Data Types:** `double`

`ErrorTrace` — Error indications, returned as a column vector of length `T` with entries of `-1` or `1`, where `T` is the number of evaluation points. Each `1` entry indicates that the objective function errored or returned `NaN` at the corresponding point in `XTrace`. Each `-1` entry indicates that the objective function value was computed.

**Data Types:** `double`

`FeasibilityTrace` — Feasibility indications, returned as a logical column vector of length `T`, where `T` is the number of evaluation points. Each `1` entry indicates that the final constraint model predicts feasibility at the corresponding point in `XTrace`.

**Data Types:** `logical`

`FeasibilityProbabilityTrace` — Probability that each evaluation point is feasible, returned as a column vector of length `T`, where `T` is the number of evaluation points. The probabilities come from the final constraint model, including the error constraint model, at the corresponding points in `XTrace`.

**Data Types:** `double`

`IndexOfMinimumTrace` — Which evaluation gave the minimum feasible objective, returned as a column vector of integer indices of length `T`, where `T` is the number of evaluation points. Feasibility is determined with respect to the constraint models that existed at each iteration, including the error constraint model.

**Data Types:** `double`

`ObjectiveMinimumTrace` — Minimum observed objective, returned as a column vector of length `T`, where `T` is the number of evaluation points.

**Data Types:** `double`

`EstimatedObjectiveMinimumTrace` — Minimum estimated objective, returned as a column vector of length `T`, where `T` is the number of evaluation points. The estimated objective at each iteration is determined with respect to the objective model that existed at that iteration.

**Data Types:** `double`

`UserDataTrace` — Auxiliary data from the objective function, returned as a cell array of length `T`, where `T` is the number of evaluation points. Each entry in the cell array is the `UserData` returned in the third output of the objective function.

**Data Types:** `cell`

Function | Description
---|---
`bestPoint` | Best point in a Bayesian optimization according to a criterion
`plot` | Plot Bayesian optimization results
`predictConstraints` | Predict coupled constraint violations at a set of points
`predictError` | Predict error value at a set of points
`predictObjective` | Predict objective function at a set of points
`predictObjectiveEvaluationTime` | Predict objective function run times at a set of points
`resume` | Resume a Bayesian optimization
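As a sketch of how several of these object functions work together (assuming `results` is a `BayesianOptimization` object whose objective function is still available):

```matlab
% Predict the model objective at the point bayesopt would try next
objAtNext = predictObjective(results, results.NextPoint);

% Continue the same optimization for 10 more evaluations
results2 = resume(results, 'MaxObjectiveEvaluations', ...
    results.NumObjectiveEvaluations + 10);
```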

**Create a `BayesianOptimization` Object Using `bayesopt`**

This example shows how to create a `BayesianOptimization` object by using `bayesopt` to minimize cross-validation loss.

Optimize hyperparameters of a KNN classifier for the `ionosphere` data, that is, find KNN hyperparameters that minimize the cross-validation loss. Have `bayesopt` minimize over the following hyperparameters:

- Nearest-neighborhood sizes from 1 to 30
- Distance functions `'chebychev'`, `'euclidean'`, and `'minkowski'`

For reproducibility, set the random seed, set the partition, and set the `AcquisitionFunctionName` option to `'expected-improvement-plus'`. Set options to suppress the iterative display.

```
load ionosphere
rng default
num = optimizableVariable('n',[1,30],'Type','integer');
dst = optimizableVariable('dst',{'chebychev','euclidean','minkowski'},'Type','categorical');
c = cvpartition(351,'Kfold',5);
fun = @(x)kfoldLoss(fitcknn(X,Y,'CVPartition',c,'NumNeighbors',x.n,...
    'Distance',char(x.dst),'NSMethod','exhaustive'));
results = bayesopt(fun,[num,dst],'Verbose',0,...
    'AcquisitionFunctionName','expected-improvement-plus')
```

```
results = 

  BayesianOptimization with properties:

                      ObjectiveFcn: [function_handle]
              VariableDescriptions: [1x2 optimizableVariable]
                           Options: [1x1 struct]
                      MinObjective: 0.1197
                   XAtMinObjective: [1x2 table]
             MinEstimatedObjective: 0.1213
          XAtMinEstimatedObjective: [1x2 table]
           NumObjectiveEvaluations: 30
                  TotalElapsedTime: 125.2957
                         NextPoint: [1x2 table]
                            XTrace: [30x2 table]
                    ObjectiveTrace: [30x1 double]
                  ConstraintsTrace: []
                     UserDataTrace: {30x1 cell}
      ObjectiveEvaluationTimeTrace: [30x1 double]
                IterationTimeTrace: [30x1 double]
                        ErrorTrace: [30x1 double]
                  FeasibilityTrace: [30x1 logical]
       FeasibilityProbabilityTrace: [30x1 double]
               IndexOfMinimumTrace: [30x1 double]
             ObjectiveMinimumTrace: [30x1 double]
    EstimatedObjectiveMinimumTrace: [30x1 double]
```

**Create a `BayesianOptimization` Object Using a Fit Function**

This example shows how to minimize the cross-validation loss in the `ionosphere` data using Bayesian optimization of an SVM classifier.

Load the data.

```
load ionosphere
```

Optimize the classification using the `'auto'` parameters.

```
rng default % For reproducibility
Mdl = fitcsvm(X,Y,'OptimizeHyperparameters','auto')
```

```
|====================================================================================================|
| Iter | Eval   | Objective  | Objective  | BestSoFar  | BestSoFar  | BoxConstraint |  KernelScale  |
|      | result |            | runtime    | (observed) | (estim.)   |               |               |
|====================================================================================================|
|    1 | Best   |    0.23362 |     210.54 |    0.23362 |    0.23362 |        64.836 |     0.0015729 |
|    2 | Accept |    0.35897 |     2.2563 |    0.23362 |    0.24142 |      0.036335 |        5.5755 |
|    3 | Best   |    0.13105 |     92.662 |    0.13105 |    0.14119 |     0.0022147 |     0.0023957 |
|    4 | Accept |    0.35897 |     1.5797 |    0.13105 |    0.13108 |        5.1259 |         98.62 |
|    5 | Accept |    0.13675 |     177.69 |    0.13105 |    0.13111 |     0.0011599 |     0.0010098 |
|    6 | Accept |    0.13675 |     4.4416 |    0.13105 |    0.13116 |     0.0010151 |     0.0059137 |
|    7 | Accept |     0.1339 |     108.07 |    0.13105 |    0.13121 |     0.0010281 |     0.0027003 |
|    8 | Accept |     0.1339 |     151.29 |    0.13105 |    0.13121 |      0.015851 |     0.0013371 |
|    9 | Best   |    0.12821 |     60.235 |    0.12821 |    0.12831 |      0.024752 |     0.0092557 |
|   10 | Accept |     0.1339 |     114.15 |    0.12821 |    0.12842 |      0.017889 |     0.0048413 |
|   11 | Accept |    0.13675 |      2.679 |    0.12821 |    0.12843 |      0.037397 |      0.047282 |
|   12 | Accept |    0.14815 |     222.74 |    0.12821 |    0.12839 |        937.55 |       0.15836 |
|   13 | Accept |    0.12821 |     77.982 |    0.12821 |     0.1282 |       0.43011 |       0.03474 |
|   14 | Accept |     0.1339 |     12.453 |    0.12821 |    0.12852 |      0.034996 |      0.020241 |
|   15 | Accept |     0.1339 |     20.569 |    0.12821 |    0.12864 |       0.83537 |      0.077285 |
|   16 | Accept |    0.13105 |      57.42 |    0.12821 |    0.12975 |       0.51239 |      0.036016 |
|   17 | Accept |     0.1339 |    0.50257 |    0.12821 |    0.12984 |     0.0010223 |      0.032979 |
|   18 | Accept |     0.1339 |    0.63509 |    0.12821 |     0.1298 |      0.001054 |      0.090485 |
|   19 | Best   |    0.11966 |    0.68497 |    0.11966 |    0.12734 |     0.0010089 |      0.058999 |
|   20 | Accept |    0.12251 |    0.50622 |    0.11966 |    0.12558 |     0.0010217 |      0.053554 |
|====================================================================================================|
| Iter | Eval   | Objective  | Objective  | BestSoFar  | BestSoFar  | BoxConstraint |  KernelScale  |
|      | result |            | runtime    | (observed) | (estim.)   |               |               |
|====================================================================================================|
|   21 | Accept |    0.12251 |    0.60533 |    0.11966 |     0.1219 |     0.0010938 |      0.055296 |
|   22 | Accept |    0.11966 |    0.71648 |    0.11966 |    0.12123 |     0.0010154 |      0.057455 |
|   23 | Accept |     0.1453 |    0.46312 |    0.11966 |    0.12125 |     0.0057218 |       0.42819 |
|   24 | Accept |    0.12821 |    0.64664 |    0.11966 |    0.12128 |      0.097947 |       0.28975 |
|   25 | Best   |    0.11681 |    0.79586 |    0.11681 |    0.11702 |       0.33944 |       0.45025 |
|   26 | Accept |    0.13105 |      1.152 |    0.11681 |    0.11767 |        2.0269 |       0.42751 |
|   27 | Accept |    0.12251 |     0.5282 |    0.11681 |    0.11862 |        0.1678 |       0.53854 |
|   28 | Accept |    0.12821 |    0.45155 |    0.11681 |    0.12244 |       0.19775 |       0.42055 |
|   29 | Accept |    0.12821 |    0.86329 |    0.11681 |    0.12277 |       0.24871 |        0.4694 |
|   30 | Accept |    0.26211 |    0.31681 |    0.11681 |    0.12204 |     0.0063459 |         1.101 |

__________________________________________________________
Optimization completed.
MaxObjectiveEvaluations of 30 reached.
Total function evaluations: 30
Total elapsed time: 1662.9787 seconds.
Total objective function evaluation time: 1325.6377

Best observed feasible point:
    BoxConstraint    KernelScale
    _____________    ___________

       0.33944         0.45025  

Observed objective function value = 0.11681
Estimated objective function value = 0.12204
Function evaluation time = 0.79586

Best estimated feasible point (according to models):
    BoxConstraint    KernelScale
    _____________    ___________

      0.0010154       0.057455  

Estimated objective function value = 0.12204
Estimated function evaluation time = 0.6057


Mdl = 

  ClassificationSVM
                         ResponseName: 'Y'
                CategoricalPredictors: []
                           ClassNames: {'b'  'g'}
                       ScoreTransform: 'none'
                      NumObservations: 351
    HyperparameterOptimizationResults: [1x1 BayesianOptimization]
                                Alpha: [116x1 double]
                                 Bias: -2.6346
                     KernelParameters: [1x1 struct]
                       BoxConstraints: [351x1 double]
                      ConvergenceInfo: [1x1 struct]
                      IsSupportVector: [351x1 logical]
                               Solver: 'SMO'
```

The fit achieved about 12% loss with the default 5-fold cross-validation.

Examine the `BayesianOptimization` object that is returned in the `HyperparameterOptimizationResults` property of the returned model.

```
disp(Mdl.HyperparameterOptimizationResults)
```

```
  BayesianOptimization with properties:

                      ObjectiveFcn: @createObjFcn/theObjFcn
              VariableDescriptions: [5x1 optimizableVariable]
                           Options: [1x1 struct]
                      MinObjective: 0.1168
                   XAtMinObjective: [1x2 table]
             MinEstimatedObjective: 0.1220
          XAtMinEstimatedObjective: [1x2 table]
           NumObjectiveEvaluations: 30
                  TotalElapsedTime: 1.6630e+03
                         NextPoint: [1x2 table]
                            XTrace: [30x2 table]
                    ObjectiveTrace: [30x1 double]
                  ConstraintsTrace: []
                     UserDataTrace: {30x1 cell}
      ObjectiveEvaluationTimeTrace: [30x1 double]
                IterationTimeTrace: [30x1 double]
                        ErrorTrace: [30x1 double]
                  FeasibilityTrace: [30x1 logical]
       FeasibilityProbabilityTrace: [30x1 double]
               IndexOfMinimumTrace: [30x1 double]
             ObjectiveMinimumTrace: [30x1 double]
    EstimatedObjectiveMinimumTrace: [30x1 double]
```
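The trace properties lend themselves to quick diagnostics. For example, a sketch that plots the observed against the estimated minima over the iterations:

```matlab
res = Mdl.HyperparameterOptimizationResults;
iters = 1:res.NumObjectiveEvaluations;
plot(iters, res.ObjectiveMinimumTrace, '-', ...
     iters, res.EstimatedObjectiveMinimumTrace, '--')
xlabel('Iteration')
ylabel('Minimum objective')
legend('Observed minimum','Estimated minimum')
```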

`bayesopt` | `fitcdiscr` | `fitcecoc` | `fitcensemble` | `fitcknn` | `fitclinear` | `fitcnb` | `fitcsvm` | `fitctree` | `fitrensemble` | `fitrgp` | `fitrlinear` | `fitrsvm` | `fitrtree`
