
regularize

Class: RegressionEnsemble

Find weights to minimize resubstitution error plus penalty term

Syntax

ens1 = regularize(ens)
ens1 = regularize(ens,Name,Value)

Description

ens1 = regularize(ens) finds optimal weights for learners in ens by lasso regularization. regularize returns a regression ensemble identical to ens, but with a populated Regularization property.

ens1 = regularize(ens,Name,Value) computes optimal weights with additional options specified by one or more Name,Value pair arguments. You can specify several name-value pair arguments in any order as Name1,Value1,…,NameN,ValueN.
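For example, a minimal sketch of the typical workflow, assuming X and Y are your predictor matrix and response vector:

ens = fitrensemble(X,Y,'Method','Bag');  % train a bagged regression ensemble
ens = regularize(ens);                   % lasso weights over the default lambda grid
ens.Regularization                       % inspect the trained weights and MSE per lambda

You can then pass the regularized ensemble to shrink to keep only the learners with nonzero weights, as in the example below.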

Input Arguments

ens

A regression ensemble, created by fitrensemble.

Name-Value Pair Arguments

Specify optional comma-separated pairs of Name,Value arguments. Name is the argument name and Value is the corresponding value. Name must appear inside single quotes (' '). You can specify several name and value pair arguments in any order as Name1,Value1,...,NameN,ValueN.

'lambda'

Vector of nonnegative regularization parameter values for lasso. For the default setting of lambda, regularize calculates the smallest value lambda_max for which all optimal weights for learners are 0. The default value of lambda is a vector that includes 0 and nine exponentially spaced numbers from lambda_max/1000 to lambda_max.

Default: [0 logspace(log10(lambda_max/1000),log10(lambda_max),9)]
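For instance, to evaluate a custom, coarser grid of penalties (the values below are purely illustrative):

ens1 = regularize(ens,'lambda',[0 0.01 0.1 1]);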

'MaxIter'

Maximum number of iterations allowed, specified as a positive integer. If the algorithm executes MaxIter iterations before reaching the convergence tolerance, then the function stops iterating and returns a warning message. The function can return more than one warning when either npass or the number of lambda values is greater than 1.

Default: 1e3

'npass'

Maximum number of passes for lasso optimization, a positive integer.

Default: 10

'reltol'

Relative tolerance on the regularized loss for lasso, a positive numeric scalar.

Default: 1e-3

'verbose'

Verbosity level, either 0 or 1. When set to 1, regularize displays more information as it runs.

Default: 0

Output Arguments

ens1

A regression ensemble. Usually you set ens1 to the same name as ens.

Examples


Regularize an ensemble of bagged trees.

Generate sample data.

rng(10,'twister') % For reproducibility
X = rand(2000,20);
Y = repmat(-1,2000,1);
Y(sum(X(:,1:5),2)>2.5) = 1;

Regularize an ensemble of bagged regression trees.

bag = fitrensemble(X,Y,'Method','Bag','NumLearningCycles',300);
bag = regularize(bag,'lambda',[0.001 0.1],'verbose',1);
Starting lasso minimization for Lambda=0.001. Initial MSE=0.110607.
    Lasso minimization completed pass 1 for Lambda=0.001
        MSE = 0.0899652
        Relative change in MSE = 0.229447
        Number of learners with non-zero weights = 12
    Lasso minimization completed pass 2 for Lambda=0.001
        MSE = 0.064488
        Relative change in MSE = 0.39507
        Number of learners with non-zero weights = 43
    Lasso minimization completed pass 3 for Lambda=0.001
        MSE = 0.0608422
        Relative change in MSE = 0.0599211
        Number of learners with non-zero weights = 64
    Lasso minimization completed pass 4 for Lambda=0.001
        MSE = 0.0600689
        Relative change in MSE = 0.0128732
        Number of learners with non-zero weights = 82
    Lasso minimization completed pass 5 for Lambda=0.001
        MSE = 0.0599416
        Relative change in MSE = 0.00212391
        Number of learners with non-zero weights = 95
    Lasso minimization completed pass 6 for Lambda=0.001
        MSE = 0.0599377
        Relative change in MSE = 6.56533e-05
        Number of learners with non-zero weights = 108
    Lasso minimization completed pass 7 for Lambda=0.001
        MSE = 0.0599377
        Relative change in MSE = 5.37559e-07
        Number of learners with non-zero weights = 109
    Lasso minimization completed pass 8 for Lambda=0.001
        MSE = 0.0599376
        Relative change in MSE = 4.5759e-07
        Number of learners with non-zero weights = 108
    Completed lasso minimization for Lambda=0.001.
    Resubstitution MSE changed from 0.110607 to 0.0599376.
    Number of learners reduced from 300 to 108.
Starting lasso minimization for Lambda=0.1. Initial MSE=0.110607.
    Lasso minimization completed pass 1 for Lambda=0.1
        MSE = 0.113013
        Relative change in MSE = 0.0212885
        Number of learners with non-zero weights = 10
    Lasso minimization completed pass 2 for Lambda=0.1
        MSE = 0.086583
        Relative change in MSE = 0.30526
        Number of learners with non-zero weights = 27
    Lasso minimization completed pass 3 for Lambda=0.1
        MSE = 0.080426
        Relative change in MSE = 0.0765551
        Number of learners with non-zero weights = 42
    Lasso minimization completed pass 4 for Lambda=0.1
        MSE = 0.0795375
        Relative change in MSE = 0.0111715
        Number of learners with non-zero weights = 57
    Lasso minimization completed pass 5 for Lambda=0.1
        MSE = 0.0792384
        Relative change in MSE = 0.00377379
        Number of learners with non-zero weights = 67
    Lasso minimization completed pass 6 for Lambda=0.1
        MSE = 0.0786909
        Relative change in MSE = 0.00695864
        Number of learners with non-zero weights = 75
    Lasso minimization completed pass 7 for Lambda=0.1
        MSE = 0.0787917
        Relative change in MSE = 0.00128042
        Number of learners with non-zero weights = 77
    Lasso minimization completed pass 8 for Lambda=0.1
        MSE = 0.0788015
        Relative change in MSE = 0.000123345
        Number of learners with non-zero weights = 87
    Lasso minimization completed pass 9 for Lambda=0.1
        MSE = 0.0788032
        Relative change in MSE = 2.1627e-05
        Number of learners with non-zero weights = 87
    Completed lasso minimization for Lambda=0.1.
    Resubstitution MSE changed from 0.110607 to 0.0788032.
    Number of learners reduced from 300 to 87.

regularize reports on its progress.

Inspect the resulting regularization structure.

bag.Regularization
ans = struct with fields:
               Method: 'Lasso'
       TrainedWeights: [300x2 double]
               Lambda: [1.0000e-03 0.1000]
    ResubstitutionMSE: [0.0599 0.0788]
       CombineWeights: @classreg.learning.combiner.WeightedSum

Check how many learners in the regularized ensemble have positive weights. These are the learners included in a shrunken ensemble.

sum(bag.Regularization.TrainedWeights > 0)
ans = 

   108    87

Shrink the ensemble using the weights from Lambda = 0.1.

cmp = shrink(bag,'weightcolumn',2)
cmp = 
  classreg.learning.regr.CompactRegressionEnsemble
             ResponseName: 'Y'
    CategoricalPredictors: []
        ResponseTransform: 'none'
               NumTrained: 87



The compact ensemble contains 87 members, fewer than one-third of the original 300.
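To check that the shrunken ensemble predicts about as well as the full one, you can compare their mean squared errors on the training data (a sketch; training-set error is optimistic, so use a separate test set or cross-validation for an honest estimate):

mseFull = resubLoss(bag);   % MSE of the full 300-learner ensemble on the training data
mseCmp  = loss(cmp,X,Y);    % MSE of the 87-learner compact ensemble
[mseFull mseCmp]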
