
4.6 | 13 ratings | 86 Downloads (last 30 days) | File Size: 4.07 KB | File ID: #27813 | Version: 1.5

Classic AdaBoost Classifier


Dirk-Jan Kroon


01 Jun 2010 (Updated 20 Jan 2012)

Weak threshold classifier boosted to a strong classifier with AdaBoost


File Information

This is a classic AdaBoost implementation in one single file, with easily understandable code.

The function consists of two parts: a simple weak classifier and a boosting part.
The weak classifier tries to find the best threshold in one of the data dimensions to separate the data into two classes, -1 and 1.
The boosting part calls the weak classifier iteratively; after every classification step it increases the weights of misclassified examples. This creates a cascade of "weak classifiers" which behaves like a "strong classifier".
Training mode:
Apply mode:
   datafeatures : an array of size number_samples x number_features
   dataclass : an array with the class of each example; the class
                 can be -1 or 1
   itt : the number of training iterations
   model : a struct with the cascade of weak classifiers
   estimateclass : the data as classified by the AdaBoost model
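The two parts described above can be sketched in a few lines. The following is an illustrative Python/NumPy version of the same idea (a decision-stump weak learner plus the reweighting loop), not a translation of the actual adaboost.m code; all function and variable names are hypothetical:

```python
import numpy as np

def train_stump(X, y, w):
    # Exhaustively search every (dimension, threshold, polarity) for the
    # decision stump with the lowest weighted error.
    best_err, best = np.inf, None
    for dim in range(X.shape[1]):
        for thr in np.unique(X[:, dim]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, dim] - thr) >= 0, 1, -1)
                err = w[pred != y].sum()
                if err < best_err:
                    best_err, best = err, (dim, thr, pol)
    return best_err, best

def stump_predict(X, stump):
    dim, thr, pol = stump
    return np.where(pol * (X[:, dim] - thr) >= 0, 1, -1)

def adaboost_train(X, y, itt):
    # Boosting loop: after every round, misclassified samples gain weight.
    n = len(y)
    w = np.full(n, 1.0 / n)
    model = []
    for _ in range(itt):
        err, stump = train_stump(X, y, w)
        err = max(err, 1e-10)                    # guard against division by zero
        alpha = 0.5 * np.log((1.0 - err) / err)  # classifier weight (natural log)
        pred = stump_predict(X, stump)
        w = w * np.exp(-alpha * y * pred)        # upweight misclassified samples
        w = w / w.sum()
        model.append((alpha, stump))
    return model

def adaboost_apply(X, model):
    # Weighted vote of the cascade of weak classifiers.
    score = sum(alpha * stump_predict(X, stump) for alpha, stump in model)
    return np.where(score >= 0, 1, -1)
```

The exhaustive stump search makes this O(samples^2 x features) per round; the real submission speeds this up with sorting and cumulative sums.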
Please leave a comment if you like the code, find a bug, or have a suggestion.

MATLAB release: MATLAB 7.10 (R2010a)
Comments and Ratings (26)
04 Feb 2017 anil hazarika


01 Nov 2016 fafz1203

01 Oct 2016 jayapriya p

Good work... can you provide us code for the integral image?

27 Oct 2015 Anuja Kelkar

Error using accumarray
First input SUBS must contain positive integer subscripts.

Error in adaboost>WeightedThresholdClassifier (line 125)
h1f=accumarray([p1f(:) i1(:)],repmat(w1(:),ndims,1),[ntre ndims],[],0);

Error in adaboost (line 50)
[estimateclass,err,h] =

Please help! I don't understand what's wrong!

16 Sep 2015 Furkan Gurpinar

21 Feb 2015 BUSI HANUSRI

Good work. Can you provide us code for an AdaBoost estimator in OFDM?
03 Dec 2014 ankit dixit


Nice work dude, keep it up :-)

10 Jul 2014 KimHan


In adaboost.m, there is a line with "...wrongly classified samples will have more weight", but this weight D is never used in the code.

17 Jun 2013 bodhitree chen

Hi Dirk-Jan Kroon,

In the file ADABOOST_tr.m, there is a line as follows:

% The weight of the turn-th weak classifier
adaboost_model.weights(turn) =

The weight should be changed to

adaboost_model.weights(turn) =

i.e. not base 10 logarithm but natural logarithm.
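For reference, the standard AdaBoost classifier weight does use the natural logarithm, alpha = 0.5*ln((1-err)/err); the base-10 version differs by a constant factor. A quick numeric check (the err value is hypothetical):

```python
import math

err = 0.2  # hypothetical weighted error of one weak classifier
alpha_ln = 0.5 * math.log((1.0 - err) / err)       # natural log (standard)
alpha_log10 = 0.5 * math.log10((1.0 - err) / err)  # base-10 variant
ratio = alpha_ln / alpha_log10                     # constant factor ln(10) ~ 2.303
```

Since the factor is constant across all rounds, the base-10 version rescales every classifier weight equally and the final sign-based prediction is unchanged; only the margins and weight updates differ.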

05 May 2013 Nikita


I get an OutOfMemory error when MATLAB reaches the cumsum() function in

adaboost>WeightedThresholdClassifier (line 126)

while I have 8 GB of RAM available and the training set contains only 6.5 million double entries. Is it possible to replace the cumsum() function with sum(), for example?
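For context on why cumsum() appears there: a weighted threshold search typically sorts the samples once and uses cumulative weight sums, so the weighted error of every candidate threshold falls out of a single pass; a plain sum() cannot replace that. A small illustrative sketch of the idea (variable names hypothetical, not the submission's code):

```python
import numpy as np

x = np.array([0.1, 0.4, 0.35, 0.8])  # one feature dimension
y = np.array([-1, -1, 1, 1])         # class labels
w = np.full(4, 0.25)                 # sample weights

order = np.argsort(x)
ys, ws = y[order], w[order]

# Candidate split k: predict -1 for the first k+1 sorted samples, +1 for
# the rest. Two cumsums give the weighted error of every split at once.
cum_pos = np.cumsum(ws * (ys == 1))      # +1 samples wrongly predicted -1
cum_neg = np.cumsum(ws * (ys == -1))     # -1 samples seen so far
err = cum_pos + (cum_neg[-1] - cum_neg)  # + remaining -1 samples predicted +1
best_k = int(np.argmin(err))
```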

27 Mar 2013 Zheng jingjing

Hello Dirk-Jan Kroon,
Can you please let me know how I can use SVM as a weak classifier? Can you tell me how to connect the AdaBoost algorithm with the SVM?

25 Feb 2013 lu li


Thank you! It's very helpful and instructive.

06 Nov 2012 mehregan

Hi Dirk-Jan Kroon, can you please help me?
This code is great,
but I need a paper explaining it because I do not understand some parts of it.
Thanks a lot

14 Aug 2012 Nooshin

Can you please let me know how I can use SVM as a weak classifier?

13 Aug 2012 Nooshin

Hi Dirk
Thanks for your code.
I had a question. For each subject I have a feature vector of 144 features. The class labels are 0 and 1. The number of subjects is 20. I used your code and the accuracy was very low, while when I used SVM the accuracy was high. What is the reason? Do you think too many features can affect the accuracy of AdaBoost? Please let me know. Thanks a lot

09 Jul 2012 Fa Fa


Hi Dirk-Jan Kroon, please can you help me.
I have 1000 face images and 1000 background images, and I have a histogram of each image. I also have a 512-entry lookup table from 000000000 to 111111111 as integer features. I created the features of the training images with census. Can I save a value for every lookup-table entry with AdaBoost, i.e. whether the value of the lookup table prefers face or non-face?

16 Feb 2012 Andreas

Hi Dirk,
Are there any requirements on the features (such as no NaNs, numerical scaling, or equal numbers in each class)? The internal variable p2c at line 117 has runs of large and NaN values, causing accumarray and the WeightedThresholdClassifier function to crash. Thanks - Andreas

10 Dec 2011 Muhammad Hamid

Can I use this code instead of Haar-feature-based AdaBoost cascade classifiers?
Actually, I am not understanding the concept behind AdaBoost, and it is compulsory for me to implement it in just 10 days. If I have to use AdaBoost cascade classifiers on the basis of the brake lights and taillights of vehicles on the roads in evening hours, then what will be the parameters of this function? Waiting for a reply... please help

02 Dec 2011 Su Dongcai


Neat code, very nice demonstration

10 Jul 2011 jinkyu do

22 Mar 2011 sudheesh p

I am not able to run the adaboost m-file; it shows an error at the statement switch (mode), and in the example m-file only the training pattern is visible.

13 Mar 2011 CarloG


17 Jan 2011 AMVR


Great contribution, Dirk-Jan.

This is just a suggestion, but have you thought about adding probability calibration to the output of the classifier? e.g. logistic correction, platt scaling or isotonic regression (see "Obtaining Calibrated Probabilities from Boosting", by Alexandru Niculescu-Mizil and Rich Caruana for more context).
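A minimal sketch of one such option, the logistic correction, which maps the boosted margin to a probability estimate (illustrative code under the assumption that the classifier exposes its real-valued score F(x); not part of this submission):

```python
import math

def logistic_correction(score):
    # Map an AdaBoost output margin F(x) to an estimate of P(y = 1 | x),
    # following the logistic link suggested for boosted models.
    return 1.0 / (1.0 + math.exp(-2.0 * score))
```

Platt scaling and isotonic regression instead fit the mapping from scores to probabilities on held-out data, which the cited paper found to work better for boosting.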

23 Nov 2010 h gd


Easy to understand

09 Oct 2010 David


Xu Cui code?

26 Sep 2010 Ruchir Srivastava

Hi Dirk-Jan Kroon,

Thank you for the nice comments making the code easily understandable.

Can I do feature selection using your code? I am new to AdaBoost, so I am not able to figure out how this can be done. Can you please help me?

01 Jun 2010 1.1

Changed Screenshot and example figure

30 Aug 2010 1.2

Solved division by zero, causing NaN

29 Dec 2010 1.3

Fixed bug: changed ndims(datafeatures) to size(datafeatures,2)

07 Oct 2011 1.4

Speed improvement (Replaced loops by 1D indexing and bsxfun operations.)
The function now limits features of the test data to the outer-boundaries of training data.

20 Jan 2012 1.5

Fixed boundary bug
