Code covered by the BSD License  


File ID: #27813
Classic AdaBoost Classifier


01 Jun 2010 (Updated )

Weak threshold classifier boosted to a strong classifier with AdaBoost


File Information
Description

This is a classic AdaBoost implementation in a single file, with easily understandable code.

The function consists of two parts: a simple weak classifier and a boosting part.
The weak classifier tries to find the best threshold in one of the data dimensions to separate the data into two classes, -1 and 1.
The boosting part calls the weak classifier iteratively; after every classification step it increases the weights of misclassified examples. This creates a cascade of "weak classifiers" that together behaves like a "strong classifier".
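The reweighting step described above can be sketched as follows. This is an illustrative sketch, not the file's actual code; the variable names (D, y, h, alpha) are assumptions for the example.

```matlab
y = [ 1; -1;  1;  1];                % true labels (-1 or 1)
h = [ 1;  1; -1;  1];                % weak-classifier predictions
D = ones(4,1) / 4;                   % current sample weights (sum to 1)

wrong = (h ~= y);                    % logical mask of misclassified samples
err   = sum(D(wrong));               % weighted error of the weak classifier
alpha = 0.5 * log((1 - err) / err);  % weight of this weak classifier
D     = D .* exp(-alpha .* y .* h);  % misclassified samples gain weight
D     = D / sum(D);                  % renormalize weights to a distribution
```

The final strong classifier is then the sign of the alpha-weighted sum of the weak classifiers' outputs.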
Training mode:
  [estimateclass,model]=adaboost('train',datafeatures,dataclass,itt)
Apply mode:
 estimateclass=adaboost('apply',datafeatures,model)
 
inputs/outputs:
   datafeatures : An array of size number_samples x number_features
   dataclass : An array with the class of all examples; the class
                 can be -1 or 1
   itt : The number of training iterations
   model : A struct with the cascade of weak classifiers
   estimateclass : The data as classified by the AdaBoost model
Please leave a comment if you like the code, find a bug, or have a suggestion.
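Putting the two modes together, a minimal end-to-end sketch might look like this (the synthetic data and all variable values are illustrative):

```matlab
% Two Gaussian clouds as a toy two-class problem
datafeatures = [randn(50,2)+1; randn(50,2)-1];  % 100 samples x 2 features
dataclass    = [ones(50,1); -ones(50,1)];       % labels must be -1 or 1
itt          = 10;                              % number of boosting iterations

% Training mode: returns the classified training data and the model struct
[estimateclass, model] = adaboost('train', datafeatures, dataclass, itt);
trainaccuracy = mean(estimateclass == dataclass);

% Apply mode: classify previously unseen data with the trained model
testclass = adaboost('apply', randn(10,2)+1, model);
```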

MATLAB release MATLAB 7.10 (R2010a)
Comments and Ratings (19)
10 Jul 2014 KimHan

In adaboost.m, there is a line saying "...wrongly classified samples will have more weight", but this weight D is never used in the code.

17 Jun 2013 bodhitree chen

Hi Dirk-Jan Kroon,

In the file ADABOOST_tr.m, there is a line as follows:

% The weight of the turn-th weak classifier
adaboost_model.weights(turn) = log10((1-error_rate)/error_rate);

The weight should be changed to

adaboost_model.weights(turn) = log((1-error_rate)/error_rate);

i.e. not the base-10 logarithm but the natural logarithm.
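For context, the standard discrete AdaBoost classifier weight does use the natural logarithm, commonly written with a factor of 1/2 (the factor only rescales all weights uniformly, so some presentations omit it):

```latex
\alpha_t = \frac{1}{2}\,\ln\!\left(\frac{1-\epsilon_t}{\epsilon_t}\right)
```

where epsilon_t is the weighted error rate of the t-th weak classifier.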

05 May 2013 Nikita

I get an OutOfMemory error, when Matlab is passing cumsum() function in

adaboost>WeightedThresholdClassifier (line 126)

while I have 8 GB of RAM available and the training set contains only 6.5 million entries of type double. Is it possible to replace the cumsum() function with sum(), for example?

27 Mar 2013 Zheng jingjing

Hello Dirk-Jan Kroon,
Can you please let me know how I can use an SVM as the weak classifier? Can you tell me how to connect the AdaBoost algorithm with the SVM?
Thanks

25 Feb 2013 lu li

Thank you! It's very helpful and instructive.

06 Nov 2012 mehregan

Hi Dirk-Jan Kroon, please can you help me
THIS CODE is great
and I need a paper explaining this code, because I do not understand some parts of it.
Thanks a lot
mehrzad

14 Aug 2012 Nooshin

Can you please let me know how I can use an SVM as the weak classifier?
Thanks

13 Aug 2012 Nooshin

Hi Dirk
Thanks for your code.
I had a question. For each subject I have a feature vector of 144 features. Class labels are 0 and 1. The number of subjects is 20. I used your code and the accuracy was very low, while when I used an SVM the accuracy was high. What is the reason? Do you think too many features can affect the accuracy of AdaBoost? Please let me know. Thanks a lot.

09 Jul 2012 Fa Fa

Hi Dirk-Jan Kroon, please can you help me,
I have 1000 face images and 1000 background images, and a histogram of each image. I also have a 512-entry lookup table, from 000000000 to 111111111, of integer features. I created the features of the training images with the census transform. How can I save a value for every lookup-table entry with AdaBoost, indicating whether that entry prefers face or non-face?
thanks

16 Feb 2012 Andreas

Hi Dirk,
Are there any requirements on the features (such as no NaNs? numerical scaling? equal numbers in each class?). The internal variable p2c at line 117 has runs of large and NaN values, causing accumarray and the WeightedThresholdClassifier function to crash. Thanks - Andreas

10 Dec 2011 Muhammad Hamid

Hi!
Can I use this code instead of Haar-feature-based AdaBoost cascade classifiers?
Actually, I am not understanding the concept behind AdaBoost, and it is compulsory for me to implement it in just 10 days. If I have to use AdaBoost cascade classifiers on the basis of the brake lights and taillights of vehicles on the roads in evening hours, what will be the parameters of this function? Waiting for a reply... please help.

02 Dec 2011 Su Dongcai

Neat code, very nice demonstration.

10 Jul 2011 jinkyu do  
22 Mar 2011 sudheesh p

I am not able to run the adaboost m-file; it shows an error at the statement switch(mode), complaining about mode, and in the example m-file only the training pattern is visible.

13 Mar 2011 CarloG  
17 Jan 2011 AMVR

Great contribution, Dirk-Jan.

This is just a suggestion, but have you thought about adding probability calibration to the output of the classifier? e.g. logistic correction, platt scaling or isotonic regression (see "Obtaining Calibrated Probabilities from Boosting", by Alexandru Niculescu-Mizil and Rich Caruana for more context).

23 Nov 2010 h gd

Easy to understand

09 Oct 2010 David

Xu Cui code?

26 Sep 2010 Ruchir Srivastava

Hi Dirk-Jan Kroon,

Thank you for the nice comments making the code easily understandable.

Can I do feature selection using your code? I am new to AdaBoost, so I am not able to figure out how this can be done. Can you please help me?

Updates
01 Jun 2010

Changed Screenshot and example figure

30 Aug 2010

Solved a division by zero that caused NaN values

29 Dec 2010

Fixed bug: changed ndims(datafeatures) to size(datafeatures,2)

07 Oct 2011

Speed improvement (replaced loops with 1-D indexing and bsxfun operations).
The function now limits features of the test data to the outer boundaries of the training data.

20 Jan 2012

Fixed boundary bug
