File Exchange



version (6.73 KB) by Cuneyt Mertayak
AdaBoost: The meta machine learning algorithm formulated by Yoav Freund and Robert Schapire


Updated 03 Sep 2008

No License

AdaBoost (Adaptive Boosting) is a well-known meta machine learning algorithm proposed by Yoav Freund and Robert Schapire. In this project there are two main files:
1. ADABOOST_tr.m
2. ADABOOST_te.m
to train and test a user-coded learning (classification) algorithm with AdaBoost. A demo file (demo.m) is provided that demonstrates how these two files can be used with a classifier (a basic threshold classifier) for a two-class classification problem.
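A minimal sketch of how the two files fit together, assuming signatures of the form shown below (check demo.m for the exact argument lists; @threshold_tr and @threshold_te stand for the user-coded weak-learner training and testing functions):

```matlab
% Hedged sketch -- the argument lists here are assumptions; see demo.m
% for the real interface shipped with the package.
no_of_hypothesis = 20;   % number of boosting rounds

% Training: ADABOOST_tr repeatedly invokes the user-supplied weak learner
% on reweighted versions of the training set.
adaboost_model = ADABOOST_tr(@threshold_tr, @threshold_te, ...
                             train_set, train_labels, no_of_hypothesis);

% Testing: ADABOOST_te combines the stored weak hypotheses by weighted vote.
[L, hits] = ADABOOST_te(adaboost_model, @threshold_te, ...
                        test_set, test_labels);
```

The key design point is that the weak learner is passed in as a pair of function handles, so any classifier that can train on weighted data and return predictions can be plugged in.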

Cite As

Cuneyt Mertayak (2021). AdaBoost, MATLAB Central File Exchange. Retrieved .

Comments and Ratings (21)

jacky chen

When calculating the weight of the turn-th weak classifier:
adaboost_model.weights(turn) = log10((1-error_rate)/error_rate);
the factor 0.5 is missing; this seems better:
adaboost_model.weights(turn) = 0.5*log10((1-error_rate)/error_rate);
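Worth noting: the 0.5 factor (and the choice of log base) scales every classifier weight by the same constant, so for a two-class weighted vote the sign of the ensemble output, and hence the predicted labels, is unchanged; only the margins differ. A quick check with made-up predictions and error rates:

```matlab
% Effect of the 0.5 factor on a two-class weighted vote (toy numbers).
% h: predictions (+1/-1) of 3 weak classifiers on 4 samples.
h = [ 1  1 -1  1;
     -1  1  1  1;
      1 -1 -1  1];
error_rates = [0.2 0.3 0.4];
a1 = log10((1 - error_rates) ./ error_rates);        % weights as in the code
a2 = 0.5 * log10((1 - error_rates) ./ error_rates);  % with the 0.5 factor

pred1 = sign(a1 * h);   % ensemble vote under either weighting
pred2 = sign(a2 * h);
isequal(pred1, pred2)   % the predicted labels agree
```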

Jaroslaw Tuszynski

Works correctly but slowly


Could somebody explain how exactly the weights of the samples are used?
For example, for a training sample x1=[a1,..., an], how will x1 change after applying the initial weight of 1/m? Is x1 going to change to [a1/m,...,an/m]?
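To answer the question above: in AdaBoost the sample weights never modify the feature vectors, so x1 stays [a1,..., an]. The weights form a probability distribution over the training set: they enter the weighted training error of each weak learner (or the resampling probabilities) and are updated multiplicatively after each round. A self-contained sketch of one round, using a fixed threshold stump as a stand-in weak learner:

```matlab
% One AdaBoost round on a toy 1-D problem; note X is never rescaled.
X = [1 2 3 4 5 6]';            % features stay exactly as given
y = [1; 1; 1; -1; -1; -1];     % labels in {-1,+1}
m = numel(y);
w = ones(m, 1) / m;            % initial distribution D_1(i) = 1/m

% A fixed stump standing in for the weak learner (illustrative only):
pred = sign(4.5 - X);          % +1 below 4.5, -1 above; misclassifies X=4
err = sum(w .* (pred ~= y));   % weights enter the *weighted error*, not X
alpha = 0.5 * log((1 - err) / err);

% Multiplicative reweighting: misclassified samples gain weight.
w = w .* exp(-alpha * y .* pred);
w = w / sum(w);                % renormalise back to a distribution
% The one misclassified sample (X=4) now carries weight 0.5;
% the five correct ones carry 0.1 each.
```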


I'd like to see the algorithm





sani ars

I'm using this AdaBoost toolbox. I have to use an SVM as the weak base classifier, but I have no idea how to use it, or in what format I should write the SVM so that I can use it in the adaboost function. Is there anybody who can help me out? I just need to know in what format I would write the SVM for AdaBoost.


I am using your toolbox for boosting with an SVM as the weak classifier. But I found that class likelihoods are needed for your method, while all I have from the SVM output are the predicted labels. Is there any way to solve this?

Weak learners performing at less than 50% are not actually an issue. The problem comes when they perform at exactly or near 50%: it is then like forming a committee by rolling dice.

Xinzhu Wang



Mohammad Ali Bagheri

Actually, I want to know how it works with multi-class problems.


I will learn it.

mila amel

I couldn't see how these two files, adaboost_te.m and adaboost_tr.m, can be used.

Sidath Liyanage

Does anyone have a code for Adaboost M2 for multiclass classification with weaklearners performing less than 50%?


Is there anybody working on AdaBoost.R2 for regression?


I think you mean GML AdaBoost (the one from Russia), right?
Actually, I was trying to use it for a couple of days without any success.
I see... this is the case: the number of samples I use is less than the number of dimensions.
Could you let us know if you have any idea how to update the code to deal with this case?

Ghahramani Mohammad

I recommend the one from Russia, but that one has an error too. It works for data whose dimension is smaller than the number of samples; the code must be changed from length to size, because in some cases, such as face recognition, D is larger than N.

Vimal Vaghela

Still, the algorithm is not up to the mark. It's a really old one; there are much-improved versions out there.

Dimitri Shvorob

The formula does not look familiar; the reference, which you have to look for, does not either. Existence of several versions of the algorithm seems to be lost on the author. I would recommend Googling a well-documented 'Adaboost toolbox' by a guy from Moscow State University.

Amit Ganatra


MATLAB Release Compatibility
Created with R14SP1
Compatible with any release
Platform Compatibility
Windows macOS Linux
