Boosted Binary Regression Trees

version (5.97 KB) by Kota Hara
Boosted Binary Regression Trees is a powerful regression method which can handle vector targets.


Updated 12 Jul 2016

Boosted Binary Regression Trees (BBRT) is a powerful regression method proposed in [1]. BBRT combines binary regression trees [3] using a gradient boosting technique.
There are several variants proposed in [1]. In [1], it is assumed that the target is a scalar value. However, it is trivial to extend the method to vector target cases by proper modifications.
This code is based on "LS_Boost" described in [1] but it can also handle vector target cases. In other words, you do not need to train an independent regressor for each target dimension, unlike Support Vector Regression.
The details of the algorithm this code implements can be found in [2].
[1] J. H. Friedman. Greedy Function Approximation: a Gradient Boosting Machine. Annals of Statistics, 2001.
[2] Kota Hara and Rama Chellappa, Computationally Efficient Regression on a Dependency Graph for Human Pose Estimation, CVPR 2013.
[3] Breiman, Leo; Friedman, J. H., Olshen, R. A., & Stone, C. J. (1984). Classification and regression trees. Monterey, CA: Wadsworth & Brooks/Cole Advanced Books & Software. ISBN 978-0-412-04841-8.
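To illustrate the LS_Boost idea with vector targets, here is a minimal Python/NumPy sketch (not the author's MATLAB/C++ implementation; depth-1 trees are used for brevity, and all function names are made up). Each boosting round fits a tree to the current residuals, minimizing the squared error summed over all target dimensions, and adds a shrunken copy of its predictions to the ensemble:

```python
import numpy as np

def fit_stump(X, R):
    # Fit a depth-1 regression tree (stump) to vector-valued residuals R.
    # The split minimizes the total SSE summed over all target dimensions,
    # which is the vector-target extension described above.
    n, d = X.shape
    best = None
    for f in range(d):
        order = np.argsort(X[:, f])
        xs, rs = X[order, f], R[order]
        for i in range(1, n):
            if xs[i] == xs[i - 1]:
                continue
            left, right = rs[:i], rs[i:]
            sse = ((left - left.mean(0)) ** 2).sum() \
                + ((right - right.mean(0)) ** 2).sum()
            if best is None or sse < best[0]:
                best = (sse, f, (xs[i] + xs[i - 1]) / 2,
                        left.mean(0), right.mean(0))
    _, f, thr, lmean, rmean = best
    return f, thr, lmean, rmean

def boost(X, Y, n_rounds=50, lr=0.1):
    # LS_Boost: start from the target mean, then repeatedly fit a tree
    # to the residuals and add lr times its predictions.
    F = np.tile(Y.mean(0), (len(X), 1))
    trees = []
    for _ in range(n_rounds):
        f, thr, lmean, rmean = fit_stump(X, Y - F)
        F += lr * np.where(X[:, [f]] <= thr, lmean, rmean)
        trees.append((f, thr, lmean, rmean))
    return Y.mean(0), trees

def predict(model, X, lr=0.1):
    mean, trees = model
    F = np.tile(mean, (len(X), 1))
    for f, thr, lmean, rmean in trees:
        F += lr * np.where(X[:, [f]] <= thr, lmean, rmean)
    return F
```

Because all target dimensions share each tree's split, a single ensemble handles the whole target vector, which is the point of the comparison with Support Vector Regression above.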

Cite As

Kota Hara (2021). Boosted Binary Regression Trees, MATLAB Central File Exchange. Retrieved .

Comments and Ratings (14)

smoothie tn

Can you please give an example of an application of these functions (in one script)? Thanks

yuan shuaibi

Lai Wei

Hello, how can I get the importance of each element? Thank you


Duo Li

Hassan mohamed

Hello, I tried the code but it gives me only one number as an output, or the error message: Attempt to execute varargin.m:
c:\Program Files\MATLAB\.....\lang\varargin.m
Please give advice.

Tom Clark

Remove the include of tchar.h. Nothing depends on it, and it prevents compilation on non-Windows platforms.


I released the code for a new regression tree forest method described in "Growing Regression Forests by Classification: Applications to Object Pose Estimation. The European Conference on Computer Vision (ECCV), 2014." The method outperforms various regression methods such as traditional binary regression tree forests as well as Support Vector Regression and Kernel Partial Least Squares Regression. The code is available from here. It is written in MATLAB.


Hi, would you please add some data for running the code? Thanks

Kota Hara

Hi Hossein,

I think the implementation is correct, though it's not the most efficient. The update procedure in this code can be proven equivalent to the one proposed in "Updating Mean and Variance Estimates: An Improved Method." You can also find it on Wikipedia.

The update procedures in the above method are:

SSE_{N} = SSE_{N-1} + (x_N - \bar x_{N-1})(x_N - \bar x_N)

The second term on the RHS is

(x_N - \bar x_N + \bar x_N - \bar x_{N-1} ) ( x_N - \bar x_N ) = ( x_N - \bar x_N )^2 + (x_N - \bar x_N )(\bar x_N - \bar x_{N-1} )

The procedure in my code is
SSE_{N} = SSE_{N-1} + (\bar x_N - \bar x_{N-1})^2 * (N-1) + ( x_N - \bar x_N )^2

So now we want to show
(x_N - \bar x_N )(\bar x_N - \bar x_{N-1} ) = (\bar x_N - \bar x_{N-1})^2 * (N-1)

First divide both sides by (\bar x_N - \bar x_{N-1}). Now we need to show (\bar x_N - \bar x_{N-1})(N-1) = x_N - \bar x_N

We can show this using the update procedure for the mean (i.e., \bar x_N = \bar x_{N-1} + (x_N - \bar x_{N-1}) / N).

The proof for sseRight should be done similarly.

As I said, this is not as efficient as the original procedure, so I will consider modifying the code.
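The equivalence argued above is easy to check numerically. Here is a small standalone Python check (not part of the package) of the left-side update rule for a scalar target:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=200)

# Incremental update as in findBestSplit: when the N-th sample moves
# into the "left" set,
#   SSE_N = SSE_{N-1} + (mean_N - mean_{N-1})^2 * (N-1) + (x_N - mean_N)^2
sse, mean = 0.0, x[0]
for n in range(2, len(x) + 1):
    mean_prev = mean
    mean = mean_prev + (x[n - 1] - mean_prev) / n  # mean update rule
    sse += (mean - mean_prev) ** 2 * (n - 1) + (x[n - 1] - mean) ** 2

# Direct (batch) sum of squared errors over all samples.
direct = ((x - x.mean()) ** 2).sum()
print(abs(sse - direct))  # agrees up to floating-point error
```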



Hi Kota,

Are you sure that the implementation of findBestSplit is correct? I have doubts about the SSE update. I mean the lines 129 and 130

for( int k = 0; k < targetDim; k++ ){
    sseLeft  = sseLeft
             + ( aveLeft[k] - aveLeftPre[k] ) * ( aveLeft[k] - aveLeftPre[k] ) * ( sizeLeft - 1 )
             + ( target(sortByValue[j].idx,k) - aveLeft[k] ) * ( target(sortByValue[j].idx,k) - aveLeft[k] );
    sseRight = sseRight
             - ( aveRightPre[k] - aveRight[k] ) * ( aveRightPre[k] - aveRight[k] ) * sizeRight
             - ( target(sortByValue[j].idx,k) - aveRightPre[k] ) * ( target(sortByValue[j].idx,k) - aveRightPre[k] );
}

I understand the procedure, but I think the update rule is not correct.



I could compile on Unix by removing #include <tchar.h> and making two minor changes.


How can I compile the cpp file on Unix? Thank you.

Kota Hara

Please feel free to drop a comment if you have suggestions, find bugs, or anything else.

MATLAB Release Compatibility
Created with R2014a
Compatible with any release
Platform Compatibility
Windows macOS Linux
