Flexible Bayesian penalized regression modelling

version 1.9.1.0 (105 KB) by Statovic
Bayesian lasso, horseshoe and horseshoe+ linear, logistic regression and count regression

Updated 30 Nov 2020

This is a comprehensive, user-friendly toolbox implementing state-of-the-art Bayesian linear, logistic and count regression. The toolbox provides highly efficient and numerically stable implementations of ridge, lasso, horseshoe, horseshoe+, log-t and g-prior regression. The lasso, horseshoe, horseshoe+ and log-t priors are recommended for data sets in which the number of predictors is greater than the sample size; the log-t prior additionally adapts to unknown levels of sparsity. The toolbox allows predictors to be assigned to logical groupings, which may overlap so that a predictor can belong to multiple groups. This can be used to exploit a priori knowledge about how predictors relate to each other (for example, grouping genetic data into genes and collections of genes such as pathways).
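For example, a typical call looks something like the following sketch (the option names shown here are indicative; see "help bayesreg" for the authoritative argument list):

% Sketch: horseshoe ('hs') regression on a p > n problem.
X = randn(50, 100);                        % n = 50 observations, p = 100 predictors
y = X(:, 1:5) * ones(5, 1) + randn(50, 1); % only the first 5 predictors matter
[beta, beta0, retval] = bayesreg(X, y, 'gaussian', 'hs', 'nsamples', 1e4);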

Count regression is now supported through implementation of Poisson and geometric regression models. To support analysis of data with outliers, we provide two heavy-tailed error models in our implementation of Bayesian linear regression: Laplace and Student-t distribution errors. Most features are straightforward to use and the toolbox can work directly with MATLAB tables (including automatically handling categorical variables), or you can use standard MATLAB matrices.
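For instance, the count and heavy-tailed error models are selected through the model argument, along these lines (a sketch; confirm the exact model names via "help bayesreg"):

% Sketch: Poisson count regression with a horseshoe prior.
[beta, beta0, retval] = bayesreg(X, y, 'poisson', 'hs');

% Sketch: linear regression with Student-t errors for outlier-prone data.
[beta, beta0, retval] = bayesreg(X, y, 't', 'hs');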

The toolbox is very efficient and can be used with high-dimensional data. Please see the scripts in the "examples\" directory for demonstrations of how to use the toolbox, or type "help bayesreg" within MATLAB. An R version of this toolbox is available on CRAN; to install the R package, type "install.packages("bayesreg")" within R.

To cite this toolbox:
Makalic E. & Schmidt, D. F.
High-Dimensional Bayesian Regularised Regression with the BayesReg Package
arXiv:1611.06649 [stat.CO], 2016

UPDATE VERSION 1.9.1 (30/11/2020):
Latest updates:
- Fixed count regression for the MATLAB R2020a and R2020b releases.

PLEASE NOTE:
The package now handles logistic regression without the need for MEX files, but large speed-ups can be obtained with compiled code, so using it is recommended. To compile the C++ code, run compile.m from the bayesreg directory within MATLAB; compilation requires Microsoft Visual Studio Professional or the GNU g++ compiler. Alternatively, for convenience, pre-compiled MEX files (MATLAB R2017a) for Windows, Linux and macOS can be downloaded from the following URL:

http://www.emakalic.org/blog/

To use these, simply download them and unzip them into the "bayesreg" folder.

Cite As

Enes Makalic and Daniel F. Schmidt (2016). High-Dimensional Bayesian Regularised Regression with the BayesReg Package, arXiv:1611.06649 [stat.CO]

Daniel F. Schmidt and Enes Makalic (2020). Log-Scale Shrinkage Priors and Adaptive Bayesian Global-Local Shrinkage Estimation, arXiv:1801.02321 [math.ST]

Daniel F. Schmidt and Enes Makalic (2019). Bayesian Generalized Horseshoe Estimation of Generalized Linear Models. ECML PKDD 2019: Machine Learning and Knowledge Discovery in Databases. pp 598-613

Comments and Ratings (16)

SangSup Cho

Very useful file for my teaching.

Statovic

Hi SangSup,

After running br_example3, the posterior samples are stored in the matrix "beta". You can produce the box-and-whisker plots by running, for example:

boxplot(beta');

Additionally, to get the conditional posterior PDFs, you could try something like:

tiledlayout(2, 4);
for i = 1:8
    nexttile;
    ksdensity(beta(i,:)');
    grid on;
    xlabel(['\beta_', num2str(i)]);
end

I hope this helps.

SangSup Cho

This file is a great piece of work. Would you show a boxplot of the posterior samples of the regression coefficients and the estimated conditional posterior probability density functions in example 3?

Wolfgang

Follow-up to the previous comment: I have changed the sampler in bayesreg for Poisson models. On line 272, I changed
MH = true;
to
SMN = true;
I hope this does the job. It seems to be right as far as I can judge from the output.
Best, Wolfgang

Wolfgang

Hi, this is a great tool. However, when using bayesreg with a Poisson model, I receive the following error:

Error using glmfit (line 187)
Binomial response variable must be a vector or a matrix with two columns of non-negative
integers.

Error in mh_Tune (line 91)
tune.b_tune = glmfit(tune.delta_window(1:tune.W_burnin), [tune.m_window(1:tune.W_burnin),
tune.n_window(1:tune.W_burnin)], 'binomial');

Error in bayesreg (line 1128)
mh_tuning = mh_Tune(mh_tuning);

I wonder what I am doing wrong here (using MATLAB R2020b).
Best regards, Wolfgang

Shida Gao

Xia Qi

Daniela Marchettini

Yashvir Singh Grewal

Yang Wang

Fantastic job. Thanks guys!

Mike Wong

Steven

Great program. Very useful for my work.

Statovic

Hi Gary,

We have just finished and uploaded Version 1.3 of the software. It supports MATLAB tables, handles categorical variables appropriately, and has a prediction function that produces predictions and prediction credible intervals and calculates prediction performance statistics. I hope you find it useful.
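Roughly, the new workflow will look something like this (the exact function name and arguments may differ slightly in the release, so please check the included example scripts):

% Sketch: fit on training data, then predict for new data.
[beta, beta0, retval] = bayesreg(Xtrain, ytrain, 'gaussian', 'hs');
[pred, predstats] = br_predict(Xtest, beta, beta0, retval);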

Cheers,
Daniel

GARY CHAO

Hi Daniel,
Thanks so much for your answer!
I have implemented the toolbox in MATLAB and found that the results correspond well with traditional ridge/lasso regularization results when p < n, i.e., when the number of predictors is smaller than the number of observations.
Hope the new version comes soon!
Thanks and best regards,
Gary

Statovic

Hi Gary.

The fully Bayesian approach used in this tool selects the regularisation parameters automatically by including them in the Bayesian hierarchy and sampling them along with the model parameters. The current version implements a half-Cauchy prior on the overall regularisation parameter, in accordance with suggestions from Polson and others.

The "best" posterior regression coefficients, in terms of squared-prediction error, are given by retval.muB. We are just finishing a version which provides a "predict" function to compute predictions onto new data (or the training data, if you want) and calculates prediction performance statistics. It also allows you to predict using the full Bayesian predictive posterior distribution, accepts Matlab tables and handles categorical variables. Hopefully this will be released in the next few days.

Cheers and thanks for your interest,
Daniel

GARY CHAO

Hi, thanks for your contribution to the Bayesian regularization problem!
I have a question about this toolbox:
When using the ridge/lasso regularization methods, we often choose the regularization coefficient by cross-validation to obtain the best predictions; is there a similar process in this toolbox? For example, how can we set the values of the error distributions in this toolbox to obtain different prior distributions? I ask because I found that the results from this toolbox differ from those obtained directly with the ridge/lasso functions in MATLAB.
By the way, how can we obtain the "best" posterior regression coefficients? Can we regard the results in retval.muB as the "best" results?
Thanks in advance for your help!

MATLAB Release Compatibility
Created with R2016a
Compatible with any release
Platform Compatibility
Windows macOS Linux
