Fit GLM with quadratic penalty

version 1.4 (14 KB) by Patrick Mineault

Fits a GLM with a quadratic penalty; determines hyperparameters through cross-validation or evidence maximization


Generalized linear models (GLMs) are a natural extension of linear regression models in which the linear predictor eta = X*w is related to y through a fixed nonlinearity and a possibly non-Gaussian noise distribution. Standard linear regression, logistic regression and Poisson regression are all special cases of GLMs.
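
For concreteness, the fixed nonlinearity (inverse link) in each of these special cases maps the linear predictor to the mean of y; the variable names below are just illustrative:

%Inverse links mapping eta = X*w to the mean of y
invlink_gaussian = @(eta) eta;                %linear regression (identity link)
invlink_logit    = @(eta) 1./(1 + exp(-eta)); %logistic regression (logit link)
invlink_poisson  = @(eta) exp(eta);           %Poisson regression (log link)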

This package fits GLMs with quadratic penalties. That is, if the negative log likelihood of the data with respect to the model parameters is given by -log(p(y|w)), then glmfitqp solves the problem:

min_w (-log(p(y|w)) + .5*w'*qf*w)
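
For example, with the logistic ('binomlogit') family and binary y, the penalized objective can be written out directly. The sketch below is for illustration only and assumes X (design matrix), y (binary response) and qf (penalty matrix) already exist:

%Penalized negative log likelihood for logistic regression, y in {0,1}
negloglik = @(w) sum(log(1 + exp(X*w)) - y.*(X*w)); %-log(p(y|w))
objective = @(w) negloglik(w) + .5*(w'*qf*w);       %the quantity glmfitqp minimizes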

This form of penalty arises naturally by assuming a Gaussian prior on w, p(w) = N(0, qf^-1). Quadratic penalties can be used to impose that the weights are small (qf = lambda*I) or that the weights are smooth (qf = lambda*D, where D is a smoothness penalty matrix such as the one returned by qfsmooth1D).
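
Both kinds of penalty can be built in a couple of lines (a sketch of the idea; the package's qfsmooth1D constructs a smoothness penalty for you):

%Two common quadratic penalties for d weights
d = 16;
qf_ridge  = eye(d);  %weights are small: w'*qf_ridge*w = sum(w.^2)
D = diff(eye(d));    %first-difference operator, (d-1) x d
qf_smooth = D'*D;    %weights are smooth: w'*qf_smooth*w = sum(diff(w).^2)
%Either penalty is then scaled by lambda, e.g. qf = lambda*qf_smooth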

In general, qf is only known up to a multiplicative constant lambda that sets the strength of the regularization and must be determined empirically. The function cvglmfitqp finds this optimal lambda through k-fold cross-validation. The cross-validation can be parallelized through parfor (requires the Parallel Computing Toolbox).
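
Conceptually, this amounts to the following; the sketch below is only an illustration of the idea, not cvglmfitqp's actual implementation (search strategy, warm starts and parfor parallelization differ), and penalizedFit/testLogLik are hypothetical helpers standing in for a penalized GLM fit and a held-out log likelihood:

%Sketch of k-fold selection of lambda; X, y and qf are assumed to exist
lambdas = 10.^(-2:.5:3);
foldId  = mod(randperm(numel(y)), 5) + 1; %assign each observation to one of 5 folds
cvLL    = zeros(size(lambdas));
for i = 1:numel(lambdas)
    for k = 1:5
        test  = (foldId == k);
        train = ~test;
        w = penalizedFit(y(train), X(train,:), lambdas(i)*qf); %hypothetical fit on training folds
        cvLL(i) = cvLL(i) + testLogLik(y(test), X(test,:), w); %hypothetical held-out log likelihood
    end
end
[maxLL, best] = max(cvLL);
lambdaOpt = lambdas(best);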

It is also possible to consider a more general prior of the form:

-log p(w) = .5*w'*(qf0 + sum_i lambda(i)*qfs(:,:,i))*w + const

In this case evidenceglmfitqp can be used to determine the optimal set of lambdas through evidence (marginal likelihood) maximization.
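
A standard way to evaluate the evidence for a given set of lambdas is a Laplace approximation around the penalized fit. The sketch below is illustrative only (it is not a description of evidenceglmfitqp's internals) and assumes wMap (the penalized fit), H (the Hessian of -log p(y|w) at wMap), negloglik, qf0, qfs and lambda already exist:

%Laplace approximation to the log evidence log p(y|lambda)
Q = qf0;
for i = 1:numel(lambda)
    Q = Q + lambda(i)*qfs(:,:,i); %prior precision
end
logEvidence = -negloglik(wMap) ...      %data fit at the penalized estimate
    - .5*(wMap'*Q*wMap) ...             %prior term
    + sum(log(diag(chol(Q)))) ...       %+0.5*log det(Q)
    - sum(log(diag(chol(H + Q))));      %-0.5*log det(H + Q)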

Example use:

---

%%
%Figure out optimal strength of prior through cross validation
%Assume smoothness of the model parameters
qf = blkdiag(qfsmooth1D(16),.01);
rg = (-7.5:7.5)';

%Simulate a model with w = Gabor function
w = exp(-rg.^2/3^2).*sin(rg*2*pi/6);
nobs = 150;
X = [randn(nobs,length(w)),ones(nobs,1)];
r = 3*X*[w;.01];

%output is binary -> logistic regression
r = binornd(1,1./(1+exp(-r)));

%Set up 5-fold CV
folds = getcvfolds(length(r),5,1001);

%Fit the data
clear opts
opts.family = 'binomlogit';
opts.lambda0 = 1;
results = cvglmfitqp(r,X,qf,folds,opts);

plot(results.w(1:end-1))
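
To turn the returned fit into predictions (see also the comment thread below), form the linear predictor X*results.w and pass it through the inverse link; for this logistic example that gives predicted probabilities:

%Predicted probabilities for the logistic example above
eta  = X*results.w;         %linear predictor (results.w includes the bias weight)
yhat = 1./(1 + exp(-eta));  %inverse logit link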

Comments and Ratings (7)

Patrick Mineault

Either Generalized Linear Models by McCullagh and Nelder or Generalized Additive Models: An Introduction with R by Simon Wood.

Marc

Hi Patrick, this is a great package!

Are you able to suggest a couple of references that are the basis for this penalized GLM approach and for your particular implementation?

Thanks.

Marc

Patrick Mineault

Sure, if you use X*results.w this will give you eta, the linear predictor. Then pass this through your inverse link, say the exponential if you use the Poisson distribution. That's your prediction.

Jeff

Hi, could you explain how to get a predicted Y variable from the fit struct that is returned? I went through the various objects in the struct and it was not clear at all how to just get the model output. I am particularly interested in the case with real valued inputs for the X variable. Thanks, and I hope to be able to use this.

Jakob Voigts

Not directly related to this toolbox, but might help others with the same issue:

Got "??? Reference to non-existent field 'Preconditioner'." errors when using Poisson distributions. The problem seems to be in the way optimget handles default values for non-specified parameters.
Fixed it for now by putting opts.ActiveConstrTol = []; and opts.Preconditioner = []; in the irls() function.

Updates

1.4

Support for evidence-based optimization of hyperparameters

1.3

Support for Hessian-based optimization, weighted data points, enhanced cross-validation proposals

1.2

parfor supported during cross-validation, better scheme for determining initial lambda

1.1

Bug fixes and more sophisticated cvglmfitqp

MATLAB Release
MATLAB 7.8 (R2009a)
