Optimization and Machine Learning

Greg on 10 Dec 2014
If you're able to use a machine learning classification algorithm (such as a boosted ensemble learning technique for regression or classification) to create a model, is it then possible to optimize the predicted response around that model?
For instance, say I create a classification model from 1000 examples (rows) and 70 features (columns) to predict a binary classification response. It's simple to then manually create a hypothetical 1001st example and predict the class to which it will belong.
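For example (just a sketch; the trained model mdl and the values in the new row are hypothetical):

xNew = mean(X, 1);                    % hypothetical 1001st example, here just the column means of X
[label, score] = predict(mdl, xNew);  % predicted class and per-class scores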
I would like to be able to define and fix some of those 70 features (let's say 5) while allowing the others to vary. Is there a way to do this, and then let an optimization algorithm optimize the remaining 65 features, so that I get the combination of feature values that maximizes the likelihood of achieving a given classification?
On the surface, it seems like the Optimization Toolbox would provide this functionality, but I don't know if it's possible to use a machine learning model as the objective function in the Optimization Toolbox.
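Conceptually, something like the sketch below is what I have in mind. It is untested, and the model mdl, the fixed feature indices and values, the bounds, and the choice of ga (from the Global Optimization Toolbox) are all made up for illustration:

function xBest = optimizeFreeFeatures(mdl)
% Search over the free features while holding the fixed ones constant,
% maximizing the model's score for a chosen target class.
nFeatures = 70;
fixedIdx  = [3 17 29 48 62];            % the 5 features to hold constant (hypothetical)
fixedVals = [1.5 0 2.2 7 0.4];          % their fixed values (hypothetical)
freeIdx   = setdiff(1:nFeatures, fixedIdx);

targetCol = 2;                          % column of the score matrix for the desired class
objFcn = @(xFree) -classScore(mdl, xFree, freeIdx, fixedIdx, fixedVals, targetCol);

lb = zeros(1, numel(freeIdx));          % made-up bounds on the free features
ub = 10*ones(1, numel(freeIdx));
xBest = ga(objFcn, numel(freeIdx), [], [], [], [], lb, ub);
end

function s = classScore(mdl, xFree, freeIdx, fixedIdx, fixedVals, targetCol)
% Assemble a full 70-element row from the free and fixed parts and
% return the model's score for the target class.
x = zeros(1, numel(freeIdx) + numel(fixedIdx));
x(freeIdx)  = xFree;                    % optimizer-controlled features
x(fixedIdx) = fixedVals;                % user-fixed features
[~, scores] = predict(mdl, x);          % one score column per class
s = scores(:, targetCol);
end

The point is just that the objective function wraps predict, so the solver only ever sees a function of the 65 free features. I picked ga rather than a gradient-based solver like fmincon because a tree-ensemble score is not smooth, but whether that is the right tool is part of my question.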
Thanks.

Answers (1)

Sean de Wolski on 10 Dec 2014
It sounds like sequential feature selection is what you're looking for:
doc sequentialfs
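A minimal call might look like this (untested sketch; the data X and y, the numeric class labels, and the boosted tree ensemble inside the criterion are assumptions):

% Criterion: misclassification count of an ensemble fit on the candidate feature subset
critFcn = @(Xtrain, ytrain, Xtest, ytest) ...
    sum(ytest ~= predict(fitensemble(Xtrain, ytrain, 'LogitBoost', 50, 'Tree'), Xtest));
opts = statset('Display', 'iter');
selected = sequentialfs(critFcn, X, y, 'cv', 5, 'options', opts);   % logical mask of selected features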
  1 Comment
Greg on 10 Dec 2014
I don't think so. Sequential feature selection looks like a way to minimize the number of variables required to achieve a model's best predictive performance. I'm looking for a way to substitute feature values back into a trained model and find the combination of feature values that yields a target response.
