Optimal (linear) combination of several binary probabilistic classifiers

Hello everyone,
Based on an EEG training set, I have trained several binary probabilistic classifiers (logistic regression) that each predict, on an independent EEG test set, whether a person is currently thinking about a movement (class 1) or not (class 2). All classifiers work above chance level, but none of them is close to perfect. I am now wondering whether it is possible to optimally combine my classifiers' outputs so that I obtain a single probability value per trial that is more reliable than any individual classifier output. My intuition is that the solution has something to do with "ensemble methods" (https://en.wikipedia.org/wiki/Ensemble_learning); however, I am a bit overwhelmed by all the available methods and algorithms (and unfortunately my knowledge about machine learning is quite limited).
Could anyone give me some advice on which method would be most suitable for my purpose (an optimal linear combination of different binary probabilistic classifiers) and how I can easily implement it? I am wondering whether a simple linear regression model would already do the job for me (although that would probably not be the best solution, right?).
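For what it's worth, what I have in mind corresponds to what the ensemble literature calls "stacking": feed the per-trial probabilities of the base classifiers into a second-level (meta) logistic regression, which learns the linear combination weights. Below is a minimal sketch with synthetic placeholder data (all variable names and the gradient-descent fit are my own illustration, not a specific library's API); in a real analysis the meta-model would have to be fit on held-out predictions (e.g. via cross-validation), not on the data the base classifiers were trained on.

```python
# Stacking sketch: learn a linear combination of several base classifiers'
# probability outputs with a logistic-regression meta-model.
# NOTE: the data below are synthetic placeholders for illustration only.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in: probabilities from 3 base classifiers for 200 trials,
# plus the true labels (1 = movement imagined, 0 = not).
n_trials, n_clf = 200, 3
y = rng.integers(0, 2, n_trials)
# Each base classifier is noisily correlated with the truth (above chance,
# far from perfect), with independent noise across classifiers.
P = np.clip(0.2 + 0.6 * y[:, None] + rng.normal(0, 0.2, (n_trials, n_clf)),
            0.01, 0.99)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_meta(P, y, lr=0.5, n_iter=2000):
    """Fit logistic regression on the base probabilities (plus intercept)."""
    X = np.column_stack([np.ones(len(P)), P])
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = X.T @ (sigmoid(X @ w) - y) / len(y)  # mean log-loss gradient
        w -= lr * grad
    return w

w = fit_meta(P, y)

# One fused probability per trial from the learned linear combination.
fused = sigmoid(w[0] + P @ w[1:])

acc_fused = np.mean((fused > 0.5) == y)
acc_best_single = max(np.mean((P[:, j] > 0.5) == y) for j in range(n_clf))
print(acc_fused, acc_best_single)
```

Because the classifiers' errors are independent in this toy setup, the fused probability is typically more accurate than the best single classifier; simple unweighted averaging of the probabilities would be the even simpler baseline to compare against.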
I would be very thankful for any help. Cheers

Answers (0)
