This is a self-contained package for running feature selection filters: given a (usually large) number of noisy and partly redundant variables and a target, choose a small but informative subset as input to a classification or regression technique.
For background information, see e.g. Gavin Brown, 'A New Perspective for Information Theoretic Feature Selection', Artificial Intelligence and Statistics (AISTATS), 2009.
The Matlab function select_features.m includes several previously published methods as special cases, such as FOU, MRMR, MIFS-U, JMI, and CMIM. It supports higher-order interaction terms, forward and backward search, priors, several redundancy-weighting options, and pessimistic estimates.
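To illustrate the kind of filter these methods implement, here is a minimal Python sketch of MRMR-style greedy forward selection on discrete data: each candidate feature is scored by its relevance I(f; y) minus its mean redundancy with the features already chosen. This is not the package's Matlab code; the function names `mutual_information` and `mrmr` are hypothetical, and the real select_features.m offers many more options.

```python
import numpy as np

def mutual_information(x, y):
    """Mutual information (in nats) between two discrete 1-D arrays."""
    xs, x_idx = np.unique(x, return_inverse=True)
    ys, y_idx = np.unique(y, return_inverse=True)
    joint = np.zeros((len(xs), len(ys)))
    np.add.at(joint, (x_idx, y_idx), 1.0)      # contingency counts
    joint /= joint.sum()                       # joint distribution p(x, y)
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log(joint[nz] / (px @ py)[nz])).sum())

def mrmr(X, y, k):
    """Greedy forward selection: relevance minus mean redundancy (MRMR)."""
    n_features = X.shape[1]
    relevance = [mutual_information(X[:, j], y) for j in range(n_features)]
    selected = [int(np.argmax(relevance))]     # start with the most relevant feature
    while len(selected) < k:
        best, best_score = None, -np.inf
        for j in range(n_features):
            if j in selected:
                continue
            redundancy = np.mean([mutual_information(X[:, j], X[:, s])
                                  for s in selected])
            score = relevance[j] - redundancy
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
    return selected
```

For example, given a perfectly relevant feature, an irrelevant one, and an exact copy of the first, the redundancy penalty stops the copy from being selected second, even though its relevance is maximal.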
Auxiliary functions for discretization, for construction and marginalization of probability tables, and for conditional entropy, mutual information, and interaction information are included and can be used on their own. See demo_feature_select.m for examples.
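As a rough Python sketch of what such auxiliary routines compute (not the package's Matlab implementations; all function names here are hypothetical), the quantities can be built from joint entropies estimated on discretized data:

```python
import numpy as np

def discretize(x, n_bins=4):
    """Equal-width binning of a continuous variable into levels 0..n_bins-1."""
    edges = np.linspace(x.min(), x.max(), n_bins + 1)
    return np.clip(np.digitize(x, edges[1:-1]), 0, n_bins - 1)

def entropy(*variables):
    """Joint entropy H(X1, ..., Xn) in nats, from discrete 1-D arrays."""
    keys = np.stack(variables, axis=1)
    _, counts = np.unique(keys, axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log(p)).sum())

def conditional_entropy(x, y):
    """H(X | Y) = H(X, Y) - H(Y)."""
    return entropy(x, y) - entropy(y)

def interaction_information(x, y, z):
    """I(X; Y; Z) = I(X; Y | Z) - I(X; Y)  (McGill sign convention)."""
    i_xy = entropy(x) + entropy(y) - entropy(x, y)
    i_xy_given_z = entropy(x, z) + entropy(y, z) - entropy(x, y, z) - entropy(z)
    return i_xy_given_z - i_xy
```

For instance, for two independent uniform bits X, Y and Z = X XOR Y, the interaction information is positive (ln 2), since Z creates dependence between X and Y that neither pairwise term sees. Note that sign conventions for interaction information differ across the literature.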