Code covered by the BSD License  

Feature selector based on genetic algorithms and information theory.


4.0 (5 ratings) | 57 downloads (last 30 days) | File size: 3.16 KB | File ID: #29553

27 Nov 2010 (updated)

The algorithm performs the combinatorial optimization by using Genetic Algorithms.


File Information

Techniques from information theory are commonly used to select variables in time-series prediction or pattern recognition. These tasks involve, directly or indirectly, maximizing the mutual information between input and output data. However, this procedure is computationally expensive, because calculating the joint entropy requires estimating the joint probability distributions. To avoid this cost, variable selection can instead be based on the principle of minimum redundancy/maximum relevance, which maximizes the mutual information indirectly at lower computational cost. Even so, the combinatorial optimization, i.e. checking all possible combinations of variables, still represents a large computational effort. Because of this cost, previous works proposed a simple incremental-search method that reaches a quasi-optimal solution. Given the limitations of the existing methods, this code was developed to perform the combinatorial optimization by using Genetic Algorithms.

The arguments are the desired number of selected features (feat_numb), a matrix X in which each column is a feature-vector example, and the respective target data y, which is a row vector. The output is a vector with the indexes of the features that compose the optimum feature set; the order of the features has NO relation to their importance.

In case of publication, please cite the original work: O. Ludwig and U. Nunes, "Novel Maximum-Margin Training Algorithms for Supervised Neural Networks," IEEE Transactions on Neural Networks, vol. 21, issue 6, pp. 972-984, Jun. 2010, where this algorithm is applied to choose the hidden neurons that compose a hybrid neural network named ASNN.
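Based on the description above, a call might look like the following sketch. The function name GA_feature_selector appears in the error trace below; the exact argument order and data layout are assumptions and should be checked against the file itself.

```matlab
% Hypothetical usage sketch; argument order (feat_numb, X, y) is assumed
% from the description and may need to be verified against the source.
X = rand(10, 200);                 % 10 candidate features, 200 examples (one column per example)
y = double(rand(1, 200) > 0.5);    % target data as a row vector
feat_numb = 4;                     % desired number of selected features

best_feats = GA_feature_selector(feat_numb, X, y);

% best_feats holds the indexes of the selected features; as noted above,
% their order carries no information about importance.
X_reduced = X(best_feats, :);      % keep only the selected feature rows
```

If a dimension-mismatch error occurs (see the comments below), transposing the inputs, i.e. passing X' and y', may resolve it.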

MATLAB release: MATLAB 7 (R14)
Comments and Ratings (10)
08 Jun 2014 Dmitry Kaplan

Can you please explain the meaning of resolucao=15? Why 15?

16 Jun 2012 Nermine

Could you please send a description of the GA method used? Thanks.

31 May 2012 Alsam  
21 May 2012 sam

Ali, use the transposed X and y, i.e. X', y'. Hope it helps.

01 Apr 2012 ali Abusnina


I am facing difficulty using the code. I am running MATLAB 7 on Mac OS. When I call the function, I get the following error:

??? Error using ==> vertcat
CAT arguments dimensions are not consistent.
Error in ==> statistics at 9
Error in ==> GA_feature_selector at 19

Can anyone help, please?


05 Feb 2012 rekoba

Thanks for your code, but please provide an example of how to run it.

08 Jul 2011 Oswaldo Ludwig

Dear Mohamed,

The approach depends on your application. In the case of object detection, the usual approach is to use an image descriptor, e.g. HOG, before the feature selection.

08 Jul 2011 Peer Mohamed

How do I use this function for images?

24 Mar 2011 CarloG  
11 Sep 2012

Only the description.
