Code covered by the BSD License  

3.7 | 7 ratings | 59 downloads (last 30 days) | File size: 3.16 KB | File ID: #29553

Feature selector based on genetic algorithms and information theory.

by Oswaldo Ludwig

27 Nov 2010 (Updated 11 Sep 2012)

The algorithm performs the combinatorial optimization using Genetic Algorithms.


File Information
Description

Techniques from information theory are commonly used to select variables in time-series prediction or pattern recognition. These tasks involve, directly or indirectly, maximizing the mutual information between input and output data. However, this procedure requires a high computational effort, due to the calculation of the joint entropy, which in turn requires estimating the joint probability distributions. To avoid this effort, it is possible to apply variable selection based on the minimum-redundancy/maximum-relevance principle, which maximizes the mutual information indirectly, at a lower computational cost. However, the combinatorial optimization problem, i.e. checking all possible combinations of variables, still represents a large computational effort. Because of this cost, previous works proposed a simple incremental search that reaches a quasi-optimal solution. Given the limitations of the existing methods, this code was developed to perform the combinatorial optimization using Genetic Algorithms.

The arguments are the desired number of selected features (feat_numb), a matrix X, in which each column is a feature vector example, and the respective target data y, which is a row vector. The output is a vector with the indexes of the features that compose the optimum feature set; the order of the features has NO relation to their importance.

In case of publication, please cite the original work: O. Ludwig and U. Nunes, "Novel Maximum-Margin Training Algorithms for Supervised Neural Networks," IEEE Transactions on Neural Networks, vol. 21, no. 6, pp. 972-984, Jun. 2010, where this algorithm is applied to choose the hidden neurons that compose a hybrid neural network named ASNN.
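A minimal usage sketch, assuming the entry-point function is named GA_feature_selector (as in the error trace quoted in the comments below) and that the arguments follow the order in which the description lists them; the toy data are purely illustrative:

% Toy data: 300 examples of 20 candidate features.
% Each column of X is one feature-vector example; y is a row vector of targets.
X = randn(20, 300);
y = double(sum(X(1:3, :), 1) > 0);   % target depends only on the first 3 features

feat_numb = 5;                       % desired number of selected features

% Returns the indexes of the selected features; their order carries no
% information about their relative importance.
selected = GA_feature_selector(feat_numb, X, y);
disp(selected)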

MATLAB release: MATLAB 7 (R14)
Comments and Ratings (14)
14 Nov 2014 Oswaldo Ludwig


Dear Arshi,

Pressão means pressure in my mother tongue (Portuguese). You can set the selective pressure of the GA through this variable; see Equation (10) of:
https://www.researchgate.net/publication/235687343_Improving_the_Generalization_Capacity_of_Cascade_Classifiers
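For readers who hit the same question: below is a generic sketch of how a selective-pressure parameter typically enters linear-ranking selection. This is the standard textbook formula, not necessarily Equation (10) of the paper above, and the variable names are illustrative only.

% Linear-ranking selection: rank-based fitness for N individuals,
% where r = 1 is the least fit and r = N is the fittest.
N  = 10;                                    % population size (illustrative)
sp = 1.8;                                   % selective pressure, 1 <= sp <= 2
r  = 1:N;                                   % ranks
f  = 2 - sp + 2*(sp - 1)*(r - 1)/(N - 1);   % ranked fitness values
p  = f / sum(f);                            % selection probabilities (sum to 1)
% A larger sp concentrates selection probability on the best-ranked individuals.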

14 Nov 2014 arshi


Dear Oswaldo,

Can you please explain the meaning of 'Pressao'? There are a number of variables whose meaning is difficult to guess, so could you please provide an outline of the algorithm used in the code?

24 Sep 2014 Oswaldo Ludwig


Mahyar,

I'm sorry you aren't able to read/interpret the file description: "... The arguments are the desired number of selected features (feat_numb), a matrix X, in which each column is a feature vector example...".

24 Sep 2014 mahyar


Dear Oswaldo,
What is the meaning of the "15" in Hy=entropia2([y;zeros(1,C)],15)?
Moreover, the dimension of y does not match zeros(1,C), because C equals the number of features whereas the length of y equals the number of input pairs. So there is a dimension mismatch in the vertcat!
How can we solve this problem?

08 Jun 2014 Dmitry Kaplan

Can you please explain the meaning of resolucao=15? Why 15?

16 Jun 2012 Nermine

Could you please send a description of the GA method used? Thanks.

31 May 2012 Alsam


 
31 May 2012 Alsam


 
21 May 2012 sam


Ali, use the transposed X and y, i.e. X', y'. Hope it helps.
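In other words, if your data are laid out with one example per row (X is N-by-d and y is N-by-1), the call would look roughly like this. GA_feature_selector and feat_numb are the names used in the description and in the error trace quoted below; the argument order is assumed from the description.

% Transpose so that each column of X is one example and y is a row vector:
selected = GA_feature_selector(feat_numb, X', y');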

01 Apr 2012 ali Abusnina

Hi

I am facing difficulty using the code. I am running MATLAB 7 on Mac OS. When I call the function, I get the following error:

"
??? Error using ==> vertcat
CAT arguments dimensions are not consistent.
Error in ==> statistics at 9
Hy=entropia2([y;zeros(1,C)],15);
Error in ==> GA_feature_selector at 19
[Hx,Hy,MIxy,MIxx]=statistics(X,y);
"

Can anyone help, please?

Thanks

05 Feb 2012 rekoba


Thanks for your code, but please provide an example showing how to run it. Thanks.

08 Jul 2011 Oswaldo Ludwig


Dear Mohamed,

The approach depends on your application. In the case of object detection, the usual approach is to apply an image descriptor, e.g. HOG (see http://www.mathworks.com/matlabcentral/fileexchange/28689-hog-descriptor-for-matlab), before the feature selection.
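A rough sketch of that pipeline, assuming the HOG function from the linked submission takes an image matrix and returns one descriptor per image as a column vector; the file names, labels, and feat_numb below are placeholders:

% Stack one HOG descriptor per column, then run the selector on the result.
files  = {'pos1.png', 'neg1.png'};        % hypothetical image files
labels = [1 0];                           % row vector of targets, one per image
X = [];
for k = 1:numel(files)
    Im = imread(files{k});
    if size(Im, 3) == 3, Im = rgb2gray(Im); end   % convert to grayscale if needed
    X = [X, HOG(double(Im))];             % HOG assumed to return a column vector
end
feat_numb = 20;                           % desired number of selected features
selected  = GA_feature_selector(feat_numb, X, labels);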

08 Jul 2011 Peer Mohamed

How can I use this function for images?

24 Mar 2011 CarloG


 
Updates
11 Sep 2012

Only the description.
