MutualInformation: returns mutual information (in bits) of discrete variables 'X' and 'Y'
I = MutualInformation(X,Y);
I = calculated mutual information (in bits)
X = variable(s) to be analyzed (column vector)
Y = variable to be analyzed (column vector)
Note 1: Multiple variables may be handled jointly as columns in matrix 'X'.
Note 2: Requires the 'Entropy' and 'JointEntropy' functions.
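The routine computes mutual information from the identity I(X;Y) = H(X) + H(Y) - H(X,Y), which is why it depends on the 'Entropy' and 'JointEntropy' functions. A minimal sketch of the same calculation in Python/NumPy for illustration (the function names `entropy` and `mutual_information` are mine, not part of the original MATLAB code):

```python
import numpy as np

def entropy(x):
    # Shannon entropy (in bits) of a discrete, integer-coded sample.
    _, counts = np.unique(x, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def mutual_information(x, y):
    # I(X;Y) = H(X) + H(Y) - H(X,Y), all in bits.
    xy = np.stack([x, y], axis=1)               # joint samples as rows
    _, joint_counts = np.unique(xy, axis=0, return_counts=True)
    p = joint_counts / joint_counts.sum()
    h_xy = -np.sum(p * np.log2(p))
    return entropy(x) + entropy(y) - h_xy
```

For example, a variable has 1 bit of mutual information with itself when it takes two equiprobable values, and 0 bits with an independent variable.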
I think there is something wrong here. The results are clearly incorrect.
Please, can someone email me? I need help: I want the full code for 'Mutual Information' and 'Information Gain', and also instructions for accessing the datasets. My email is email@example.com
Can this code generate a mutual information matrix for each feature column to every other feature column?
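Yes, in principle: since mutual information is symmetric, you can call the routine on every pair of columns and fill a symmetric matrix. A self-contained sketch (the helper `_mi_bits` re-implements the plug-in estimate in Python for illustration; it is not part of the original submission):

```python
import numpy as np

def _mi_bits(x, y):
    # Plug-in estimate of I(X;Y) in bits for integer-coded samples,
    # via H(X) + H(Y) - H(X,Y).
    def h(a):
        _, c = np.unique(a, axis=0, return_counts=True)
        p = c / c.sum()
        return -np.sum(p * np.log2(p))
    return h(x) + h(y) - h(np.stack([x, y], axis=1))

def mi_matrix(data):
    # data: (n_samples, n_features) integer matrix.
    n = data.shape[1]
    M = np.zeros((n, n))
    for i in range(n):
        for j in range(i, n):          # MI is symmetric: fill both halves
            M[i, j] = M[j, i] = _mi_bits(data[:, i], data[:, j])
    return M
```

The diagonal entries equal the entropy of each column, since I(X;X) = H(X).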
Those who cannot find the Entropy.m and JointEntropy.m files should search the MATLAB File Exchange. These files are available there, by the same author.
Where can I find the 'Entropy' and 'JointEntropy' functions? MutualInformation does not seem to run without them.
By "discrete", I mean integer values (intended to act as codes for distinct symbols).
This routine is intended for use with discrete variables, not continuous ones.
I am not sure whether I ran the code correctly or there is a bug in the code.
I ran the following and got the answers below. They do not seem right to me: A and B are completely independent random numbers, which I verified.
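One likely explanation (assuming the inputs really are independent): the plug-in estimate H(X) + H(Y) - H(X,Y) is biased upward on finite samples, by roughly (|X|-1)(|Y|-1)/(2N ln 2) bits for N samples, so independent variables report a small positive MI that shrinks as N grows. A sketch demonstrating this (the function `mi_bits` is my own illustrative re-implementation, not the original code):

```python
import numpy as np

rng = np.random.default_rng(0)

def mi_bits(x, y):
    # Plug-in MI estimate in bits, as H(X) + H(Y) - H(X,Y).
    def h(a):
        _, c = np.unique(a, axis=0, return_counts=True)
        p = c / c.sum()
        return -np.sum(p * np.log2(p))
    return h(x) + h(y) - h(np.stack([x, y], axis=1))

# Two independent 10-symbol variables: the true MI is 0.
est_small = mi_bits(rng.integers(0, 10, 100), rng.integers(0, 10, 100))
est_large = mi_bits(rng.integers(0, 10, 100_000), rng.integers(0, 10, 100_000))

print(f"N=100:    {est_small:.4f} bits")   # noticeably above 0 (bias)
print(f"N=100000: {est_large:.4f} bits")   # close to 0
```

So a nonzero result for independent A and B is not necessarily a bug; with small samples and many symbol values, the upward bias can be substantial.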
I have updated the description to clarify the use of this routine.