File Exchange

Mutual Information

version 1.2 (1.33 KB) by Will Dwinnell

Calculates the mutual information between two discrete variables (or a group and a single variable).


MutualInformation: returns mutual information (in bits) of discrete variables 'X' and 'Y'

I = MutualInformation(X,Y);

I = calculated mutual information (in bits)
X = variable(s) to be analyzed (column vector)
Y = variable to be analyzed (column vector)

Note 1: Multiple variables may be handled jointly as columns in matrix 'X'.
Note 2: Requires the 'Entropy' and 'JointEntropy' functions.
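
For orientation, the quantity returned is the standard mutual information I(X;Y) = H(X) + H(Y) - H(X,Y). The submitted routine builds on the author's Entropy and JointEntropy functions (see Note 2); the sketch below computes the same identity in a self-contained way and is purely illustrative (the data, sample size, and code ranges are made up here), not the submitted file.

% Self-contained sketch of I(X;Y) = H(X) + H(Y) - H(X,Y) for integer-coded
% column vectors (illustration only; not the submitted implementation).
X = ceil(3 * rand(1000,1));                       % example codes 1..3
Y = ceil(2 * rand(1000,1));                       % example codes 1..2
Pxy = accumarray([X Y], 1) / numel(X);            % estimated joint distribution
Px  = sum(Pxy, 2);                                % marginal of X
Py  = sum(Pxy, 1);                                % marginal of Y
Hx  = -sum(Px(Px > 0)   .* log2(Px(Px > 0)));     % H(X) in bits
Hy  = -sum(Py(Py > 0)   .* log2(Py(Py > 0)));     % H(Y) in bits
Hxy = -sum(Pxy(Pxy > 0) .* log2(Pxy(Pxy > 0)));   % H(X,Y) in bits
I   = Hx + Hy - Hxy                               % mutual information in bits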

Comments and Ratings (8)

Duo Hao

good job

Batuhan

Those who cannot find Entropy.m and JointEntropy.m should search the MATLAB File Exchange; these files are available there, by the same author.

Debanjan

Where can I find the 'Entropy' and 'JointEntropy' functions? MutualInformation does not seem to run without them.

Will Dwinnell

By "discrete", I mean integer values (intended to act as codes for distinct symbols).

Will Dwinnell

This routine is intended for use with discrete variables, not continuous ones.
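
In practice this means binning continuous data into integer codes before calling the routine. Below is a minimal sketch, assuming an arbitrary 10-bin equal-width discretization (the bin count is a free choice, not part of the submission). Without such binning, every sample of a continuous variable becomes a unique symbol, which is why the rand example in the comment below returns log2(100), about 6.6439, regardless of which inputs are paired.

A = rand(100,1);                 % continuous data, as in the example below
B = rand(100,1);
Ad = ceil(10 * A);               % discretize into integer codes 1..10
Bd = ceil(10 * B);               % (10 bins is an arbitrary choice)
MutualInformation(Ad, Bd)        % near 0 for independent A and B
MutualInformation(Ad, Ad)        % equals Entropy(Ad), the largest possible value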

Jun

I am not sure whether I ran the code correctly or there is a bug in the code.

I ran the following and got the answers below, which do not seem right to me. A and B are completely independent random numbers, which I verified.

>> A=rand(1,100);
>> B=rand(1,100);
>> MutualInformation(A',A')

ans =

6.6439

>> MutualInformation(B',B')

ans =

6.6439

>> MutualInformation(A',B')

ans =

6.6439

Good Job

Updates

1.2

I have updated the description to clarify the use of this routine.

MATLAB Release
MATLAB 6.0 (R12)
