File Exchange

## Mutual Information

version 1.2 (1.33 KB)

Calculates the mutual information between two discrete variables (or a group and a single variable).

MutualInformation: returns mutual information (in bits) of discrete variables 'X' and 'Y'

I = MutualInformation(X,Y);

I = calculated mutual information (in bits)
X = variable(s) to be analyzed (column vector)
Y = variable to be analyzed (column vector)

Note 1: Multiple variables may be handled jointly as columns in
matrix 'X'.
Note 2: Requires the 'Entropy' and 'JointEntropy' functions.
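Since the routine depends on 'Entropy' and 'JointEntropy', it presumably computes I(X;Y) = H(X) + H(Y) - H(X,Y). A minimal Python sketch of that identity for discrete samples (illustrative only, not the File Exchange code; function names are my own):

```python
from collections import Counter
from math import log2

def entropy(xs):
    """Shannon entropy in bits of a discrete sample (plug-in estimate)."""
    n = len(xs)
    return -sum((c / n) * log2(c / n) for c in Counter(xs).values())

def mutual_information(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), estimated from paired samples."""
    joint = entropy(list(zip(xs, ys)))  # each (x, y) pair is one joint symbol
    return entropy(xs) + entropy(ys) - joint
```

For example, `mutual_information([1, 1, 2, 2], [1, 2, 1, 2])` is 0 bits (the variables are independent), while `mutual_information([1, 2, 1, 2], [1, 2, 1, 2])` is 1 bit, the full entropy of the variable.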

### Duo Hao

good job

### Batuhan

Those who cannot find the Entropy.m and JointEntropy.m files should search the MATLAB File Exchange. These files are available there, by the same author.

### Debanjan

Where can I find the 'Entropy' and 'JointEntropy' functions? MutualInformation does not seem to run without them.

### Will Dwinnell

By "discrete", I mean integer values (intended to act as codes for distinct symbols).

### Will Dwinnell

This routine is intended for use with discrete variables, not continuous ones.

### Jun

I am not sure whether I am running the code correctly or there is a bug in it.

I ran the following and got the answers below, which do not seem right to me. A and B are completely independent random vectors, which I verified.

>> A=rand(1,100);
>> B=rand(1,100);
>> MutualInformation(A',A')

ans =

6.6439

>> MutualInformation(B',B')

ans =

6.6439

>> MutualInformation(A',B')

ans =

6.6439
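These 6.6439 values are consistent with feeding continuous data to a discrete plug-in estimator: `rand(1,100)` gives 100 distinct floats, so each value becomes its own symbol and the entropy is log2(100) ≈ 6.6439 bits; every (A, B) pair is likewise unique, so the joint entropy is also log2(100), and all three calls saturate at log2(100) rather than reporting real dependence. A Python sketch of the effect (assuming, as above, that the routine counts each distinct value as one symbol):

```python
import random
from collections import Counter
from math import log2

def entropy(xs):
    """Shannon entropy in bits of a discrete sample (plug-in estimate)."""
    n = len(xs)
    return -sum((c / n) * log2(c / n) for c in Counter(xs).values())

random.seed(0)
a = [random.random() for _ in range(100)]  # 100 distinct floats
b = [random.random() for _ in range(100)]

h_a = entropy(a)                    # every value unique -> log2(100) ≈ 6.6439
h_ab = entropy(list(zip(a, b)))     # every pair unique  -> also log2(100)
mi = entropy(a) + entropy(b) - h_ab # collapses to log2(100), as Jun observed
```

This is why the routine is restricted to genuinely discrete (integer-coded) variables; continuous inputs must be binned or quantized first.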

### Marcello Costa

Good Job

3 Jan 2012, version 1.2: I have updated the description to clarify the use of this routine.
MATLAB 6.0 (R12)