File Exchange


Mutual Information

version 1.2 (1.33 KB) by Will Dwinnell

Calculates the mutual information between two discrete variables (or a group and a single variable).




MutualInformation: returns mutual information (in bits) of discrete variables 'X' and 'Y'

I = MutualInformation(X,Y);

I = calculated mutual information (in bits)
X = variable(s) to be analyzed (column vector)
Y = variable to be analyzed (column vector)

Note 1: Multiple variables may be handled jointly as columns in matrix 'X'.
Note 2: Requires the 'Entropy' and 'JointEntropy' functions.
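The routine rests on the standard identity I(X;Y) = H(X) + H(Y) − H(X,Y), which is why it requires the 'Entropy' and 'JointEntropy' functions. A minimal plug-in sketch of that identity in Python (the `entropy` and `mutual_information` names are illustrative, not the File Exchange functions), assuming the inputs are sequences of integer codes:

```python
from collections import Counter
from math import log2

def entropy(symbols):
    # Empirical (plug-in) entropy in bits of a sequence of discrete symbols.
    n = len(symbols)
    return -sum((c / n) * log2(c / n) for c in Counter(symbols).values())

def mutual_information(x, y):
    # I(X;Y) = H(X) + H(Y) - H(X,Y), all in bits.
    # Joint symbols are (x_i, y_i) pairs, mirroring columns of [X Y].
    return entropy(x) + entropy(y) - entropy(list(zip(x, y)))
```

For two copies of the same variable the estimate equals H(X) (1 bit for a balanced binary code); for independent variables it tends to 0.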

Comments and Ratings (8)

Duo Hao

good job


Those who cannot find the Entropy.m and JointEntropy.m files should search the MATLAB File Exchange; they are available there, from the same author.


Where can I find the 'Entropy' and 'JointEntropy' functions? MutualInformation does not seem to run without them.

Will Dwinnell

By "discrete", I mean integer values (intended to act as codes for distinct symbols).

Will Dwinnell

This routine is intended for use with discrete variables, not continuous ones.
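Because the routine expects integer codes, continuous measurements should be discretized before calling it. A hypothetical equal-width binning helper in Python (the bin count and the `to_codes` name are assumptions, not part of the submission):

```python
def to_codes(values, n_bins=10):
    # Map continuous values to integer bin indices (equal-width bins),
    # so that symbols can repeat and empirical frequencies are meaningful.
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_bins or 1.0  # guard against constant input
    return [min(int((v - lo) / width), n_bins - 1) for v in values]
```

The clamp on the last bin keeps the maximum value from spilling into an out-of-range index.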


Jun

I am not sure whether I ran the code incorrectly or there is a bug in the code.

I ran the following and got the answers below, which do not seem right to me. A and B are completely independent random numbers, which I verified.

>> A=rand(1,100);
>> B=rand(1,100);
>> MutualInformation(A',A')
ans =

>> MutualInformation(B',B')
ans =

>> MutualInformation(A',B')
ans =

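The results above are actually the expected behavior of a discrete plug-in estimator fed raw rand output: all 100 floats are distinct, so each value, and each (A,B) pair, is its own symbol. Then H(A) = H(B) = H(A,B) = log2(100), and I = H(A) + H(B) − H(A,B) = log2(100) ≈ 6.64 bits in every case, even though A and B are independent. A Python sketch of the same effect (the `entropy` helper is an illustrative plug-in estimator, not the author's code):

```python
from collections import Counter
from math import log2
import random

def entropy(symbols):
    # Empirical entropy in bits of a sequence of discrete symbols.
    n = len(symbols)
    return -sum((c / n) * log2(c / n) for c in Counter(symbols).values())

random.seed(0)
a = [random.random() for _ in range(100)]  # 100 distinct floats
b = [random.random() for _ in range(100)]

# Every value and every (a_i, b_i) pair occurs exactly once, so the
# plug-in estimate collapses to log2(100) regardless of independence.
i_ab = entropy(a) + entropy(b) - entropy(list(zip(a, b)))
```

Binning the values into a small set of integer codes first, so that symbols repeat, avoids this degenerate case.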
Good Job



I have updated the description to clarify the use of this routine.

MATLAB Release
MATLAB 6.0 (R12)
