
Highlights from
Information Theory Toolbox

  • condEntropy(x, y): Compute conditional entropy z = H(x|y) of two discrete variables x and y.
  • entropy(x): Compute entropy z = H(x) of a discrete variable x.
  • jointEntropy(x, y): Compute joint entropy z = H(x,y) of two discrete variables x and y.
  • mutInfo(x, y): Compute mutual information I(x,y) of two discrete variables x and y.
  • nmi(x, y): Compute normalized mutual information I(x,y)/sqrt(H(x)*H(y)) of two discrete variables x and y.
  • nvi(x, y): Compute normalized variation of information z = 1 - I(x,y)/H(x,y) of two discrete variables x and y.
  • relatEntropy(x, y): Compute relative entropy (KL divergence) of two discrete variables x and y.
  • demo.m: Demos for ch01.

4.0 | 5 ratings | 89 Downloads (last 30 days) | File Size: 4.25 KB | File ID: #35625 | Version: 1.0

Information Theory Toolbox

by Mo Chen

13 Mar 2012 (Updated 07 Mar 2016)

Functions for information theory, such as entropy, mutual information, KL divergence, etc.


File Information
Description

This toolbox contains functions for DISCRETE random variables to compute the following quantities:
1) Entropy
2) Joint entropy
3) Conditional entropy
4) Relative entropy (KL divergence)
5) Mutual information
6) Normalized mutual information
7) Normalized variation of information
This package is now part of the PRML toolbox (http://www.mathworks.com/matlabcentral/fileexchange/55826-pattern-recognition-and-machine-learning-toolbox).
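
A minimal usage sketch (signatures taken from the highlights above; inputs are samples of discrete variables coded as positive integers):

x = randi(4, 1, 1000);      % 1000 samples of a discrete variable over symbols 1..4
y = randi(4, 1, 1000);

Hx   = entropy(x);          % 1) entropy H(x)
Hxy  = jointEntropy(x, y);  % 2) joint entropy H(x,y)
Hxgy = condEntropy(x, y);   % 3) conditional entropy H(x|y)
KL   = relatEntropy(x, y);  % 4) relative entropy (KL divergence)
MI   = mutInfo(x, y);       % 5) mutual information I(x,y)
NMI  = nmi(x, y);           % 6) normalized mutual information
NVI  = nvi(x, y);           % 7) normalized variation of information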

Acknowledgements

Normalized Mutual Information and Pattern Recognition And Machine Learning Toolbox inspired this file.

MATLAB release: MATLAB 7.13 (R2011b)
MATLAB Search Path:
/
/InfoTheory
Comments and Ratings (15)
05 Nov 2015 Karel Mundnich

In the conditional entropy function, you cannot calculate the joint distribution from the marginal distributions. The joint distribution should be one of the arguments of the function.
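
For reference, a minimal sketch of the identity involved, computed from a given joint distribution (the joint pmf below is made up for illustration):

% H(X|Y) = H(X,Y) - H(Y), computed from a joint pmf Pxy (rows index x, columns index y)
Pxy = [0.2 0.1; 0.1 0.6];                          % example joint distribution
Py  = sum(Pxy, 1);                                 % marginal distribution of y
Hxy = -sum(Pxy(Pxy > 0) .* log2(Pxy(Pxy > 0)));    % joint entropy H(X,Y)
Hy  = -sum(Py(Py > 0) .* log2(Py(Py > 0)));        % marginal entropy H(Y)
HxGivenY = Hxy - Hy;                               % conditional entropy H(X|Y)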

18 Sep 2015 Romesh

I don't think either of the proposed solutions provided by Francesco and Subash is correct. If you have

a=randint(1,1000,[1 5]);
entropy(a)
mutualInformation(a,a)

we know that mathematically these must give the same result. The original code does, whereas Francesco's change doesn't. So simply reversing the order is incorrect.

The underlying error is that the code expects x and y to be positive integers. Rounding a continuous variable will mean that you have valid indices (except if the input has a value that rounds to zero). However, you could consider this as analogous to binning the data, except that if multiple points go into the same bin, that bin will only ever have a value of 1. So I suspect Subash's suggestion also invalidates the calculation.

The real answer is actually provided by the author in the package description: "This toolbox contains functions for discrete random variables". These functions should only be used for DISCRETE variables x and y that contain positive integers. A different approach must be used if one or both of the variables is continuous.

Comment only
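
A minimal check of the identity Romesh invokes, I(X;X) = H(X), assuming this toolbox's entropy and mutInfo are on the path (randi replaces the deprecated randint):

a  = randi(5, 1, 1000);     % discrete positive integers in 1..5
h  = entropy(a);            % H(a)
mi = mutInfo(a, a);         % I(a;a), mathematically equal to H(a)
abs(h - mi) < 1e-12         % expected: true, up to floating-point error
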
02 Sep 2015 Arvind

Hey guys, regarding the sparse function error, which of the answers below (from Francesco and Subash) is correct?

Mx=sparse(idx,1,x,n,k,n);

Or

Mx = sparse(idx, round(x), 1,n,k,n);

These give different results, so only one of them can be correct.

Comment only
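
For context, a hedged sketch of what the disputed line appears to construct (assuming idx enumerates the samples 1:n): sparse(idx, x, 1, n, k) builds an n-by-k indicator matrix with a single 1 per row marking each sample's symbol, which is why moving x into the value slot changes the meaning entirely.

x   = [2 1 3 2];                 % four samples over symbols 1..3
n   = numel(x);  k = max(x);
idx = 1:n;
Mx  = sparse(idx, x, 1, n, k);   % Mx(i, x(i)) = 1: one-hot rows
full(Mx)
% 0 1 0
% 1 0 0
% 0 0 1
% 0 1 0
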
10 Dec 2014 Francesco Onorati

To avoid the error when calling the sparse function, just swap x (and y) with 1:
Mx=sparse(idx,1,x,n,k,n);
Please refer to the help for the sparse function before asking next time :)

Comment only
04 Dec 2014 Anuja Kelkar

Is the output of the conditionalEntropy function a normalized value? I ask because I computed conditional entropy myself with the aid of the mutualInformation function and MATLAB's entropy() method, and got conditional entropy values greater than 1, as expected. However, I am getting all conditional entropy values < 1 using this toolbox's conditionalEntropy() function.

Has the output been normalized?
Please let me know. Thanks.

Comment only
07 Oct 2014 Partha

I got different results using entropy(sig) and wentropy(sig,'shannon'). Can anyone explain this?

Comment only
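
A hedged note on why the two can differ: wentropy(sig,'shannon') applies -sum(s.^2.*log(s.^2)) to the signal values themselves, whereas this toolbox's entropy(x) estimates the Shannon entropy of the empirical distribution of the integer symbols in x, so the two are different quantities and are not expected to agree.

sig = randi(4, 1, 1000);            % integer-coded signal
e1 = entropy(sig);                  % distribution-based Shannon entropy estimate
e2 = -sum(sig.^2 .* log(sig.^2));   % the quantity wentropy's 'shannon' option computes
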
19 Jul 2014 Subash Padmanaban

Change line 16:
Mx = sparse(idx, round(x), 1,n,k,n);

Apply the same changes to all sparse operations if the program throws the same error.

Comment only
10 Dec 2013 Nejc Ilc

Very useful and efficient toolbox, thank you. However, there is a bug in nmi.m; the last line should read:
z = sqrt((MI/Hx)*(MI/Hy));
The output variable is "z", not "v". But this is obviously a typo, so it does not influence my rating.

25 May 2013 shi

??? Error using ==> sparse
Index into matrix must be an integer.

Error in ==> mutualInformation at 16
Mx = sparse(idx,x,1,n,k,n);

Can anybody help me?

Comment only
27 Nov 2012 Maksim

I take back my last comment.

Comment only
27 Nov 2012 Maksim

27 Nov 2012 Maksim

nmi(1:1000,randi(1000,1,1e3))

returns 0.96

nmi(randi(1000,1,1e3),randi(1000,1,1e3))

returns 0.91

Are you sure this is working correctly?
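
A hedged sanity check on the numbers above: plug-in estimates of mutual information are biased upward when the alphabet size is comparable to the sample size, which would explain high nmi values for independent inputs; with many samples per symbol, the estimate drops toward 0.

nmi(randi(10, 1, 1e5), randi(10, 1, 1e5))      % small alphabet, many samples: near 0
nmi(randi(1000, 1, 1e3), randi(1000, 1, 1e3))  % alphabet ~ sample size: inflated by bias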

17 Sep 2012 Zulkifli Hidayat

On Windows 7 64-bit, using MATLAB R2011b 64-bit, I got an error with the following simple code:

x = randn(10,1);
y = randn(10,1);
z = mutualInformation(x,y)

Error message:

Error using sparse
Sparse matrix sizes must be non-negative integers less than MAXSIZE as defined by
COMPUTER. Use HELP COMPUTER for more details.

Error in mutualInformation (line 16)
Mx = sparse(idx,x,1,n,k,n);

Can anybody tell me why?

Comment only
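
A hedged workaround sketch for the error above: as noted above on this page, these functions expect positive-integer symbols, so continuous data must be discretized first. The binning below (5 bins, chosen arbitrarily) uses histc, which exists in R2011b:

x = randn(10, 1);
y = randn(10, 1);
edges = linspace(min(x), max(x), 6);
edges(end) = Inf;                    % fold the maximum into the last bin
[~, xb] = histc(x, edges);           % xb is an integer bin index in 1..5
edges = linspace(min(y), max(y), 6);
edges(end) = Inf;
[~, yb] = histc(y, edges);
z = mutualInformation(xb, yb)        % integer-coded inputs avoid the sparse error
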
12 May 2012 Jeff

03 Apr 2012 ali Abusnina

Is there any way to apply these measures to time series data?
Can anyone help, please?

Comment only
Updates
07 Mar 2016 1.0

Minor tweak, more demos.
