Code covered by the BSD License  

5.0 | 4 ratings | 196 Downloads (last 30 days) | File Size: 3.95 KB | File ID: #35625

Information Theory Toolbox

by Mo Chen

 

Functions for information theory, such as entropy, mutual information, KL divergence, etc.


File Information
Description

This toolbox contains functions for discrete random variables to compute the following quantities (a brief usage sketch follows the description):
1) Entropy
2) Joint entropy
3) Conditional entropy
4) Relative entropy (KL divergence)
5) Mutual information
6) Normalized mutual information
7) Normalized variation of information

This toolbox is a tweaked and bundled version of my previous submissions.
Note: the previous single-function submissions will be removed soon.
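
A minimal usage sketch (my addition, not from the author): the function names are taken from the comments on this page (entropy, conditionalEntropy, mutualInformation, nmi), and the inputs are assumed to be vectors of positive integer labels, since the toolbox targets discrete random variables.

% hypothetical discrete label vectors
x = randi(4, 1, 1000);
y = randi(4, 1, 1000);

Hx  = entropy(x);                  % 1) entropy of x
Hxy = conditionalEntropy(x, y);    % 3) conditional entropy (argument order assumed)
Ixy = mutualInformation(x, y);     % 5) mutual information
Nxy = nmi(x, y);                   % 6) normalized mutual information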

Acknowledgements

The earlier submissions Mutual Information and Normalized Mutual Information inspired this file.

MATLAB release: MATLAB 7.13 (R2011b)
Comments and Ratings (12)
10 Dec 2014 Francesco Onorati

To avoid the error when calling the sparse function, just swap x (and y) with 1:
Mx = sparse(idx, 1, x, n, k, n);
Please refer to the help for sparse before asking next time :)

04 Dec 2014 Anuja Kelkar

Is the output of the conditionalEntropy function a normalized value? I ask because I computed conditional entropy myself using the MutualInformation function and MATLAB's entropy() function, and I got conditional entropy values greater than 1, which was expected. However, I am getting all conditional entropy values < 1 from this toolbox's conditionalEntropy() function.

Has the output been normalized?
Please let me know. Thanks
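
A hedged way to check this (my own sketch, not the author's documentation): conditional entropy is not normalized in general, and for a variable with k labels it is bounded by log(k) (or log2(k), depending on the log base), so values below 1 can simply reflect a small label set. It should satisfy the identity H(X|Y) = H(X) - I(X;Y):

% consistency check, assuming conditionalEntropy(x, y) returns H(X|Y)
x = randi(8, 1, 1000);
y = randi(8, 1, 1000);

lhs = conditionalEntropy(x, y);
rhs = entropy(x) - mutualInformation(x, y);
fprintf('H(X|Y) = %.4f, H(X) - I(X;Y) = %.4f\n', lhs, rhs);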

07 Oct 2014 Partha

I got different results using entropy(sig) and wentropy(sig,'shannon'). Can anyone explain this?
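
A hedged note (my reading, not confirmed by the author): the two functions do not compute the same thing. This toolbox's entropy(sig) appears to estimate the Shannon entropy of the empirical distribution of the discrete values in sig, whereas the Wavelet Toolbox's wentropy(sig,'shannon') computes the unnormalized quantity -sum(sig.^2 .* log(sig.^2)) on the raw signal values, so differing results are expected:

% assumptions: entropy() works on the label distribution; wentropy's
% 'shannon' option follows the Wavelet Toolbox definition
sig = randi(4, 1, 1000);                % discrete labels, no zeros

H_dist   = entropy(sig);                % entropy of the label distribution
E_signal = -sum(sig.^2 .* log(sig.^2)); % what wentropy(sig,'shannon') computes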

19 Jul 2014 Subash Padmanaban

Change line 16 to:
Mx = sparse(idx, round(x), 1, n, k, n);

Apply the same changes to all sparse operations if the program throws the same error.
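
An alternative, hedged sketch (my suggestion, not part of the toolbox): instead of editing the toolbox source, map the inputs to positive integer labels before calling, which is what the failing sparse(idx, x, 1, n, k, n) call needs for its column indices:

% hypothetical non-integer but discrete inputs
x = [0.5 1.5 0.5 2.5];
y = [10 20 10 20];

[~, ~, xi] = unique(x);        % xi holds integer labels 1..numel(unique(x))
[~, ~, yi] = unique(y);

z = mutualInformation(xi, yi); % vector orientation per the toolbox's help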

10 Dec 2013 Nejc Ilc

Very useful and efficient toolbox, thank you. However, there is a bug in nmi.m. The last line should read:
z = sqrt((MI/Hx)*(MI/Hy));
The output variable is "z", not "v". But this is obviously a typo, so it does not influence my rating.

25 May 2013 shi

??? Error using ==> sparse
Index into matrix must be an integer.

Error in ==> mutualInformation at 16
Mx = sparse(idx,x,1,n,k,n);

Can anybody help me?

27 Nov 2012 Maksim

Take back my last comment.

27 Nov 2012 Maksim

nmi(1:1000,randi(1000,1,1e3))

returns 0.96

nmi(randi(1000,1,1e3),randi(1000,1,1e3))

returns 0.91

Are you sure this is working correctly?
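
A hedged explanation (mine, not the author's): with 1000 samples and roughly 1000 distinct labels, nearly every sample is uniquely identified by its label, so the plug-in estimate of mutual information is close to its maximum and the NMI is biased towards 1; this is a finite-sample effect rather than a bug. With far fewer labels than samples, the estimate for independent labelings should drop towards 0:

% sanity check: same sample size, different numbers of labels
n = 1e3;
nmi_many = nmi(randi(1000, 1, n), randi(1000, 1, n)); % about as many labels as samples
nmi_few  = nmi(randi(5, 1, n), randi(5, 1, n));       % few labels, many samples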

17 Sep 2012 Zulkifli Hidayat

On 64-bit Windows 7, using 64-bit MATLAB R2011b, I got an error with the following simple code:

x = randn(10,1);
y = randn(10,1);
z = mutualInformation(x,y)

Error message:

Error using sparse
Sparse matrix sizes must be non-negative integers less than MAXSIZE as defined by
COMPUTER. Use HELP COMPUTER for more details.

Error in mutualInformation (line 16)
Mx = sparse(idx,x,1,n,k,n);

Can anybody tell me why?
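
A hedged sketch (my suggestion, not the author's documented usage): the description says the functions are for discrete random variables, so continuous randn samples need to be binned into integer labels before calling mutualInformation:

% bin continuous data into nbins integer labels (histc is available in R2011b)
x = randn(10, 1);
y = randn(10, 1);
nbins = 4;                                   % arbitrary choice for illustration

[~, xi] = histc(x, linspace(min(x), max(x), nbins + 1));
[~, yi] = histc(y, linspace(min(y), max(y), nbins + 1));
xi = min(max(xi, 1), nbins);                 % fold boundary values into valid bins
yi = min(max(yi, 1), nbins);

z = mutualInformation(xi, yi);               % labels are now positive integers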

03 Apr 2012 ali Abusnina

Is there any way we can apply these measures to time series data?
Can anyone help, please?
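
A hedged sketch (my suggestion, not part of the toolbox): one common approach is to bin the series into integer labels and then compute the mutual information between the series and a lagged copy of itself:

% example series, binned into integer labels
s = sin(0.1 * (1:2000)) + 0.1 * randn(1, 2000);
nbins = 8;
[~, labels] = histc(s, linspace(min(s), max(s), nbins + 1));
labels = min(max(labels, 1), nbins);

lag = 5;                                            % arbitrary lag for illustration
mi_lag = mutualInformation(labels(1:end-lag), labels(1+lag:end));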

