Rank: 962 based on 134 downloads (last 30 days) and 6 files submitted

David Fass

Files Posted by David Fass
03 Dec 2007  KLDIV: Kullback-Leibler or Jensen-Shannon divergence between two distributions.
             Tags: coding theory, information theory, kullbackleibler, jensenshannon, divergence
             Downloads (last 30 days): 91 | Comments: 6 | Rating: 3.8 (6 ratings)

02 Nov 2006  MUTUALINFO: Multiple mutual information (interaction information).
             Tags: coding theory, information theory, information, entropy, mutual information, interact
             Downloads (last 30 days): 9 | Comments: 1 | Rating: 5.0 (1 rating)

02 Nov 2006  ENTROPY: Compute the Shannon entropy of a set of variables.
             Tags: coding theory, information theory, information, entropy, joint entropy, marginal
             Downloads (last 30 days): 9 | Comments: 0 | Rating: (none)

23 Oct 2006  INRANGE: Tests if values are within a specified range (interval).
             Tags: in range, in interval, test range, test interval, range
             Downloads (last 30 days): 14 | Comments: 1 | Rating: 3.0 (1 rating)

10 Oct 2006  initialcaps: Convert a string to initial caps format (initial capitals on all words).
             Tags: strings, initial capital, initial capitals, capitalization, upper case, lower c
             Downloads (last 30 days): 7 | Comments: 1 | Rating: 5.0 (1 rating)

All files authored by David Fass.
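The most-downloaded submission above, KLDIV, computes the Kullback-Leibler or Jensen-Shannon divergence between two discrete distributions. As a rough illustration of the quantities such a function returns (a plain-Python sketch, not Fass's MATLAB code; the names `kl_div` and `js_div` are made up here):

```python
from math import log2

def kl_div(p, q):
    """Kullback-Leibler divergence D(P || Q) in bits.
    Assumes p and q are probability vectors and q[i] > 0 wherever p[i] > 0;
    zero-probability terms contribute nothing (0 * log 0 := 0)."""
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js_div(p, q):
    """Jensen-Shannon divergence: a symmetrized, bounded variant of KL,
    measured against the midpoint distribution M = (P + Q) / 2."""
    m = [0.5 * (pi + qi) for pi, qi in zip(p, q)]
    return 0.5 * kl_div(p, m) + 0.5 * kl_div(q, m)

# Identical distributions have zero divergence.
print(kl_div([0.5, 0.5], [0.5, 0.5]))   # 0.0
# Disjoint distributions reach the JS maximum of 1 bit.
print(js_div([1.0, 0.0], [0.0, 1.0]))   # 1.0
```

Note that KL divergence is asymmetric in its arguments, which is why submissions like this often offer the Jensen-Shannon variant as well.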
Comments and Ratings on David Fass's Files
12 Oct 2012  Comment by Wajahat on KLDIV (Kullback-Leibler or Jensen-Shannon divergence between two distributions):

Thanks a lot. It's a good implementation.

16 Feb 2012  Tomasz Galka on KLDIV (Kullback-Leibler or Jensen-Shannon divergence between two distributions) (no comment text)

18 Oct 2011  Omid G on KLDIV (Kullback-Leibler or Jensen-Shannon divergence between two distributions) (no comment text)

23 Sep 2010  Comment by György Kovács on MUTUALINFO (Multiple mutual information (interaction information)):

Thanks, I needed this function. However, at line 142 I found the call "entropy(subObjMat,pVect);", whereas MATLAB expects only one input argument for the function entropy. Because of this, execution stops with an error message. Could someone please help me work around this problem? Thanks in advance.
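The quoted call suggests a name clash: MATLAB apparently resolved `entropy` to a single-argument function rather than the two-argument helper the submission relies on. For readers unfamiliar with the quantity involved, the Shannon entropy of a discrete distribution can be sketched in a few lines of Python (an illustration only, not the MUTUALINFO code):

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy H(P) in bits of a discrete probability distribution.
    Zero-probability outcomes contribute nothing (0 * log 0 := 0)."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: exactly 1 bit.
print(shannon_entropy([0.5, 0.5]))   # 1.0
```

A certain outcome (all mass on one value) gives zero entropy, and a uniform distribution over n outcomes gives log2(n) bits, the maximum possible.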

01 Jun 2010  Comment by Dushyant Kumar on KLDIV (Kullback-Leibler or Jensen-Shannon divergence between two distributions):

Thanks for writing KLD for two discrete random variables, and thanks for sharing.
