Rank: 1150 based on 104 downloads (last 30 days) and 6 files submitted

David Fass

Files Posted by David Fass

03 Dec 2007 - KLDIV: Kullback-Leibler or Jensen-Shannon divergence between two distributions.
    Tags: coding theory, information theory, kullback-leibler, jensen-shannon, divergence
    Downloads (last 30 days): 70 | Comments: 6 | Rating: 3.8 (6 ratings)

02 Nov 2006 - MUTUALINFO: Multiple mutual information (interaction information).
    Tags: coding theory, information theory, information, entropy, mutual information, interact
    Downloads (last 30 days): 11 | Comments: 1 | Rating: 5.0 (1 rating)

02 Nov 2006 - ENTROPY: Compute the Shannon entropy of a set of variables.
    Tags: coding theory, information theory, information, entropy, joint entropy, marginal
    Downloads (last 30 days): 10 | Comments: 0

23 Oct 2006 - INRANGE: Tests if values are within a specified range (interval).
    Tags: in range, in interval, test range, test interval, range
    Downloads (last 30 days): 8 | Comments: 1 | Rating: 3.0 (1 rating)

10 Oct 2006 - initialcaps: Convert a string to initial caps format (initial capitals on all words).
    Tags: strings, initial capital, initial capitals, capitalization, upper case, lower c
    Downloads (last 30 days): 4 | Comments: 1 | Rating: 5.0 (1 rating)
Comments and Ratings on David's Files

12 Oct 2012 - KLDIV, comment by Wajahat:

Thanks a lot. It's a good implementation.

16 Feb 2012 - KLDIV, comment by Galka, Tomasz

18 Oct 2011 - KLDIV, comment by G, Omid

23 Sep 2010 - MUTUALINFO, comment by Kovács, György:

Thanks, I needed this function. However, at line 142 I found the call "entropy(subObjMat,pVect);", while MATLAB's own entropy function expects only one input argument. Because of this, execution stops with an error message. Could someone please help me work around this problem?
Thanks in advance.
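For reference, the interaction information that MUTUALINFO computes reduces, for three variables, to an inclusion-exclusion sum over joint entropies. A minimal sketch, in Python rather than the submission's MATLAB (the helper names are illustrative, and sign conventions for interaction information vary in the literature):

```python
import math
from collections import Counter

def entropy(samples):
    """Shannon entropy (bits) estimated from a list of symbols or tuples."""
    counts = Counter(samples)
    n = len(samples)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def interaction_info(xs, ys, zs):
    """Interaction information via inclusion-exclusion over joint entropies:
    I(X;Y;Z) = H(X)+H(Y)+H(Z) - H(X,Y) - H(X,Z) - H(Y,Z) + H(X,Y,Z)."""
    return (entropy(xs) + entropy(ys) + entropy(zs)
            - entropy(list(zip(xs, ys)))
            - entropy(list(zip(xs, zs)))
            - entropy(list(zip(ys, zs)))
            + entropy(list(zip(xs, ys, zs))))

# Example: Z = X xor Y with uniform X, Y is purely synergistic; under this
# sign convention the interaction information comes out to -1 bit.
xs, ys = [0, 0, 1, 1], [0, 1, 0, 1]
zs = [x ^ y for x, y in zip(xs, ys)]
print(interaction_info(xs, ys, zs))  # → -1.0
```

Note that the sample-based entropy above takes a single list of observations, unlike MATLAB's built-in entropy, which is the name clash raised in the comment above.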

01 Jun 2010 - KLDIV, comment by Kumar, Dushyant:

Thanks for writing KLD for two discrete random variables, and thanks for sharing.
