No BSD License  

3.0 | 1 rating | 16 Downloads (last 30 days) | File Size: 1.1 KB | File ID: #20689 | Version: 1.0

Jensen-Shannon divergence

by Nima Razavi

13 Jul 2008 (Updated )

Calculates the Jensen-Shannon divergence between two probability distributions


File Information
Description

The .zip file contains two functions, JSDiv.m and KLDiv.m.

JSDiv.m calls KLDiv.m to compute the KL divergence.
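
The files themselves are not reproduced on this page, but as a rough illustration, here is a minimal sketch of how JSDiv.m could combine two KL-divergence calls, based on the standard definition JSD(P,Q) = 0.5*KL(P,M) + 0.5*KL(Q,M) with M = (P+Q)/2. The function names JSDiv_sketch and KLDiv_sketch, the row-wise normalization, and the NaN handling are assumptions, not the author's actual code:

function dist = JSDiv_sketch(P, Q)
% Sketch only: Jensen-Shannon divergence between corresponding rows of P and Q.
    P = P ./ repmat(sum(P,2), [1 size(P,2)]);   % normalize each row to sum to 1
    Q = Q ./ repmat(sum(Q,2), [1 size(Q,2)]);
    M = 0.5 * (P + Q);                          % mixture distribution
    dist = 0.5 * KLDiv_sketch(P, M) + 0.5 * KLDiv_sketch(Q, M);
end

function dist = KLDiv_sketch(P, Q)
% Sketch only: row-wise KL divergence of P from Q.
    T = P .* log(P ./ Q);    % elementwise contributions
    T(isnan(T)) = 0;         % terms with P(i)==0 contribute nothing (0*log(0) = 0)
    dist = sum(T, 2);
end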

For more information on the divergence you can take a look at the following:

MATLAB release: MATLAB 7.4 (R2007a)
Comments and Ratings (2)
24 Jun 2016 Mads Kaern

The code works for me only after making the correction suggested by Kimberly. Note that her correction replaces lines 29-31 of the original KLDiv function, i.e. the last part of the original code containing:

% resolving the case when P(i)==0
dist(isnan(dist))=0;

Does anyone know why Jensen-Shannon divergence is not included as a standard option in Matlab's clustering algorithms?

15 Sep 2009 Kimberly

Hi ... thanks very much for writing this code and taking the time to post it! I had some trouble in the case where, for at least one entry, P and Q are both 0. In this case the last line of KLDiv.m:

% resolving the case when P(i)==0
dist(isnan(dist))=0;

sets the divergence to 0, which is clearly wrong, e.g. for P = [1 0 1], Q = [0 0 1].

I would suggest changing the last few lines to:


Q = Q ./ repmat(sum(Q,2), [1 size(Q,2)]);   % normalize each row of Q to sum to 1
P = P ./ repmat(sum(P,2), [1 size(P,2)]);   % normalize each row of P to sum to 1
M = log(P./Q);                              % elementwise log-ratio
M(isnan(M)) = 0;                            % zero only the 0/0 (NaN) terms
dist = sum(P.*M, 2);
end

which seems to work for me.
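
As a quick sanity check of the suggested lines on the example above (my own walk-through of the values, not code from the submission):

P = [1 0 1]; Q = [0 0 1];
P = P ./ repmat(sum(P,2), [1 size(P,2)]);   % P becomes [0.5 0 0.5]
Q = Q ./ repmat(sum(Q,2), [1 size(Q,2)]);   % Q stays [0 0 1]
M = log(P./Q);                              % [Inf NaN -0.6931]
M(isnan(M)) = 0;                            % only the 0/0 term is zeroed: [Inf 0 -0.6931]
dist = sum(P.*M, 2)                         % Inf, as expected when Q(i)==0 but P(i)>0

whereas the original dist(isnan(dist))=0 line would have returned 0 for this pair.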

