File Exchange

Jensen-Shannon divergence

version 1.0.0.0 (1.1 KB) by Nima Razavi
Calculates the Jensen-Shannon divergence between two probability distributions

Updated 15 Jul 2008

No License

The .zip file contains two functions, JSDiv.m and KLDiv.m.

JSDiv.m uses KLDiv.m for calculation of the KL-divergence.
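The file contents are not reproduced on this page, but the standard relationship the two functions implement can be sketched as follows. This is a minimal sketch under the usual definition JS(P,Q) = 0.5*KL(P,M) + 0.5*KL(Q,M) with mixture M = (P+Q)/2, not the exact code in the .zip; the `_sketch` names and the row-vector convention are assumptions.

```matlab
% Sketch of the Jensen-Shannon divergence built on the KL divergence.
% P and Q are row vectors (or matrices of row distributions) whose rows sum to 1.
function dist = JSDiv_sketch(P, Q)
    M = 0.5*(P + Q);                                  % mixture distribution
    dist = 0.5*KLDiv_sketch(P, M) + 0.5*KLDiv_sketch(Q, M);
end

function dist = KLDiv_sketch(P, Q)
    % KL(P||Q) = sum_i P(i)*log(P(i)/Q(i)), with 0*log(0/q) taken as 0
    T = P .* log(P ./ Q);
    T(P == 0) = 0;                                    % zero contribution where P(i)==0
    dist = sum(T, 2);                                 % sum over columns, one value per row
end
```

Note that the mixture M is never zero where P is nonzero, so JS(P,Q) is always finite and bounded by log(2) when the natural logarithm is used.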

For more information on the divergence, you can take a look at the following:

Cite As

Nima Razavi (2020). Jensen-Shannon divergence (https://www.mathworks.com/matlabcentral/fileexchange/20689-jensen-shannon-divergence), MATLAB Central File Exchange. Retrieved .

Comments and Ratings (2)

Mads Kaern

The code works for me only after making the correction suggested by Kimberly. Note that her correction replaces lines 29-31 in the original KLDiv function; the last part of the original code reads:

% resolving the case when P(i)==0
dist(isnan(dist))=0;

Does anyone know why the Jensen-Shannon divergence is not included as a standard option in MATLAB's clustering algorithms?

Kimberly

Hi ... thanks very much for writing this code and taking the time to post it! I had some trouble in the case where corresponding entries of P and Q are both 0. In this case the last line of KLDiv.m:

% resolving the case when P(i)==0
dist(isnan(dist))=0;

sets the divergence to 0, which is clearly wrong, e.g. for P = [1 0 1], Q = [0 0 1].

I would suggest changing the last few lines to:


Q = Q ./ repmat(sum(Q,2), [1 size(Q,2)]);
P = P ./ repmat(sum(P,2), [1 size(P,2)]);
M = log(P ./ Q);
M(isnan(M)) = 0;
dist = sum(P .* M, 2);
end

which seems to work for me.
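One caveat with the suggested replacement: `M(isnan(M)) = 0` only catches entries where P(i) and Q(i) are both 0. When P(i) == 0 but Q(i) > 0, M(i) is -Inf, and in MATLAB `0 * (-Inf)` evaluates to NaN, so the row sum is NaN again. Masking on `P == 0` handles both zero cases. A minimal sketch along the same lines (the function name KLDiv matches the posted file; the exact signature and body are assumptions):

```matlab
% KL(P||Q) for row-wise distributions P, Q (rows rescaled to sum to 1).
% Convention: P(i)*log(P(i)/Q(i)) contributes 0 whenever P(i)==0,
% and +Inf when P(i)>0 but Q(i)==0.
function dist = KLDiv(P, Q)
    P = P ./ repmat(sum(P,2), [1 size(P,2)]);   % normalise rows of P
    Q = Q ./ repmat(sum(Q,2), [1 size(Q,2)]);   % normalise rows of Q
    M = log(P ./ Q);
    M(P == 0) = 0;          % covers both Q(i)==0 and Q(i)>0 when P(i)==0
    dist = sum(P .* M, 2);
end
```

With this masking, P = [1 0 1], Q = [0 0 1] gives Inf, as expected for a distribution with support outside that of Q.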

Updates

1.0.0.0

Bug fixed.

MATLAB Release Compatibility
Created with R2007a
Compatible with any release
Platform Compatibility
Windows macOS Linux