File ID: #20689 · Version: 1.0 · File Size: 1.1 KB

# Jensen-Shannon divergence

### Nima Razavi

13 Jul 2008

Calculates the Jensen-Shannon divergence between two probability distributions

### Description

The .zip file contains two functions, JSDiv.m and KLDiv.m.

JSDiv.m uses KLDiv.m to compute the KL divergence.
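For readers outside MATLAB, the relationship between the two functions can be sketched in pure Python (standard library only; the function names are illustrative, not taken from the submission): JS(P, Q) = ½·KL(P‖M) + ½·KL(Q‖M), where M = (P + Q)/2 is the midpoint distribution.

```python
import math

def kl_div(p, q):
    """Kullback-Leibler divergence D(p || q), with the convention 0*log(0/q) = 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js_div(p, q):
    """Jensen-Shannon divergence: average KL against the midpoint distribution."""
    sp, sq = sum(p), sum(q)
    p = [x / sp for x in p]   # normalise inputs to probability distributions
    q = [x / sq for x in q]
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    # m[i] >= p[i]/2 and m[i] >= q[i]/2, so kl_div never divides by zero here
    return 0.5 * kl_div(p, m) + 0.5 * kl_div(q, m)
```

Because each KL term is taken against the midpoint, the result is always finite, symmetric, and bounded by log(2).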

For more information on the divergence, see the following:

MATLAB release: MATLAB 7.4 (R2007a)

The code works for me only after making the correction suggested by Kimberly. Note that her correction replaces lines 29-31 in the original KLDiv function. The last part of the original code contains:

% resolving the case when P(i)==0
dist(isnan(dist))=0;

Does anyone know why the Jensen-Shannon divergence is not included as a standard option in MATLAB's clustering algorithms?

### Kimberly

15 Sep 2009

Hi ... thanks very much for writing this code and taking the time to post it! I had some trouble in the case where corresponding entries of P and Q are both 0. In this case the last line of KLDiv.m:

% resolving the case when P(i)==0
dist(isnan(dist))=0;

sets the divergence to 0: the NaN term makes the entire sum NaN, and zeroing that scalar discards the whole result. This is clearly wrong, e.g. for P = [1 0 1], Q = [0 0 1].
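Kimberly's point can be checked by hand. With the usual conventions (0·log(0/q) = 0, and the 0-entries dropping out), the JS divergence of these two vectors is finite and clearly nonzero, worked out here in Python arithmetic:

```python
import math

# Normalised inputs: P = [1 0 1] -> [0.5, 0, 0.5], Q = [0 0 1] -> [0, 0, 1],
# midpoint M = (P + Q)/2 = [0.25, 0, 0.75].
# KL(P||M): the middle entry contributes 0 by the convention 0*log(0/0) = 0.
kl_p_m = 0.5 * math.log(0.5 / 0.25) + 0.5 * math.log(0.5 / 0.75)
# KL(Q||M): only the last entry has Q(i) > 0.
kl_q_m = 1.0 * math.log(1.0 / 0.75)
js = 0.5 * (kl_p_m + kl_q_m)   # approximately 0.2158, not 0
```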

I would suggest changing the last few lines to:

Q = Q ./ repmat(sum(Q,2), [1 size(Q,2)]);
P = P ./ repmat(sum(P,2), [1 size(P,2)]);
M = log(P./Q);
M(isnan(M)) = 0;       % 0/0 entries: by convention 0*log(0/0) = 0
dist = P.*M;
dist(P == 0) = 0;      % 0*log(0/q) = 0; avoids 0*(-Inf) = NaN terms in the sum
dist = sum(dist, 2);
end

which seems to work for me.
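For reference, a zero-aware version of this row-wise computation can be sketched in Python (standard library only; the function name is illustrative): entries with P(i) = 0 contribute nothing, while P(i) > 0 against Q(i) = 0 makes the KL divergence infinite rather than silently zero.

```python
import math

def kl_rows(P, Q):
    """Row-wise KL divergence D(p || q), one value per row pair."""
    dists = []
    for p, q in zip(P, Q):
        sp, sq = sum(p), sum(q)
        p = [x / sp for x in p]    # normalise each row, as the repmat lines do
        q = [x / sq for x in q]
        total = 0.0
        for pi, qi in zip(p, q):
            if pi == 0:
                continue           # 0*log(0/q) = 0 and 0*log(0/0) = 0 by convention
            if qi == 0:
                total = math.inf   # p > 0 against q == 0: KL is infinite
                break
            total += pi * math.log(pi / qi)
        dists.append(total)
    return dists
```

Note that when KLDiv is called from JSDiv against the midpoint distribution, the infinite branch never triggers, since the midpoint is zero only where both inputs are.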
