## Jensen-Shannon divergence

version 1.0.0.0 (1.1 KB) by Nima Razavi


Calculates the Jensen-Shannon divergence between two probability distributions

Updated 15 Jul 2008

The .zip file contains two functions, JSDiv.m and KLDiv.m.

JSDiv.m calls KLDiv.m to compute the KL divergence.
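As background (this is a sketch, not the submitted MATLAB code), the identity that lets a JS-divergence routine be built on a KL routine is JSD(P,Q) = ½·KL(P‖M) + ½·KL(Q‖M), where M = (P+Q)/2 is the average distribution. A minimal Python illustration, with hypothetical function names:

```python
import math

def kl_div(p, q):
    # KL(P||Q) = sum_i p_i * log(p_i / q_i); terms with p_i == 0 contribute 0.
    # Assumes q_i > 0 wherever p_i > 0 (true for the mixture M below).
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js_div(p, q):
    # JSD(P,Q) = 0.5*KL(P||M) + 0.5*KL(Q||M), M the elementwise average.
    # M(i) == 0 only where both P(i) and Q(i) are 0, so kl_div never
    # divides by zero here.
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl_div(p, m) + 0.5 * kl_div(q, m)
```

For identical distributions the divergence is 0; for disjoint supports it reaches its maximum of log(2) (in nats).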

For more information on the divergence you can take a look at the following:

### Mads Kaern

This works for me only after making the correction suggested by Kimberly. Note that her correction replaces lines 29-31 in the original KLDiv function, i.e. the last part of the original code containing:

% resolving the case when P(i)==0
dist(isnan(dist))=0;

Does anyone know why the Jensen-Shannon divergence is not included as a standard option in MATLAB's clustering algorithms?


### Kimberly

Hi ... thanks very much for writing this code and taking the time to post it! I had some trouble in the case where some entry of P and the corresponding entry of Q are both 0. In this case the last lines of KLDiv.m:

% resolving the case when P(i)==0
dist(isnan(dist))=0;

set the entire divergence to 0, which is clearly wrong, e.g. for P = [1 0 1], Q = [0 0 1]: the NaN produced by the shared zero entry makes the whole sum NaN, so the result is zeroed even though the divergence should be infinite (P puts mass where Q has none).

I would suggest changing the last few lines to:

Q = Q ./ repmat(sum(Q,2), [1 size(Q,2)]);
P = P ./ repmat(sum(P,2), [1 size(P,2)]);
M = log(P./Q);
M(isnan(M)) = 0;  % zero the NaN log-ratios (0/0 entries) element-wise, before summing
dist = sum(P.*M, 2);
end

which seems to work for me.
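For reference, the zero-handling convention this thread converges on can also be written out explicitly rather than via NaN patching. A minimal Python sketch (illustrative, not part of the submission): entries with P(i) == 0 contribute nothing, and any entry with P(i) > 0 but Q(i) == 0 makes KL(P‖Q) infinite, as in Kimberly's example.

```python
import math

def kl_div(p, q):
    """KL(P||Q) with explicit zero handling:
    - P(i) == 0 contributes 0 (the 0*log(0/q) convention),
    - P(i) > 0 with Q(i) == 0 gives an infinite divergence.
    Assumes p and q are already normalized probability vectors.
    """
    total = 0.0
    for pi, qi in zip(p, q):
        if pi == 0:
            continue          # 0 * log(0/q) taken as 0
        if qi == 0:
            return math.inf   # mass in P where Q has none
        total += pi * math.log(pi / qi)
    return total
```

With Kimberly's example normalized, P = [0.5, 0, 0.5] against Q = [0, 0, 1], this returns infinity rather than 0, which is the behavior the correction above is after.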