File Exchange

version (1.55 KB) by Will Dwinnell
Calculates the sample entropy, in bits, of discrete variables.


Updated 12 Sep 2010


Entropy: Returns entropy (in bits) of each column of 'X'

by Will Dwinnell

H = Entropy(X)

H = row vector of calculated entropies (in bits)
X = data to be analyzed

Note 1: Each distinct value in X is considered a unique value.

Note 2: Estimated entropy values are slightly lower than the true values,
due to finite sample size (the plug-in estimator is negatively biased).


Example:
X = ceil(repmat([2 4 8 16],[1e3,1]) .* rand(1e3,4));
H = Entropy(X)
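The source of the submission is not reproduced on this page. As a rough sketch only, a column-wise plug-in entropy estimator consistent with the description above might look like this (the function body is an assumption, not the submitted code):

```matlab
function H = Entropy(X)
% Entropy  Plug-in estimate of the entropy, in bits, of each column of X.
% Each distinct value in a column is treated as a separate symbol.
[n, m] = size(X);
H = zeros(1, m);
for Column = 1:m
    % Count occurrences of each distinct value, then normalize to frequencies
    Values = unique(X(:,Column));
    Counts = histc(X(:,Column), Values);
    Frequency = Counts / n;
    % H = -sum(p .* log2(p)); every bin is nonempty here, so no zero guard is needed
    H(Column) = -sum(Frequency .* log2(Frequency));
end
```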

Comments and Ratings (5)

piya dung

Very good

Julia Ferre

Do you think we could use this code for images? (The images should be considered as 1D vectors, of course.)

Oh, I would just like to note that the code below will only work for 1D vectors, i.e. Nx1 or 1xN.
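Regarding the image question above: since the function expects a 1D vector, flattening the image first would be one way to call it (this snippet is an illustration, assuming the submission's Entropy function is on the path; cameraman.tif ships with the Image Processing Toolbox):

```matlab
Img = imread('cameraman.tif');   % sample grayscale image
H = Entropy(double(Img(:)))      % flatten the image to an Nx1 vector first
```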

I had serious trouble with the performance of this entropy calculation method. For about 500k samples it takes about 20 seconds to compute the entropy. Here is an alternative entropy function I just wrote for integer signal values (e.g. y = [1 -6011 -3000 2592]):

function ent = EntropyInt(y)
% EntropyInt  Entropy, in bits, of an integer-valued signal y

% First verify that y is truly integer-valued
% (checking each element; a sum check could pass for non-integer inputs)
if any(y ~= round(y))
    error('INTEGER_ENTROPY:InvalidInput', 'Input argument must be integer-valued.')
end

% Generate the histogram, one bin per integer over the range of y
n = hist(y, double(min(y):max(y)));

% Normalize the histogram so it sums to one (an empirical pmf)
n = n / sum(n);

% Calculate the entropy, skipping empty bins (0*log2(0) is taken as 0)
indices = n ~= 0;
ent = -sum(n(indices) .* log2(n(indices)));
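With the missing end of the integer check restored, the commenter's own example signal gives four distinct, equally frequent symbols, so the result should be exactly two bits:

```matlab
y = [1 -6011 -3000 2592];   % four distinct, equally frequent values
ent = EntropyInt(y)         % -> 2 (four equiprobable symbols = 2 bits)
```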

MATLAB Release Compatibility
Created with R12
Compatible with any release
Platform Compatibility
Windows macOS Linux
