# How to estimate non-Gaussianity using negentropy?


### Accepted Answer

William Rose
on 27 May 2021

You will find several questions and answers about negentropy if you search MATLAB Answers for negentropy.

Negentropy is not the reverse of entropy. Suppose your signal is y. The negentropy of the signal, J(y), is the entropy of a Gaussian noise signal with the same mean and variance as y, minus the entropy of y.

J(y) = H(yGauss) - H(y)

where H(y) is the entropy of y and H(yGauss) is the entropy of a Gaussian signal with the same mean and variance as y. A Gaussian signal has the maximum possible entropy for a given variance, so J(y) will be non-negative. The bigger J(y) is, the more non-Gaussian y is. You said you already have code to measure the entropy of your signal. Make a Gaussian signal, yGauss, with the same mean and variance as y, use your code to measure its entropy and the entropy of your signal y, and compute J(y). Here's how to make a Gaussian signal with the same mean and variance as y:

yGauss=randn(size(y))*std(y)+mean(y); %Gaussian signal, same mean and std as y

Good luck.
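Putting the steps above together, a minimal sketch of the whole workflow might look like the following. The `histEntropy` helper is illustrative (a histogram-based entropy estimate, similar in spirit to the code in the answer below); in practice you would substitute your own entropy code.

```matlab
% Sketch of the negentropy workflow. histEntropy is a hypothetical
% helper standing in for whatever entropy estimator you already have.
y = sqrt(12)*(rand(1,1024)-0.5);            % example signal: uniform noise
yGauss = randn(size(y))*std(y) + mean(y);   % Gaussian, same mean and variance

Hy = histEntropy(y);        % entropy of the signal, in bits
HG = histEntropy(yGauss);   % entropy of the matched Gaussian
J  = HG - Hy;               % negentropy: >= 0, larger = more non-Gaussian
fprintf('H(y)=%.4f, H(yGauss)=%.4f, J=%.4f\n', Hy, HG, J);

function H = histEntropy(x)
% Estimate differential entropy (bits) from a histogram estimate of the pdf.
h = histogram(x,'Normalization','pdf');
p = h.Values(h.Values>0);
H = -h.BinWidth*sum(p.*log2(p));
end
```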

### More Answers (1)

William Rose
on 28 May 2021

I figured it out. Here is an example of a signal y, and how to calculate its entropy (Hent) and its negentropy (J). You may get the values for y from a file - as in your code, where you read it from a spreadsheet. In this case, I am generating the signal y by calling the uniform random number generator, rand().

N=1024; %signal length
y=sqrt(12)*(rand(1,N)-.5); %uniform random noise, mean=0, stdev=1
h=histogram(y,'Normalization','pdf'); %estimate f(x)=the p.d.f.
Hent=-h.BinWidth*sum(h.Values(h.Values>0).*log2(h.Values(h.Values>0))); %compute entropy, in bits
J=log2(std(y))+2.0471-Hent; %compute negentropy
fprintf('Uniform random noise: Hent=%.4f, J=%.4f\n',Hent,J); %display results
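For reference, the constant 2.0471 in the negentropy line is the variance-independent part of the differential entropy (in bits) of a Gaussian:

$$H(y_{Gauss}) = \tfrac{1}{2}\log_2\!\left(2\pi e\,\sigma^2\right) = \log_2\sigma + \tfrac{1}{2}\log_2(2\pi e) \approx \log_2\sigma + 2.0471$$

so J = H(yGauss) - Hent reduces to the expression in the code.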

The attached document explains the rationale for the code above.

I am attaching a script which generates four signals: square wave, sine wave, uniform noise, and Gaussian noise, all with the same mean and standard deviation. The probability density of each is displayed (below). It is evident from the plots that the density is most un-Gaussian at the top and gradually becomes Gaussian at the bottom. The entropy and negentropy of each signal are computed using the formulas above. The values are as expected: negentropy is largest for the top plot (most un-Gaussian) and smallest (approximately zero) for the Gaussian noise at the bottom.
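The attached script itself is not reproduced in the thread; a minimal sketch of the idea, with illustrative signal definitions (all mean 0, standard deviation approximately 1), might look like:

```matlab
% Sketch, not the attached script: four signals, same mean and std,
% ordered from most non-Gaussian (square wave) to Gaussian noise.
N = 1024; t = (0:N-1)/N;
sigs = {sign(sin(2*pi*8*t)), ...        % square wave, values +/-1
        sqrt(2)*sin(2*pi*8*t), ...      % sine wave, rms = 1
        sqrt(12)*(rand(1,N)-0.5), ...   % uniform noise
        randn(1,N)};                    % Gaussian noise
names = {'square','sine','uniform','Gaussian'};
for k = 1:4
    y = sigs{k};
    h = histogram(y,'Normalization','pdf');   % estimate the p.d.f.
    p = h.Values(h.Values>0);
    Hent = -h.BinWidth*sum(p.*log2(p));       % entropy, in bits
    J = log2(std(y)) + 2.0471 - Hent;         % negentropy
    fprintf('%-8s: Hent=%.4f, J=%.4f\n', names{k}, Hent, J);
end
```

The expected pattern is that J decreases down the list and is near zero for the Gaussian noise.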
