Reliable and extremely fast kernel density estimator for one-dimensional data;
Gaussian kernel is assumed and the bandwidth is chosen automatically;
Unlike many other implementations, this one is immune to problems
caused by multimodal densities with widely separated modes (see example). The
estimation does not deteriorate for multimodal densities, because we never assume
a parametric model for the data (like those used in rules of thumb).
INPUTS:
data - a vector of data from which the density estimate is constructed;
n - the number of mesh points used in the uniform discretization of the
interval [MIN, MAX]; n has to be a power of two; if n is not a power of two, then
n is rounded up to the next power of two, i.e., n is set to n=2^ceil(log2(n));
the default value of n is n=2^12;
MIN, MAX - defines the interval [MIN,MAX] on which the density estimate is constructed;
the default values of MIN and MAX are:
MIN=min(data)-Range/10 and MAX=max(data)+Range/10, where Range=max(data)-min(data);
OUTPUTS:
bandwidth - the optimal bandwidth (Gaussian kernel assumed);
density - column vector of length 'n' with the values of the density
estimate at the grid points;
xmesh - the grid over which the density estimate is computed;
cdf - column vector of length 'n' with the values of the cdf at the grid points;
If no output is requested, then the code automatically plots a graph of
the density estimate.
Reference:
Kernel density estimation via diffusion
Z. I. Botev, J. F. Grotowski, and D. P. Kroese (2010)
Annals of Statistics, Volume 38, Number 5, pages 2916-2957
doi:10.1214/10-AOS799
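For reference (this formula is not part of the original help text; h denotes the returned bandwidth and X_1,...,X_N the data points), the quantity evaluated on the mesh is the standard Gaussian kernel density estimate:

\hat{f}(x) = \frac{1}{N\sqrt{2\pi h^2}} \sum_{i=1}^{N} \exp\!\left( -\frac{(x - X_i)^2}{2h^2} \right)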
Example (run in command window):
data=[randn(100,1);randn(100,1)*2+35;randn(100,1)+55];
kde(data,2^14,min(data)-5,max(data)+5);
1.5 - corrected the title back to "kernel density estimator"; updated the reference
1.5 - bug fixes: in some rare cases with small 'n', fzero used to fail; the code now deals with these failures
1.5 - the updated version additionally provides a cdf estimator as an output argument
1.4 - published in the Annals of Statistics, 2010; see Section 5
1.3 - as pointed out by Dazhi Jiang in the comments section, the headline comment was missing; it has been restored
1.1 - updated the reference - now a journal paper submitted to the Annals of Statistics
Clark Taylor
Quick bug. If you only ask for one output (the bandwidth), the code throws an error. The problem is the line
"density(density<0)=eps; % remove negatives due to roundoff error"
By moving this line inside the "if (nargout>1)||(nargout==0)" statement, I was able to solve the problem, and the code appears to be working well.
Konstantinos Tsitsilonis
Thank you for this function!
Is there any way to calculate any performance parameter of the distribution, i.e. the ISE, MISE etc.?
Kirill
Karl_the_EE_guy
Thank you, I learned a lot today from the paper, really appreciate it!
Stefano Rognoni
Nuno Fachada
Rashid Mehmood
Hello, everybody!
I am new to MATLAB. I am estimating the density of 100 data points, but it returns a 128 * 128 density matrix. How do I get just the density values at my desired 100 data points?
Arpad Rozsas
Marc Lalancette
Thanks, very useful. Strangely, I get very different results on MATLAB 2011b and 2013b with the same data. On the recent version, the density distribution is smoother and has a stronger tendency not to go to 0 at the ends of the distribution. I'm guessing this is due to changes in a MATLAB function. Any ideas?
Genevieve
Aniket Deshmukh
lefteris
Dear Botev,
my data has no meaning at negative values, but constructing histograms using kde returns frequencies at negative values, and even if I set the lower limit of x to zero, it returns a big value at zero. (I expect my histogram to start like x^2.)
Thanks
Bharath Ramesh
Dear Botev,
I have encountered a problem with your implementation and am seeking your help. The PDFs obtained using translated versions of the signal (an image histogram, in this case) are not the same. For example:
data = [23 23 23 22 22 22 21 22 23];
data2 = [53 53 53 52 52 52 51 52 53];
MIN = 0;
MAX = 255;
n = 256;
[bandwidth,density,xmesh,cdf]=kde(data,256,MIN,MAX)
This gives a good unimodal estimate, whereas the second one is incomprehensible:
[bandwidth,density,xmesh,cdf]=kde(data2,256,MIN,MAX)
Please take a look at the density plots in each case.
This might be a problem with the bandwidth estimation, but I don't know how to solve it.
Any help is appreciated.
Bharath
Nicolas Beuchat
Brilliant! Saves me a lot of computation time and I gain in precision :)
Thanks!
Jiarui
Hi Steven. It is the integral of the pdf that should be 1. So if your x-interval is very small, then the y-values of the pdf can be larger than 1. E.g., a uniform distribution on x=[0,0.01]: then y needs to be 100 to make the integral 1.
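Jiarui's uniform example can be checked numerically; here is a small sketch (written in NumPy purely for illustration, not part of kde.m):

```python
import numpy as np

# Uniform density on [0, 0.01]: the height must be 1/0.01 = 100
# for the area under the curve to equal 1.
x = np.linspace(0.0, 0.01, 1001)  # fine grid on the support
pdf = np.full_like(x, 100.0)      # density values far greater than 1

area = np.trapz(pdf, x)           # numerical integral of the pdf
print(area)                       # ~1.0: the integral is 1, not the values
```

The density values themselves (100 here) exceed 1; only the integral is constrained to be 1.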
Steven Millard
I am using Botev's tools and do not understand why the density function has values greater than one. I am new to KDE and don't understand this yet. I figured a density function is supposed to integrate to 1?
Thanks,
Jiang
good
NGO NHAT KHOA
Hi all, I have a problem with a pdf estimate that needs your help. I am asked to verify that the probability of the rv Z (number of trials), given that there are exactly 2 successes, follows a negative binomial distribution. I generate random numbers many times and record the number of trials required for 2 successes every time. Then, I find the frequency at which each value of Z occurs (e.g., repeating the experiment 1000 times, I find that only 60 times the number of trials is 10 for exactly 2 successes to occur, hence the frequency is 60/1000 = 0.06). After that, I try to estimate the pdf of Z using a kernel and compare with the plot from nbinpdf, available in MATLAB, but the result is terrible. I'm thinking of using the kde function but do not know how to use it. I really appreciate your help.
Amir
Gabriel
Zdravko's kernel density estimator works a lot more quickly than traditional methods, although I am getting spurious artifacts because the selected bandwidth of 0.02 is too low (a third smaller than when I used another selector which minimised the expected L2 loss between the estimate and the underlying density). The latter bandwidth works smoothly but takes a bit longer. Also, I get negative densities at the outliers, so I adjusted the min-max boundaries. Is there a way to alter the estimator to avoid this issue?
Zhang
hi, it's really a fast and robust script. I have a question about what the time complexity (in terms of data size n) is, namely O(n) or O(n^2)? Could someone provide some time complexity analysis? Great thanks~
Rory Staunton
Question: is there any way to incorporate observation weights? I have calculated a weight based on other considerations (measurement error and goodness of fit, e.g.) for each data point in my distribution and want to incorporate this into the density estimate. Thanks in advance.
Babak Abbasi
Thanks, it helps a lot.
Mi Matthew
Thanks a lot! It works very well.
Ekaterina Zaytseva
Excellent script. Very fast and efficient. I have a question: is there a similar script for m-dimensional data (with m>2)?
James Ford
My apologies... I think the 13 Jan 2011 update fixed that crash (the 100 length vector now works).
James Ford
This has been excellent in general. A few times it has crashed at line 57:
t_star=fzero(@(t)fixed_point(t,N,I,a2),[0,.1]);
because "??? Error using ==> fzero at 283
The function values at the interval endpoints must differ in sign."
The data doesn't look obviously bad in these cases. A short example vector is [14.0534 13.2851 13.0951 13.1159 14.2221] (this is a shortened version of a length 100 vector that also crashed).
Zdravko Botev
Due to numerical round-off error from the fft.m function, it is possible to get density values of -1.38e-018 (instead of 0) and cdf values slightly larger than 1.
If this is a problem, one can correct the output from kde by overwriting:
density=max(density,0); cdf=min(1,cdf);
dk
The author fixed the bug and it works without a problem. Good job!
dk
The code crashes at line 57 when length(data) is small, e.g. kde(rand(100,1)) or kde(randn(30,1)).
Zdravko Botev
Dear George, the kde function works as it should. There is no problem with the kde. What you call a problem is actually one of the main strengths of the routine.
By typing data = [d1;d1;d1;d1;d1;d2;d3];
you are creating DISCRETE data, because you create ties (the same values appear multiple times). For truly continuous data, there can be no ties or repeated values!!!
If you have ties, then the data CANNOT be continuous, by definition.
kde.m CORRECTLY recognizes that the data you have provided is perfectly discrete, and since discrete data does not need smoothing, the selected bandwidth should be zero. kde.m is the only routine I am aware of that does this correctly; every other routine fails this BASIC theoretical test.
george. holzwarth
Botev's kernel density estimator works admirably for me, except with weighted data, where the bandwidth selector "fails".
Consider
d1=randn(25,1)+5;
d2=randn(50,1)+8;
d3=randn(100,1)+11;
data=[d1;d2;d3];
kde finds a bandwidth of about 0.6, which is reasonable.
Now weight the first Gaussian 5 times:
data = [d1;d1;d1;d1;d1;d2;d3];
kde now finds a bandwidth of 0.001, which is not reasonable.
Is there a way to enter weighted data sets or change the bandwidth estimator to avoid this problem?
Thanks.
Ángel yustres
Extremely fast and easy to use.
star
Could someone provide me with code for nonparametric Bayesian density estimation using a Dirichlet prior? I am stuck.
Ange
Fantastic script - fast and easy to use!
oluwole ogunkelu
Can someone provide me with the hierarchical token bucket (HTB) algorithm used to optimize bandwidth? Kinda stuck.
David
I am using your above code and my data is plotting density values well over 1 (i.e. >500). I looked at your example:
% Example:
% data=[randn(100,1);randn(100,1)*2+35;randn(100,1)+55];
% [bandwidth,density,xmesh]=kde(data,2^14,min(data)-5,max(data)+5);
% plot(xmesh,density)
but even then, sum(density) = 235.6368, which obviously is greater than 1. It should be 1 if it's a pdf, right?
So, does your code generate a pdf? Or is it scaled in some other way? If it is not a pdf, do you know how to convert it to a pdf? (Do you just normalize by sum(density)?)
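For what it's worth, the raw sum of the returned values depends on the mesh spacing; the quantity that should be (approximately) 1 is the numerical integral over xmesh. A sketch of the check (NumPy, purely for illustration, with a standard normal pdf standing in for the kde output):

```python
import numpy as np

# Stand-in for the (xmesh, density) pair a KDE would return on a fine grid:
# a standard normal pdf evaluated at 2^14 mesh points.
xmesh = np.linspace(-8.0, 8.0, 2**14)
density = np.exp(-xmesh**2 / 2.0) / np.sqrt(2.0 * np.pi)

raw_sum = density.sum()          # grows with grid resolution; far above 1
area = np.trapz(density, xmesh)  # integral weighted by grid spacing: ~1
print(raw_sum, area)
```

Multiplying the raw sum by the grid spacing (xmesh(2)-xmesh(1) in MATLAB terms) gives essentially the same answer of ~1 as the trapezoidal rule.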
Dazhi Jiang
I think the new version just missed the heading line. Please check it. But still a good job. Thanks.
Karin Petrini
Michael Jordan
red
Yes, the method seems to scale the function so that it becomes a pdf. But my data do not represent a pdf. How can I modify the method so that it works for general (non-density) estimation?
Nathanael Yoder
I was incorrect, but there does seem to be a scale factor on the density functions.
Nathanael Yoder
Great code but I believe line 83 should be:
density=idct1d(a_t)/N;
instead of:
density=idct1d(a_t)/R;
in order to get an accurately scaled density function.
Thanks for sharing this code. However, using MATLAB 6.5 R13 I had to debug it: input arguments I, a2, N were not specified for function fixed_point (for example).
Not bad, but this program is only available for 1-d data. It is still useful for some problems. Anyway, thanks for sharing.
Thanks a lot! It's good.
Highly recommend this! Very fast and robust.
Check this out. Much better than the currently available density estimation procedures!