And it’s of interest: does the evolutionary nature of the search algorithm have any advantage over simple random search, in terms of the number of fitness-function evaluations (i.e., computation time) or the quality of the solution?

I believe entropy(rand(10000,1)) should be high, because there is much uncertainty in the values of the variable.

entropy(sin(rand(10000,1))) should definitely equal entropy(cos(rand(10000,1))), and, I suppose, both should be small, because we have less information than with a pure random signal (albeit still a random one)?

Good day! Thanks for the interesting program. Can anyone please explain to me why, given entropy(rand(10000,1)) = 0 (which is fine),
it gives entropy(sin(rand(10000,1))) = 0.9978, BUT entropy(cos(rand(10000,1))) = 0
???
:-)
I believe entropy(sin(rand(10000,1))) should also equal 0... or not? With cos(rand) or sin(rand) signals we definitely have less information than with a pure rand signal?
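One possible explanation for the sin/cos asymmetry (a guess, not the package's documented behavior): if the estimator treats continuous input as discrete states by quantizing it, e.g., rounding to the nearest integer, then sin(rand) straddles 0.5 (two states, entropy near 1 bit), while cos(rand) lies entirely in [cos(1), 1] above 0.5 (one state, entropy 0). A stdlib Python sketch of that hypothesis; the helper `discrete_entropy` is mine, not part of the package:

```python
import math
import random
from collections import Counter

def discrete_entropy(samples):
    """Plug-in Shannon entropy (bits) of a discrete sample."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

random.seed(0)
u = [random.random() for _ in range(10000)]  # analogue of rand(10000,1)

# Hypothesized quantization: round each value to the nearest integer.
# sin(u) in [0, 0.841] splits around 0.5, so rounding yields two states;
# cos(u) in [0.540, 1] always rounds to 1, so there is only one state.
h_sin = discrete_entropy([round(math.sin(x)) for x in u])  # close to 1 bit
h_cos = discrete_entropy([round(math.cos(x)) for x in u])  # exactly 0
```

Under this assumption, P(round(sin(U)) = 1) = 1 - arcsin(0.5) = 1 - pi/6 ≈ 0.476, and the binary entropy of 0.476 is ≈ 0.998 bits, which matches the reported 0.9978 quite well.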

Thanks for the submission.
I found that the precompiled MEX files did not work on my computer (Windows 7 x64), so I had to run "mex -setup" and rebuild the MEX files from the .cpp source files.
To get them to compile, all I did was change every "log(2)" in the .cpp files to "log(2.0)". I appreciate this suggestion from the previous comments here.
Hope this helps someone.


05 Dec 2014

Mutual Information computation
A self-contained package for computing mutual information, joint/conditional probability, entropy
