Mutual information I(X,Y) measures the probabilistic dependence between two random variables X and Y. It is non-negative and equals zero when X and Y are mutually independent. Conditional mutual information I(X,Y|Z) is the expected value of I(X,Y) given the value of Z.
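In integral form, these are the standard definitions (matching the notation above):

```latex
I(X,Y) = \iint p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)}\,\mathrm{d}x\,\mathrm{d}y ,
\qquad
I(X,Y \mid Z) = \mathbb{E}_{z}\!\left[\, I(X,Y \mid Z = z) \,\right].
```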
Data is first copula-transformed, then marginal and joint probability distributions are estimated using Gaussian kernels.
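The two-step estimator described above can be sketched in Python/NumPy as follows. This is an illustrative reconstruction, not the submission's actual code: the function names, the rank-based copula transform, and the rule-of-thumb bandwidth are all assumptions.

```python
import numpy as np

def copula_transform(x):
    """Empirical copula transform: map each sample to its normalized
    rank in (0, 1).  The marginals of the output are approximately
    uniform, so only the dependence structure of the data survives."""
    ranks = np.argsort(np.argsort(x))          # ranks 0 .. n-1
    return (ranks + 1.0) / (len(x) + 1.0)      # strictly inside (0, 1)

def mi_gaussian_kde(x, y, h=None):
    """Rough MI estimate in nats: copula-transform both variables, then
    form Gaussian-kernel estimates of the joint and marginal densities
    and average log(p_xy / (p_x * p_y)) over the samples."""
    n = len(x)
    u, v = copula_transform(x), copula_transform(y)
    if h is None:
        h = n ** (-1.0 / 6.0)                  # rule-of-thumb width (assumption)
    du = (u[:, None] - u[None, :]) / h
    dv = (v[:, None] - v[None, :]) / h
    ku = np.exp(-0.5 * du ** 2)                # Gaussian kernels, marginal in u
    kv = np.exp(-0.5 * dv ** 2)                # Gaussian kernels, marginal in v
    p_u = ku.mean(axis=1)                      # marginal density at each sample
    p_v = kv.mean(axis=1)
    p_uv = (ku * kv).mean(axis=1)              # joint density (product kernel)
    return float(np.mean(np.log(p_uv / (p_u * p_v))))
```

For independent inputs the estimate should hover near zero (with a small positive bias); for strongly dependent inputs such as y = x it is clearly positive.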
Useful in the construction and verification of gene regulatory networks from gene expression data (see e.g. http://www.biomedcentral.com/1471-2105/7/S1/S7). This quantity is robust and can capture non-linear dependencies and indirect interactions in the data.
Mikhail (2021). Kernel estimate for (Conditional) Mutual Information (https://www.mathworks.com/matlabcentral/fileexchange/30998-kernel-estimate-for-conditional-mutual-information), MATLAB Central File Exchange. Retrieved .
Thanks for sharing, it's very helpful. But I want to ask: how do I set the kernel width?
Could someone please tell me what "ind - subset of data on which to estimate MI" means? If I have a single column vector y and a single column vector x, how do I choose ind when I want to calculate the MI between the x and y data sets? Thanks
Are the units of the output in bits, or in nats (log base e)?
Thanks a lot for the great work :)
Is there a paper to reference for the use of these functions?
Thanks for the great work!
Hi Mikhail! Thank you for the submission, it's a clearly written function that works like a charm!
I was wondering if you could refer me to an article on how to set kernel width? I see that you provide a default method, which is pretty good already, but I want to learn more. Thanks!
If you have a vector of states X, build a histogram of it: p = hist(X, number_of_states)/numel(X) (note the normalization is by the number of samples, not the number of states). Then, dropping empty bins to avoid log(0), simply use h = -p(p>0)*log(p(p>0))'
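In Python, the same plug-in recipe looks like this (a sketch: the bin count is a free parameter, and the result is in nats):

```python
import numpy as np

def histogram_entropy(x, bins=10):
    """Plug-in entropy estimate (in nats) from a normalized histogram."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts / counts.sum()      # normalize by the sample size
    p = p[p > 0]                   # drop empty bins: 0*log(0) is taken as 0
    return float(-np.sum(p * np.log(p)))
```

A vector spread uniformly over the bins gives log(bins); a constant vector gives zero.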
Thank you for sharing it. I want to know how to compute the entropy of a single vector?