File Exchange


Kernel estimate for (Conditional) Mutual Information

version 1.0.0.0 (4.63 KB) by Mikhail
Estimates Mutual Information and Conditional Mutual Information between continuous random variables

5 Downloads

Updated 09 Apr 2011


Mutual information I(X,Y) measures the degree of (probabilistic) dependence between two random variables X and Y. It is non-negative and equal to zero exactly when X and Y are independent. Conditional mutual information I(X,Y|Z) is the expected value of I(X,Y) given the value of Z.
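
For reference, the standard definitions (textbook forms, not copied from this submission) are, assuming the densities exist:

I(X,Y)   = integral of p(x,y) * log[ p(x,y) / (p(x) p(y)) ] dx dy
I(X,Y|Z) = E_z[ I(X,Y | Z = z) ]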

Data is first copula-transformed, then marginal and joint probability distributions are estimated using Gaussian kernels.

Useful in the construction and verification of gene regulatory networks from gene expression data (see e.g. http://www.biomedcentral.com/1471-2105/7/S1/S7). The estimate is robust and can detect non-linear dependencies and indirect interactions in the data; a rough illustration of the approach is sketched below.
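
The following is a minimal sketch of that approach, not the submission's own code: the function name mi_kde_sketch, the rank-based copula transform via tiedrank (Statistics Toolbox), and the rule-of-thumb bandwidth are my illustrative assumptions.

function I = mi_kde_sketch(x, y, h)
    n = numel(x);
    if nargin < 3, h = n^(-1/6); end                 % rough rule-of-thumb bandwidth (assumption)
    % Copula transform: replace each value by its normalised rank in (0,1)
    u = tiedrank(x(:)) / (n + 1);
    v = tiedrank(y(:)) / (n + 1);
    % Gaussian kernel matrices over the copula-transformed samples
    Ku = exp(-bsxfun(@minus, u, u').^2 / (2*h^2));
    Kv = exp(-bsxfun(@minus, v, v').^2 / (2*h^2));
    % Kernel density estimates at the sample points; the kernel normalisation
    % constants cancel in the density ratio below
    px  = mean(Ku, 2);
    py  = mean(Kv, 2);
    pxy = mean(Ku .* Kv, 2);
    % Average log density ratio gives the MI estimate, in nats (divide by log(2) for bits)
    I = mean(log(pxy ./ (px .* py)));
end

Called as I = mi_kde_sketch(x, y) on two column vectors; values near zero suggest independence, larger values suggest stronger dependence.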

Cite As

Mikhail (2021). Kernel estimate for (Conditional) Mutual Information (https://www.mathworks.com/matlabcentral/fileexchange/30998-kernel-estimate-for-conditional-mutual-information), MATLAB Central File Exchange. Retrieved .

Comments and Ratings (11)

凡 张

Thanks for sharing, it's very helpful. But I want to ask: how should the kernel width be set?

Jayson

Could someone please tell me what "ind - subset of data on which to estimate MI" means? If I have a single column vector y and a single column vector x, how do I choose ind if I want to calculate the MI between the x and y data sets? Thanks

Myrtle42

Are the units of the output bits (log base 2) or nats (log base e)?

Shoubo

Thanks a lot for the great work :)

Frank

Is there a paper to reference for the use of these functions?

Thanks for the great work!

yelei

mohammed ali

Kyle

Hi Mikhail! Thank you for the submission, it's a clearly written function that works like a charm!

I was wondering if you could refer me to an article on how to set the kernel width? I see that you provide a default method, which is pretty good already, but I want to learn more. Thanks!

Mikhail

If you have a vector of states X, build a histogram of it, p = hist(X, number_of_states)/numel(X), so the counts are normalised by the number of samples rather than the number of states. Then simply use h = -p*log(p)' (drop any zero bins first to avoid log(0)).
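
Spelled out a little (my expansion of the comment above, not part of the submission):

p = hist(X, number_of_states);   % bin counts
p = p / sum(p);                  % normalise to probabilities
p = p(p > 0);                    % drop empty bins to avoid log(0)
h = -p * log(p)';                % entropy in nats; use log2(p) for bits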

Liu Guo

Thank you for sharing it. I want to know how to compute the entropy of a single vector?

MATLAB Release Compatibility
Created with R2008b
Compatible with any release
Platform Compatibility
Windows macOS Linux
