Kernel estimate for (Conditional) Mutual Information

Version 1.0.0.0 (4.63 KB) by Mikhail
Estimates Mutual Information and Conditional Mutual Information between continuous random variables
3.5K Downloads
Updated 9 Apr 2011


Mutual information I(X,Y) measures the degree of dependence (in the probability-theoretic sense) between two random variables X and Y. It is non-negative and equals zero exactly when X and Y are mutually independent. Conditional mutual information I(X,Y|Z) is the expected value, over Z, of the mutual information between X and Y given the value of Z.
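
For reference, the standard continuous-variable definitions, written here in LaTeX notation (they restate the description above and are not copied from the submission's documentation):

I(X;Y) = \iint p(x,y) \log \frac{p(x,y)}{p(x)\, p(y)} \, dx \, dy
I(X;Y \mid Z) = \mathbb{E}_Z\big[ I(X;Y \mid Z) \big] = \iiint p(x,y,z) \log \frac{p(z)\, p(x,y,z)}{p(x,z)\, p(y,z)} \, dx \, dy \, dz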

The data are first copula-transformed (each variable is mapped to a uniform marginal via its empirical CDF), and the marginal and joint densities are then estimated with Gaussian kernels; a sketch of this pipeline is given below.
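
A minimal MATLAB sketch of this copula-transform plus Gaussian-kernel approach, not the submission's actual code: ranks give the copula transform, a kernel density estimate on a grid approximates the joint and marginal densities, and a plug-in formula gives I(X,Y). The bandwidth h and grid size are illustrative choices, not the file's defaults.

% Toy data: a dependent pair (x, y).
n = 500;
x = randn(n,1);
y = x + 0.5*randn(n,1);

% Copula transform: map each variable to (0,1) via its empirical CDF
% (plain ranks; ties are ignored for simplicity in this sketch).
[~, order] = sort(x);  rx = zeros(n,1);  rx(order) = (1:n)';
[~, order] = sort(y);  ry = zeros(n,1);  ry(order) = (1:n)';
u = rx / (n + 1);
v = ry / (n + 1);

% Gaussian kernel density estimate of the joint distribution on a grid.
h = 0.1;                                % assumed bandwidth
g = linspace(0.02, 0.98, 50);
[U, V] = meshgrid(g, g);
pxy = zeros(size(U));
for i = 1:n
    pxy = pxy + exp(-((U - u(i)).^2 + (V - v(i)).^2) / (2*h^2));
end
pxy = pxy / sum(pxy(:));                % normalise to a pmf on the grid
px  = sum(pxy, 1);                      % marginal of the first variable
py  = sum(pxy, 2);                      % marginal of the second variable
pind = py * px;                         % product of marginals (outer product)

% Plug-in estimate of mutual information on the discretised grid (in nats).
mask = pxy > 0;
I = sum(pxy(mask) .* log(pxy(mask) ./ pind(mask)));
fprintf('Estimated I(X,Y) = %.3f nats\n', I);

Because mutual information is invariant under monotone transforms, estimating it on the copula-transformed data gives the same quantity as on the original variables while making the kernel bandwidth choice less sensitive to the marginals.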

Useful in the construction and verification of gene regulatory networks from gene expression data (see e.g. http://www.biomedcentral.com/1471-2105/7/S1/S7). The measure is robust, captures non-linear dependencies, and can separate direct from indirect interactions in the data, as illustrated after this paragraph.
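
A hedged illustration of the indirect-interaction point. The estimator calls are left as comments with placeholder names (mi, cmi), since the actual function names in this submission are not shown on this page; the runnable part only generates the example data.

% Two genes x and y driven by a common regulator z, with no direct x-y link.
n = 1000;
z = randn(n,1);                    % regulator
x = z + 0.3*randn(n,1);            % target 1, driven by z
y = z + 0.3*randn(n,1);            % target 2, driven by z

% With any MI/CMI estimator (for instance the one in this submission; the
% names mi/cmi below are hypothetical placeholders):
%   I_xy  = mi(x, y);      % clearly positive: x and y co-vary through z
%   I_xyz = cmi(x, y, z);  % close to zero: the x-y dependence is explained by z
% A large I(X,Y) together with a near-zero I(X,Y|Z) marks the x-y edge as indirect.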

Cite As

Mikhail (2026). Kernel estimate for (Conditional) Mutual Information (https://uk.mathworks.com/matlabcentral/fileexchange/30998-kernel-estimate-for-conditional-mutual-information), MATLAB Central File Exchange. Retrieved .

MATLAB Release Compatibility
Created with R2008b
Compatible with any release
Platform Compatibility
Windows, macOS, Linux
Version   Published    Release Notes
1.0.0.0   9 Apr 2011