Kernel estimate for (Conditional) Mutual Information
Mutual information I(X,Y) measures the statistical dependence between two random variables X and Y. It is non-negative and equals zero exactly when X and Y are independent. Conditional mutual information I(X,Y|Z) is the expected value, over Z, of the mutual information between X and Y given Z.
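For reference, the standard density-based definitions of these two quantities are

```latex
I(X;Y) = \iint p_{XY}(x,y)\,\log\frac{p_{XY}(x,y)}{p_X(x)\,p_Y(y)}\,dx\,dy,
\qquad
I(X;Y \mid Z) = \mathbb{E}_Z\!\left[\iint p_{XY\mid Z}(x,y\mid z)\,
\log\frac{p_{XY\mid Z}(x,y\mid z)}{p_{X\mid Z}(x\mid z)\,p_{Y\mid Z}(y\mid z)}\,dx\,dy\right].
```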
The data are first copula-transformed (each variable is mapped to approximately uniform marginals via its empirical ranks), and the marginal and joint densities are then estimated with Gaussian kernels.
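As a minimal sketch of this approach for the unconditional case I(X;Y) only (illustrative, not the KernelMI submission's own code; the function name kernel_mi_sketch and the fixed bandwidth h are assumptions here):

```matlab
function I = kernel_mi_sketch(x, y, h)
% KERNEL_MI_SKETCH  Illustrative resubstitution kernel estimate of I(X;Y).
% Not the KernelMI submission's own code; a minimal sketch of the idea.
% x, y : samples of the two variables; h : kernel bandwidth, e.g. 0.1.
% Requires implicit expansion (MATLAB R2016b or later).

x = x(:);  y = y(:);  n = numel(x);

% Copula transform: replace each value by its rank scaled to (0,1),
% so both marginals become approximately uniform (ties are ignored here).
[~, ix] = sort(x);  u = zeros(n,1);  u(ix) = (1:n)' / (n + 1);
[~, iy] = sort(y);  v = zeros(n,1);  v(iy) = (1:n)' / (n + 1);

% Gaussian kernel matrices between all pairs of sample points.
Ku = exp(-(u - u').^2 / (2*h^2));
Kv = exp(-(v - v').^2 / (2*h^2));

% Kernel density estimates evaluated at the sample points.
fu  = mean(Ku, 2);          % marginal density of the copula variable U
fv  = mean(Kv, 2);          % marginal density of V
fuv = mean(Ku .* Kv, 2);    % joint density via the product kernel

% The kernel normalisation constants cancel in the log-ratio, so they are omitted.
I = mean(log(fuv ./ (fu .* fv)));
end
```

For two jointly Gaussian variables with correlation rho, the true value is I(X;Y) = -0.5*log(1 - rho^2) nats, which gives a quick sanity check for the estimator and the bandwidth choice.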
Useful for constructing and verifying gene regulatory networks from gene expression data (see e.g. http://www.biomedcentral.com/1471-2105/7/S1/S7). The estimate is robust and can capture non-linear dependencies and indirect interactions in the data.
Cite As
Mikhail (2026). Kernel estimate for (Conditional) Mutual Information (https://uk.mathworks.com/matlabcentral/fileexchange/30998-kernel-estimate-for-conditional-mutual-information), MATLAB Central File Exchange. Retrieved .
MATLAB Release Compatibility
Platform Compatibility
Windows macOS Linux
Categories
- Signal Processing > Signal Processing Toolbox > Measurements and Feature Extraction > Descriptive Statistics
Tags
KernelMI/
| Version | Published | Release Notes |
|---|---|---|
| 1.0.0.0 | | |
