Mutual information I(X,Y) measures the degree of dependence (in probability-theoretic terms) between two random variables X and Y. It is non-negative and equals zero exactly when X and Y are independent. Conditional mutual information I(X,Y|Z) is the expected value of I(X,Y) given the value of Z.
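For intuition, mutual information can be computed directly when the joint distribution is known. The sketch below (Python/NumPy, not part of the MATLAB submission) evaluates I(X,Y) = sum p(x,y) log[p(x,y) / (p(x)p(y))] for a discrete joint pmf; the two example distributions are hypothetical:

```python
import numpy as np

def mutual_information(pxy):
    """I(X,Y) = sum over x,y of p(x,y) * log(p(x,y) / (p(x) p(y)))."""
    pxy = np.asarray(pxy, dtype=float)
    px = pxy.sum(axis=1, keepdims=True)   # marginal of X
    py = pxy.sum(axis=0, keepdims=True)   # marginal of Y
    mask = pxy > 0                        # convention: 0 * log 0 = 0
    return float((pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])).sum())

# Independent binary variables: I(X,Y) = 0
p_indep = np.outer([0.5, 0.5], [0.5, 0.5])

# Perfectly dependent binary variables: I(X,Y) = log 2
p_dep = np.array([[0.5, 0.0],
                  [0.0, 0.5]])
```

Running `mutual_information` on the two examples recovers the zero-for-independence property stated above.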
The data are first copula-transformed; marginal and joint probability densities are then estimated using Gaussian kernels.
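The same pipeline can be sketched in Python using SciPy's `gaussian_kde` as a stand-in for the submission's kernel estimator (the function names and the bandwidth choice here are SciPy defaults, not taken from the MATLAB code):

```python
import numpy as np
from scipy.stats import gaussian_kde, rankdata

def copula_transform(x):
    """Map each sample to its normalized rank in (0, 1)."""
    return rankdata(x) / (len(x) + 1)

def mi_kde(x, y):
    """Estimate I(X,Y) as the sample average of
    log(p(u,v) / (p(u) p(v))), with densities fitted by
    Gaussian kernels on the copula-transformed data."""
    u, v = copula_transform(x), copula_transform(y)
    p_uv = gaussian_kde(np.vstack([u, v]))   # joint density
    p_u, p_v = gaussian_kde(u), gaussian_kde(v)  # marginal densities
    return float(np.mean(np.log(p_uv(np.vstack([u, v])) / (p_u(u) * p_v(v)))))
```

Because the copula transform uses only ranks, the estimate is invariant under monotone transformations of each variable, which is one reason it can pick up non-linear dependencies.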
Useful in the construction and verification of gene regulatory networks from gene expression data (see e.g. http://www.biomedcentral.com/1471-2105/7/S1/S7). The estimate is robust and can trace non-linear dependencies and indirect interactions in the data.
Cite As
Mikhail (2026). Kernel estimate for (Conditional) Mutual Information (https://uk.mathworks.com/matlabcentral/fileexchange/30998-kernel-estimate-for-conditional-mutual-information), MATLAB Central File Exchange. Retrieved .
General Information
- Version 1.0.0.0 (4.63 KB)
MATLAB Release Compatibility
- Compatible with any release
Platform Compatibility
- Windows
- macOS
- Linux
| Version | Published | Release Notes | Action |
|---|---|---|---|
| 1.0.0.0 | | | |
