
Mutual information

In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between the two variables. More specifically, it quantifies the "amount of information" (in units such as shannons (bits), nats, or hartleys) obtained about one random variable by observing the other. The concept of mutual information is intimately linked to that of the entropy of a random variable, a fundamental notion in information theory that quantifies the expected "amount of information" held in a random variable.
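For discrete variables, the quantity described above is I(X;Y) = Σ p(x,y) log[ p(x,y) / (p(x) p(y)) ], the expected log-ratio of the joint distribution to the product of its marginals. The following Python sketch (an illustration added here, not part of the source page; the function name and coin example are hypothetical) computes that sum for a small joint probability table.

import numpy as np

def mutual_information(joint, base=2.0):
    # Mutual information I(X;Y) of a discrete joint distribution.
    # joint: 2-D array of probabilities p(x, y) summing to 1 (rows = X, columns = Y).
    # base:  2.0 gives bits (shannons); use np.e for nats.
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)   # marginal p(x), shape (|X|, 1)
    py = joint.sum(axis=0, keepdims=True)   # marginal p(y), shape (1, |Y|)
    nz = joint > 0                          # skip zero cells (0 log 0 = 0)
    ratio = joint[nz] / (px @ py)[nz]       # p(x, y) / (p(x) p(y))
    return float(np.sum(joint[nz] * np.log(ratio)) / np.log(base))

# Hypothetical example: two perfectly correlated fair coins.
joint = np.array([[0.5, 0.0],
                  [0.0, 0.5]])
print(mutual_information(joint))  # 1.0

Observing one coin removes all uncertainty about the other, so the mutual information equals the one-bit entropy of either coin, reflecting the link to entropy noted above.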

Metrics Summary

                      Lifetime     Prior Five Years
Total Publications    5,140        1,403
Total Citations       103,286      9,070
Total Scholars        8,972        7,624

Charts: Publications and Citation History; Publications based on Disciplines; Scholars based on Disciplines; Publications based on Fields; Scholars based on Fields

Highly Ranked Scholars™ (Lifetime; Prior Five Years)

Highly Cited Publications (Lifetime)