Basic Notions
- Entropy
- Differential entropy
- Graph entropy
- Conditional entropy
- Mutual information
Mutual Information
Definitions
Let \(X\) and \(Y\) be discrete random variables defined on finite alphabets \(\mathcal{X}\) and \(\mathcal{Y}\), respectively, with joint probability mass function \(p_{X,Y}\). The mutual information of \(X\) and \(Y\) is the random variable \(I(X,Y)\) defined by
\[ I(X,Y) = \log\frac{p_{X,Y}(X,Y)}{p_X(X)p_Y(Y)}.\]
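Since \(I(X,Y)\) is defined here as a random variable, taking its expectation under \(p_{X,Y}\) recovers the familiar average mutual information. A minimal numerical sketch, assuming a small illustrative joint pmf (the \(2\times 2\) table below is an assumption for demonstration, not from the text):

```python
import numpy as np

# Hypothetical joint pmf p_{X,Y} on 2x2 alphabets (illustrative values).
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

p_x = p_xy.sum(axis=1, keepdims=True)   # marginal p_X, shape (2, 1)
p_y = p_xy.sum(axis=0, keepdims=True)   # marginal p_Y, shape (1, 2)

# Pointwise values of the random variable
#   I(X,Y) = log( p_{X,Y}(x,y) / (p_X(x) p_Y(y)) ),
# here in bits (log base 2), evaluated at every (x, y) pair.
info_density = np.log2(p_xy / (p_x * p_y))

# Expectation under p_{X,Y} gives the average mutual information.
mi_bits = float((p_xy * info_density).sum())
print(mi_bits)
```

For this pmf the expectation comes out positive, as it must: the average mutual information is nonnegative, even though the random variable \(I(X,Y)\) itself can take negative values at individual \((x,y)\) pairs.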