
Abbas El Gamal
Hitachi America Chair in the School of Engineering and Fortinet Founders Chair, Department of Electrical Engineering
Stanford University

Mutual Information

Definitions

Let \(X\) and \(Y\) be discrete random variables defined on finite alphabets \(\mathcal{X}\) and \(\mathcal{Y}\), respectively, with joint probability mass function \(p_{X,Y}\). The mutual information of \(X\) and \(Y\) is the random variable \(I(X,Y)\) defined by

\[ I(X,Y) = \log\frac{p_{X,Y}(X,Y)}{p_X(X)p_Y(Y)}.\]

As with entropy, the base of the logarithm defines the unit of mutual information. If the logarithm is to the base 2, the unit is the bit; if the logarithm is to the base \(e\), the unit is the nat.
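
As a quick illustration of the definition, if \(X\) and \(Y\) are independent, then \(p_{X,Y}(x,y)=p_X(x)\,p_Y(y)\) for every pair \((x,y)\), so

\[ I(X,Y) = \log\frac{p_X(X)\,p_Y(Y)}{p_X(X)\,p_Y(Y)} = \log 1 = 0 \]

with probability one; under this definition, independence makes the mutual information identically zero.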

Entropy

Definitions

Let \(X\) be a discrete random variable defined on a finite alphabet \(\mathcal{X}\) and with probability mass function \(p_X\). The entropy of \(X\) is the random variable \(H(X)\) defined by

\[ H(X)=\log\frac{1}{p_X(X)}.\] 

The base of the logarithm defines the unit of entropy. If the logarithm is to the base 2, the unit of entropy is the bit. If the logarithm is to the base \(e\), the unit of entropy is the nat.
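
For example, if \(X\) is uniform over \(\mathcal{X}\), then \(p_X(x)=1/|\mathcal{X}|\) for every \(x\in\mathcal{X}\), and

\[ H(X) = \log\frac{1}{1/|\mathcal{X}|} = \log|\mathcal{X}| \]

with probability one. For a fair coin flip, \(|\mathcal{X}|=2\), and with the logarithm to the base 2 this is 1 bit.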