Basic Notions
- Entropy
- Differential entropy
- Graph entropy
- Conditional entropy
- Mutual information
Mutual Information
Definitions
Let \(X\) and \(Y\) be discrete random variables defined on finite alphabets \(\mathcal{X}\) and \(\mathcal{Y}\), respectively, with joint probability mass function \(p_{X,Y}\). The mutual information of \(X\) and \(Y\) is the random variable \(I(X,Y)\) defined by
\[ I(X,Y) = \log\frac{p_{X,Y}(X,Y)}{p_X(X)p_Y(Y)}.\]
As with entropy, the base of the logarithm defines the units of mutual information. If the logarithm is to the base \(e\), the unit is the nat.
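As a quick illustration, here is a minimal Python sketch using NumPy. The small binary joint pmf is a hypothetical example chosen only for this sketch; the code evaluates the random variable \(I(X,Y)\) at each outcome and averages it over \(p_{X,Y}\) to obtain the mutual information in nats.

```python
import numpy as np

# Hypothetical joint pmf p_{X,Y} on alphabets X = {0, 1}, Y = {0, 1},
# chosen only for illustration.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

p_x = p_xy.sum(axis=1)   # marginal pmf of X
p_y = p_xy.sum(axis=0)   # marginal pmf of Y

# Value of the random variable I(X,Y) = log( p_{X,Y}(x,y) / (p_X(x) p_Y(y)) )
# at each outcome (x, y), using the natural logarithm (units: nats).
i_xy = np.log(p_xy / np.outer(p_x, p_y))

# Averaging I(X,Y) over the joint pmf gives the expected mutual information.
mi_nats = np.sum(p_xy * i_xy)
print(i_xy)
print(mi_nats)
```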
Entropy
Definitions
Let \(X\) be a discrete random variable defined on a finite alphabet \(\mathcal{X}\) with probability mass function \(p_X\). The entropy of \(X\) is the random variable \(H(X)\) defined by
\[ H(X)=\log\frac{1}{p_X(X)}.\]
The base of the logarithm defines the unit of entropy. If the logarithm is to the base 2, the unit of entropy is the bit. If the logarithm is to the base \(e\), the unit of entropy is the nat.
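For a concrete illustration, the following minimal Python sketch uses NumPy with a hypothetical four-letter pmf chosen only for this example; it evaluates the random variable \(H(X)\) at each outcome and averages it over \(p_X\) to obtain the entropy in bits.

```python
import numpy as np

# Hypothetical pmf p_X on a four-letter alphabet, chosen only for illustration.
p_x = np.array([0.5, 0.25, 0.125, 0.125])

# Value of the random variable H(X) = log(1 / p_X(x)) at each outcome x,
# using base-2 logarithms (units: bits).
h_x = np.log2(1.0 / p_x)

# Averaging H(X) over p_X gives the expected entropy: here 1.75 bits.
entropy_bits = np.sum(p_x * h_x)
print(h_x)            # [1. 2. 3. 3.]
print(entropy_bits)   # 1.75
```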
Information Theory Knowledge Database
- Basic notions
- Source coding
- Channel coding
Online Committee Report, ISIT 2011
Summary
The website has been running smoothly and consistently. The number of visits has stabilized, and access to the news posted on the website has increased. Detailed figures are provided in the Analytics section.