Basic Notions
- Entropy
- Differential entropy
- Graph entropy
- Conditional entropy
- Mutual information
Mutual Information
Definitions
Let \(X\) and \(Y\) be discrete random variables defined on finite alphabets \(\mathcal{X}\) and \(\mathcal{Y}\), respectively, with joint probability mass function \(p_{X,Y}\). The mutual information of \(X\) and \(Y\) is the random variable \(I(X,Y)\) defined by
\[ I(X,Y) = \log\frac{p_{X,Y}(X,Y)}{p_X(X)p_Y(Y)}.\]
As with entropy, the base of the logarithm defines the unit of mutual information. If the logarithm is to the base 2, the unit is the bit; if the logarithm is to the base \(e\), the unit is the nat.
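As a concrete illustration, here is a minimal Python sketch that computes the expected value of \(I(X,Y)\) in bits directly from the definition above; the joint pmf used is a hypothetical example, not taken from this entry.

```python
import math

# Hypothetical joint pmf p_{X,Y} on {0,1} x {0,1} (example values only).
p_xy = {
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.4,
}

# Marginal pmfs p_X and p_Y, obtained by summing the joint pmf.
p_x, p_y = {}, {}
for (x, y), p in p_xy.items():
    p_x[x] = p_x.get(x, 0.0) + p
    p_y[y] = p_y.get(y, 0.0) + p

# E[I(X,Y)] = sum over (x,y) of p(x,y) * log2( p(x,y) / (p(x) p(y)) ),
# measured in bits since the logarithm is to the base 2.
mi_bits = sum(
    p * math.log2(p / (p_x[x] * p_y[y]))
    for (x, y), p in p_xy.items()
    if p > 0
)
print(mi_bits)  # ~0.278 bits for this joint pmf
```

Note that the sketch computes the expectation of the random variable \(I(X,Y)\); replacing `math.log2` with `math.log` would give the same quantity in nats.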
Entropy
Definitions
Let \(X\) be a discrete random variable defined on a finite alphabet \(\mathcal{X}\) with probability mass function \(p_X\). The entropy of \(X\) is the random variable \(H(X)\) defined by
\[ H(X)=\log\frac{1}{p_X(X)}.\]
The base of the logarithm defines the unit of entropy. If the logarithm is to the base 2, the unit of entropy is the bit. If the logarithm is to the base \(e\), the unit of entropy is the nat.
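Similarly, a minimal Python sketch computing the expected value of \(H(X)\) in both bits and nats; the pmf below is a hypothetical example, not taken from this entry.

```python
import math

# Hypothetical pmf p_X on a three-letter alphabet (example values only).
p = {'a': 0.5, 'b': 0.25, 'c': 0.25}

# E[H(X)] = sum over x of p(x) * log(1/p(x)).
# Base-2 logarithm gives bits; the natural logarithm gives nats.
h_bits = sum(px * math.log2(1.0 / px) for px in p.values() if px > 0)
h_nats = sum(px * math.log(1.0 / px) for px in p.values() if px > 0)

print(h_bits)  # 1.5 bits
print(h_nats)  # ~1.0397 nats (= 1.5 * ln 2)
```

The two results differ only by the factor \(\ln 2\), reflecting the change of logarithm base.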
Information Theory Knowledge Database
- Basic notions
- Source coding
- Channel coding
Online Committee Report, ISIT 2011
Summary
The website has been running smoothly and consistently. We are seeing a stabilization of the number of visits and increased access to the news posted on the website. Detailed figures are provided in the Analytics section.
Online Committee Report, ITW 2011
Summary
The website has been running smoothly and consistently. We are currently reviewing a set of upgrades that will be pushed to the main website in the next two weeks. As usual, some figures regarding website usage are provided in the Analytics section.
The three main topics of this report are:
Online Committee Report, ITA 2012
Summary
The website has been running smoothly and consistently. A new set of features was released to the website two weeks ago, and we are working with SixFeetUp to develop an alternative solution to mailing lists. As usual, some figures regarding website usage are provided in the Analytics section.
The two main topics of this report are: