- Concept of symbol, compound symbol and word.
- Joint probability and the concept of joint entropy.
- Conditional entropy, or equivocation, measures the uncertainty remaining in a set of symbols (Y) once another set (X) is known:
0 ≤ H(Y|X) ≤ H(Y)
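As a numerical sketch (the joint distribution below is invented for illustration), the conditional entropy can be computed from a joint pmf and checked against these bounds:

```python
import math

def H(probs):
    """Shannon entropy in bits; the filter applies the convention 0·log 0 = 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution p(x, y) over two binary alphabets.
p_xy = [[0.3, 0.2],
        [0.1, 0.4]]

p_x = [sum(row) for row in p_xy]                        # marginal p(x)
p_y = [sum(row[y] for row in p_xy) for y in range(2)]   # marginal p(y)

# H(Y|X) = sum over x of p(x) · H(Y | X = x): average of per-row conditional entropies.
H_Y_given_X = sum(p_x[x] * H([p_xy[x][y] / p_x[x] for y in range(2)])
                  for x in range(2))

print(H_Y_given_X, H(p_y))
```

For this pmf, conditioning on X lowers the uncertainty in Y from about 0.97 bits to about 0.85 bits, consistent with 0 ≤ H(Y|X) ≤ H(Y).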
- Chain rule for entropy:
H(X,Y) = H(X) + H(Y|X)
- Heart of entropy algebra (both expansions of the joint entropy H(X,Y) must agree):
H(X) + H(Y|X) = H(Y) + H(X|Y)
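Both orderings of the chain rule can be checked numerically; the joint pmf here is again a made-up example:

```python
import math

def H(probs):
    """Shannon entropy in bits (terms with p = 0 contribute nothing)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint pmf over pairs (x, y).
p_xy = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}

p_x = {x: sum(p for (a, _), p in p_xy.items() if a == x) for x in (0, 1)}
p_y = {y: sum(p for (_, b), p in p_xy.items() if b == y) for y in (0, 1)}

# Conditional entropies computed directly from the conditional distributions.
H_Y_given_X = sum(p_x[x] * H([p_xy[x, y] / p_x[x] for y in (0, 1)]) for x in (0, 1))
H_X_given_Y = sum(p_y[y] * H([p_xy[x, y] / p_y[y] for x in (0, 1)]) for y in (0, 1))

H_joint = H(p_xy.values())
# H(X,Y) = H(X) + H(Y|X) = H(Y) + H(X|Y)
print(H_joint, H(p_x.values()) + H_Y_given_X, H(p_y.values()) + H_X_given_Y)
```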
- Mutual information is the reduction in uncertainty about one set of symbols given knowledge of another set of symbols.
- Mutual information is reciprocal: I(X;Y) = I(Y;X) and its possible range is:
0 ≤ I(X;Y) ≤ H(X)
- With minimum remaining uncertainty, H(X|Y) = 0, mutual information is maximal: I(X;Y) = H(X);
with maximum remaining uncertainty, H(X|Y) = H(X), mutual information is minimal: I(X;Y) = 0.
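A hypothetical sketch of the two extremes: independent X and Y give I(X;Y) = 0, while Y being an exact copy of X gives I(X;Y) = H(X).

```python
import math

def H(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_information(p_xy):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), for a dict pmf over (x, y) pairs."""
    p_x, p_y = {}, {}
    for (x, y), p in p_xy.items():
        p_x[x] = p_x.get(x, 0) + p
        p_y[y] = p_y.get(y, 0) + p
    return H(p_x.values()) + H(p_y.values()) - H(p_xy.values())

independent = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
noiseless_copy = {(0, 0): 0.5, (1, 1): 0.5}   # Y = X with certainty

print(mutual_information(independent))     # minimum: H(X|Y) = H(X), so I = 0
print(mutual_information(noiseless_copy))  # maximum: H(X|Y) = 0, so I = H(X) = 1 bit
```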
- In an ideal communication system:
I(X;Y) = H(X) − H(X|Y) = H(X)
i.e., mutual information is maximal because H(X|Y) = 0.
- In a non-ideal communication system:
I(X;Y) = H(X) − H(X|Y) ≤ H(X)
i.e., information is lost in transmission whenever H(X|Y) > 0.
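As a sketch of this loss, consider a uniform binary source sent through a binary symmetric channel with crossover probability p (a standard textbook channel model, assumed here for illustration): I(X;Y) equals H(X) = 1 bit only when p = 0.

```python
import math

def H(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bsc_mutual_information(p_err):
    """I(X;Y) for a uniform binary source through a binary symmetric channel."""
    p_x = [0.5, 0.5]
    # Joint pmf p(x, y) = p(x) · p(y|x); the channel flips the bit with prob p_err.
    p_xy = [p_x[0] * (1 - p_err), p_x[0] * p_err,
            p_x[1] * p_err, p_x[1] * (1 - p_err)]
    p_y = [p_xy[0] + p_xy[2], p_xy[1] + p_xy[3]]
    return H(p_x) + H(p_y) - H(p_xy)

print(bsc_mutual_information(0.0))   # noiseless: I = H(X) = 1 bit
print(bsc_mutual_information(0.1))   # noisy: I < 1 bit, information lost
```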