Let us consider the case of two independent information sources, X and N (noise), such that:
- Source X emits symbols x_{i} ∈ {−1, +1} with symbol probability p(X) = {0.5, 0.5} and entropy H(X) = 1.
- Noise source N emits symbols η_{j} ∈ {−1, 0, +1} with symbol probability p(N) = {0.1, 0.8, 0.1} and entropy H(N) = 0.9219281.
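The two entropy values above can be checked directly from the Shannon entropy formula H = −Σ p log₂ p; a minimal sketch (the function name `entropy` is ours, not from the source):

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Source X: symbols {-1, +1}, equiprobable
print(entropy([0.5, 0.5]))        # 1.0
# Noise source N: symbols {-1, 0, +1}
print(entropy([0.1, 0.8, 0.1]))   # ~0.9219281
```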
Since sources X and N are independent of each other,
  p(x_{i}, η_{j}) = p(x_{i}) p(η_{j})
Therefore,
  H(X, N) = −Σ_{i} Σ_{j} p(x_{i}) p(η_{j}) log₂ [p(x_{i}) p(η_{j})]
And,
  H(X, N) = −Σ_{i} p(x_{i}) log₂ p(x_{i}) − Σ_{j} p(η_{j}) log₂ p(η_{j}) = H(X) + H(N)
Hence,
  H(X, N) = 1 + 0.9219281 = 1.9219281 bits
Also, the combined symbols C_{X, N}, i.e. the ordered pairs ⟨x_{i}, η_{j}⟩, have probability p(x_{i}, η_{j}) = p(x_{i}) p(η_{j}).
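The additivity H(X, N) = H(X) + H(N) for independent sources can be verified numerically by forming the joint distribution of the ordered pairs ⟨x_{i}, η_{j}⟩; a short sketch (variable names are ours):

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

p_x = {-1: 0.5, +1: 0.5}
p_n = {-1: 0.1, 0: 0.8, +1: 0.1}

# Joint probabilities of ordered pairs <x_i, eta_j> under independence
p_joint = {(x, n): p_x[x] * p_n[n] for x in p_x for n in p_n}

h_joint = entropy(p_joint.values())
h_sum = entropy(p_x.values()) + entropy(p_n.values())
print(h_joint, h_sum)  # both ~1.9219281
```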
Therefore, the information after receiving a combined symbol Y = X + N (signal plus noise) is given by,
  p(Y) = {0.05, 0.4, 0.1, 0.4, 0.05} over symbols y ∈ {−2, −1, 0, +1, +2}
  H(Y) = 1.8219281 bits
- H(Y) < H(C) = H(X, N)
- There is less information in Y than in the Cartesian product X × N, but more information in Y than in either X alone or N alone.
- Noise information has been added to the signal information.
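The inequality chain H(X) < H(Y) < H(X, N) can be checked by summing probabilities of the pairs that collide under addition (assuming, consistent with the bullets above, that the received symbol is Y = X + N; variable names are ours):

```python
import math
from collections import defaultdict

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

p_x = {-1: 0.5, +1: 0.5}
p_n = {-1: 0.1, 0: 0.8, +1: 0.1}

# Received symbol Y = X + N: distinct pairs can collide,
# e.g. (+1, -1) and (-1, +1) both map to y = 0
p_y = defaultdict(float)
for x, px in p_x.items():
    for n, pn in p_n.items():
        p_y[x + n] += px * pn

h_y = entropy(p_y.values())
print(sorted(p_y.items()))  # [(-2, 0.05), (-1, 0.4), (0, 0.1), (1, 0.4), (2, 0.05)]
print(h_y)                  # ~1.8219281, between H(X) = 1 and H(X, N) ~1.9219281
```

The collision at y = 0 is exactly where information is lost relative to the full pair ⟨x_{i}, η_{j}⟩, which is why H(Y) < H(X, N).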