Mechanics of Information Loss

Let us consider the case of two independent information sources, X and N (noise), such that:

  1. Source X emits symbols xi ∈ {−1, +1} with symbol probabilities p(X) = {0.5, 0.5} and entropy H(X) = 1.
  2. Noise source N emits symbols ηj ∈ {−1, 0, +1} with symbol probabilities p(N) = {0.1, 0.8, 0.1} and entropy H(N) = 0.921928.
The emitted symbols are added together, producing the summation symbol Y = X + N.
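These entropy values can be checked numerically. The following is a minimal Python sketch; the helper `entropy_bits` and the dictionaries `p_X` and `p_N` are names chosen here for illustration, not taken from the original text:

```python
from math import log2

def entropy_bits(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

p_X = {-1: 0.5, +1: 0.5}          # signal source X
p_N = {-1: 0.1, 0: 0.8, +1: 0.1}  # noise source N

print(entropy_bits(p_X.values()))  # -> 1.0        = H(X)
print(entropy_bits(p_N.values()))  # -> ~0.921928  = H(N)
```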
Information with Noise

Since sources X and N are independent of each other,

p(xi, ηj) = p(xi) p(ηj) ≠ 0 for every pair of symbols.

Therefore, the received symbols y = x + η and their probabilities are:

  y        −2      −1       0      +1      +2
  p(y)    0.05    0.40    0.10    0.40    0.05

And,

H(Y) = 1.821928
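As a check on this value, the sketch above can be extended (reusing `entropy_bits`, `p_X`, and `p_N`) to enumerate every pair of source symbols and accumulate the distribution of Y:

```python
from collections import defaultdict

# Distribution of the summation symbol Y = X + N.
p_Y = defaultdict(float)
for x, px in p_X.items():
    for eta, pe in p_N.items():
        p_Y[x + eta] += px * pe    # independence: p(x, eta) = p(x) * p(eta)

print(dict(p_Y))                   # {-2: 0.05, -1: 0.4, 0: 0.1, 1: 0.4, 2: 0.05}
print(entropy_bits(p_Y.values()))  # -> ~1.821928 = H(Y)
```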

Hence,

(Figure: Labelled Information with Noise)

Also, the combined source C, whose symbols are the ordered pairs ⟨xi, ηj⟩, has probabilities p(xi, ηj) = p(xi) p(ηj):

            η = −1    η = 0    η = +1
  x = −1     0.05      0.40     0.05
  x = +1     0.05      0.40     0.05

Therefore, the information obtained on receiving a combined symbol is given by:

H(C) = H(X) + H(N|X)
Since X and N are statistically independent, H(N|X) = H(N). Thus
H(C) = H(X) + H(N) = 1 + 0.921928 = 1.921928
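The same sketch (still reusing `entropy_bits`, `p_X`, `p_N`, and `p_Y` from above) can confirm H(C) and show how much information the summation discards:

```python
# Joint distribution of the combined (labelled) symbols <x, eta>.
p_C = {(x, eta): px * pe for x, px in p_X.items() for eta, pe in p_N.items()}

H_C = entropy_bits(p_C.values())
H_Y = entropy_bits(p_Y.values())
print(H_C)        # -> ~1.921928 = H(X) + H(N)
print(H_Y)        # -> ~1.821928
print(H_C - H_Y)  # -> ~0.1 bit of information lost in the summation
```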

This means:

  1. H(Y) < H(C) = H(X, N); the summation loses H(C) − H(Y) = 1.921928 − 1.821928 = 0.1 bit.
  2. There is less information in Y than in the Cartesian-product (joint) source XN, but more information in Y than in either X alone or N alone.
  3. Information from the noise source has been added to the signal information.
