Determining information gained or lost

Let us illustrate how the information gained or lost is determined with the following example:

[Figure: completed directed graph of the information channel]

1) Compute entropy H(X)

We know that information (entropy) is given by

H(X) = \sum_{x_i \in X} p(x_i) \log_2 \frac{1}{p(x_i)}
Thus,
H(X) = 0.970950
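The entropy computation above can be sketched in Python. The source priors p(x_0) = 0.6 and p(x_1) = 0.4 are an assumption, chosen because they reproduce the stated H(X); the original diagram defining them is not reproduced here.

```python
import math

def entropy(p):
    """Shannon entropy H(X) = sum_i p(x_i) * log2(1 / p(x_i)), in bits."""
    return sum(p_i * math.log2(1 / p_i) for p_i in p if p_i > 0)

# Assumed source priors: p(x_0) = 0.6, p(x_1) = 0.4
# (consistent with the stated H(X) = 0.970950, which is truncated).
p_x = [0.6, 0.4]
print(round(entropy(p_x), 6))  # → 0.970951
```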

2) Compute mutual information I(X; Y)

We know that the partial mutual information I(x_i; Y) is given by

I(x_i; Y) = \sum_{y_j \in Y} p(y_j \mid x_i)\, p(x_i) \log_2 \frac{p(y_j \mid x_i)}{p(y_j)}
Hence,
I(X; Y) = \sum_{x_i \in X} I(x_i; Y)
Therefore,
I(x_0; Y) = 0.444843
I(x_1; Y) = 0.368574
Hence,
I(X; Y) = I(x_0; Y) + I(x_1; Y)
        = 0.813417
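A minimal sketch of the partial mutual information formula follows. Since the channel diagram is not reproduced in this text, the transition matrix below is purely illustrative and does not reproduce the document's values of I(x_i; Y); only the structure of the computation is shown.

```python
import math

def partial_mutual_information(i, p_x, p_y_given_x):
    """I(x_i; Y) = sum over y_j of p(y_j|x_i) * p(x_i) * log2(p(y_j|x_i) / p(y_j))."""
    n_y = len(p_y_given_x[i])
    # p(y_j) by total probability: sum_k p(x_k) * p(y_j | x_k)
    p_y = [sum(p_x[k] * p_y_given_x[k][j] for k in range(len(p_x)))
           for j in range(n_y)]
    return sum(p_y_given_x[i][j] * p_x[i] * math.log2(p_y_given_x[i][j] / p_y[j])
               for j in range(n_y) if p_y_given_x[i][j] > 0)

# Illustrative channel only -- NOT the document's channel.
p_x = [0.6, 0.4]                 # assumed source priors
p_y_given_x = [[0.8, 0.2],       # p(y_j | x_0)
               [0.3, 0.7]]       # p(y_j | x_1)

I_XY = sum(partial_mutual_information(i, p_x, p_y_given_x)
           for i in range(len(p_x)))
print(round(I_XY, 6))  # → 0.185277
```

Summing the partial terms over all x_i gives I(X; Y), exactly as in the derivation above.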

3) Compute information loss

Since I(X; Y) = H(X) − H(X|Y), the information lost about the original symbol X after observing the output Y is

H(X|Y) = H(X) − I(X; Y)
       = 0.970950 − 0.813417
       = 0.157533
Thus H(X|Y) ≠ 0, so some information about X is lost through the information channel.

Next:

Hard and Soft Decisions