Catastrophic Sequence System and State Dependent Decoding

Figure 1: DMS X sends symbols x(n−1) and x(n) to form y(n)

Let us say that for an initial state S0 of the channel, the DMS X emits the sequence
$$\bar{x} = \{+1, -1, +1, -1, \ldots\}$$
Then the output sequence is
$$\bar{y} = \{0, 0, 0, 0, \ldots\}$$
However, for an initial state S1 of the channel with the DMS X emitting the sequence
$$\bar{x} = \{-1, +1, -1, +1, \ldots\}$$
the output sequence is again
$$\bar{y} = \{0, 0, 0, 0, \ldots\}$$
Thus for two different initial states emitting two different input sequences, the output sequences are the same.
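The two cases above can be checked with a short simulation. This is a sketch that assumes the duo-binary relation y(n) = x(n−1) + x(n), with the initial state represented by the symbol x(−1) held in the channel's delay element; the mapping of S0/S1 onto x(−1) = −1/+1 is our labelling assumption.

```python
# Duo-binary channel sketch: y(n) = x(n-1) + x(n).
# Assumption: the state is the symbol in the delay element, x(-1),
# with S0 <-> x(-1) = -1 and S1 <-> x(-1) = +1.

def duo_binary(x, x_prev):
    """Pass sequence x through the channel starting from state x_prev."""
    y = []
    for xn in x:
        y.append(x_prev + xn)   # y(n) = x(n-1) + x(n)
        x_prev = xn             # delay element now holds x(n)
    return y

x_a = [+1, -1, +1, -1]          # emitted from initial state S0
x_b = [-1, +1, -1, +1]          # emitted from initial state S1

y_a = duo_binary(x_a, -1)
y_b = duo_binary(x_b, +1)

print(y_a)  # [0, 0, 0, 0]
print(y_b)  # [0, 0, 0, 0] -- same output for a different input
```

Two distinct input sequences collapse onto one output sequence, which is exactly the ambiguity described above.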

The generation of the same output sequence for two or more different input sequences is called a catastrophic sequence system. The input sequence is thus ambiguous: from the output alone we cannot tell which input produced it.

In the above example, if we knew a priori the initial state S(n−1) (i.e., either S0 or S1), we could decode ȳ back into the input sequence x̄. This is called state dependent decoding.
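A sketch of such a decoder: once x(−1) is known, the relation y(n) = x(n−1) + x(n) can be inverted symbol by symbol.

```python
# State dependent decoding sketch for the duo-binary channel,
# assuming y(n) = x(n-1) + x(n) and a known initial state x(-1).

def decode(y, x_prev):
    """Recover x from y given the initial state x_prev."""
    x = []
    for yn in y:
        xn = yn - x_prev        # invert y(n) = x(n-1) + x(n)
        x.append(xn)
        x_prev = xn
    return x

y = [0, 0, 0, 0]
print(decode(y, -1))  # [1, -1, 1, -1]  -- if the initial state was S0
print(decode(y, +1))  # [-1, 1, -1, 1]  -- if the initial state was S1
```

The same ȳ decodes to two different sequences depending on the assumed initial state, which is why the a priori state knowledge is essential.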

Since the system has the property of state dependent decoding, the question arises

how much of the information in y̅ is from X?

if not all of the information in ȳ is from X, where did the extra information come from?

Therefore

how do we obtain this 'something', which is measured by I(X; Y)?

Example

Figure 2: DMS plus duo-binary channel considered as an information source Y
Let us consider the same DMS and duo-binary channel system, but with the parameters shown above.
The state diagram is therefore
Figure 4: State diagram of the system in Figure 2
Thus
$$\begin{bmatrix} \mu_0^{(n+1)} \\ \mu_1^{(n+1)} \end{bmatrix} = \begin{bmatrix} 0.2 & 0.2 \\ 0.8 & 0.8 \end{bmatrix} \begin{bmatrix} \mu_0^{(n)} \\ \mu_1^{(n)} \end{bmatrix}$$
and in steady-state
$$\begin{bmatrix} \mu_0 \\ \mu_1 \end{bmatrix} = \begin{bmatrix} 0.2(\mu_0 + \mu_1) \\ 0.8(\mu_0 + \mu_1) \end{bmatrix} = \begin{bmatrix} 0.2 \\ 0.8 \end{bmatrix}$$
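The steady state can be confirmed numerically by iterating the state-transition recursion; as a sketch, any starting distribution converges to the same result.

```python
# Iterate mu^(n+1) = T mu^(n) with T = [[0.2, 0.2], [0.8, 0.8]]
# from an arbitrary starting distribution.
T = [[0.2, 0.2],
     [0.8, 0.8]]

mu = [0.5, 0.5]                 # arbitrary initial distribution
for _ in range(50):
    mu = [T[0][0] * mu[0] + T[0][1] * mu[1],
          T[1][0] * mu[0] + T[1][1] * mu[1]]

print(mu)  # converges to [0.2, 0.8]
```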

From the state diagram we also see that

$$p(y_0 \mid S_0) = 0.2 = f_{00}, \quad p(y_1 \mid S_0) = 0.8 = f_{01}, \quad p(y_2 \mid S_0) = 0 = f_{02}$$
$$p(y_0 \mid S_1) = 0 = f_{10}, \quad p(y_1 \mid S_1) = 0.2 = f_{11}, \quad p(y_2 \mid S_1) = 0.8 = f_{12}$$
Thus
$$F = \begin{bmatrix} 0.2 & 0 \\ 0.8 & 0.2 \\ 0 & 0.8 \end{bmatrix}$$
and the entropy rate is
$$H'(Y) = R = \sum_i \mu_i \sum_j f_{ij} \log_2 \frac{1}{f_{ij}} = 0.721928 \text{ bits/symbol}$$
which is quantitatively identical to H(X). Refer to the figure above to confirm that H'(Y) ≡ H(X).
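This computation can be reproduced numerically. The sketch below assumes the source parameters p(+1) = 0.2, p(−1) = 0.8, as implied by the transition matrix.

```python
from math import log2

mu = [0.2, 0.8]                  # steady-state state probabilities
F = [[0.2, 0.0],                 # row j, column i holds f_ij = p(y_j | S_i)
     [0.8, 0.2],
     [0.0, 0.8]]

# H'(Y) = sum_i mu_i * sum_j f_ij * log2(1/f_ij), skipping f_ij = 0 terms
H_rate = sum(mu[i] * sum(row[i] * log2(1 / row[i]) for row in F if row[i] > 0)
             for i in range(2))

# H(X) for the DMS, assuming p(+1) = 0.2 and p(-1) = 0.8
H_X = -(0.2 * log2(0.2) + 0.8 * log2(0.8))

print(round(H_rate, 6), round(H_X, 6))  # 0.721928 0.721928
```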

What does H'(Y) ≡ H(X) imply?


This means that H'(Y) tracks H(X). The channel is therefore not really losing any information from X. (This answers the first question: all of the information in ȳ is from X.)

Being a catastrophic sequence system, the input sequence is ambiguous. Our example system, however, also has the property of state dependent decoding: with a priori knowledge of the initial state S(n−1), we can decode ȳ into x̄, making the identification of x̄ unambiguous. The question is therefore,

how to deal with initial state dependency and decoding?

Next:

set–distinguishability decoding (p:5) ➽