Gallager's diagram and Entropy rate

Gallager's diagram

An alternative to the Markov process diagram is the Gallager diagram, which represents the source and channel using state-dependent memoryless channel diagrams.

For our example above, the diagram below-left shows the Gallager diagram when the state at time index n−1 is S0, while the right-hand side shows the diagram when the state at n−1 is S1.

Figure 1 Gallager diagram of the channel-with-memory example

Note that the output y(n) and the state S(n) update simultaneously. That is,

y(n) ≠ f(S(n))

but

y(n) = f(S(n−1))

From the Gallager diagram we have that

the joint conditional probability of y and S at time n, given x at time n and S at time n−1, is equal to the product of the conditional probability of y at time n given x at time n and S at time n−1, and the conditional probability of S at time n given x at time n and S at time n−1.

In other words,

p(y(n), S(n) | x(n), S(n−1)) = p(y(n) | x(n), S(n−1)) · q(S(n) | x(n), S(n−1))

Since

y(n) ≠ f(S(n))

but

y(n) = f(S(n−1)),

we can therefore say that y(n) and S(n) are conditionally independent given x(n) and S(n−1).

Entropy rate

Let the index i = 0 or 1 be fixed. Then, for a fixed S(n−1), we can compute the conditional entropy H(Y | S(n−1) = Si). That is,

H(Y | S(n−1) = Si) = Σy p(y | Si) · log2(1 / p(y | Si))
Recall that the elements of the F matrix are fij = p(yj | Si). Thus
H(Y | S(n−1) = Si) = Σj fij · log2(1 / fij)
Also recall that for a fixed Si, the elements fij run down the respective column of F. That is
F = [column for S sub 0, column for S sub 1, ..., column for S sub i, ..., column for S sub k]
In our example system
F = [ 0.5  0
      0.5  0.5
      0    0.5 ]

Hence, for fixed state Si = S0,

H(Y | S(n−1) = S0) = 0.5·log2(1/0.5) + 0.5·log2(1/0.5) + 0·log2(1/0) = 1 bit

and for fixed state Si = S1,

H(Y | S(n−1) = S1) = 0·log2(1/0) + 0.5·log2(1/0.5) + 0.5·log2(1/0.5) = 1 bit

where we use the convention 0·log2(1/0) = 0.
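As a quick numeric check, the two conditional entropies can be computed directly from the columns of F. The following is a minimal Python sketch; the function name conditional_entropy is illustrative, not from the text.

```python
from math import log2

# Columns of F hold p(y_j | S_i): 3 output symbols, 2 states.
F = [[0.5, 0.0],
     [0.5, 0.5],
     [0.0, 0.5]]

def conditional_entropy(F, i):
    """H(Y | S(n-1) = S_i), summing down column i of F.

    Terms with p = 0 are skipped, i.e. 0 * log2(1/0) is taken as 0.
    """
    col = [row[i] for row in F]
    return sum(p * log2(1.0 / p) for p in col if p > 0)

print(conditional_entropy(F, 0))  # 1.0  (= H(Y | S(n-1) = S0))
print(conditional_entropy(F, 1))  # 1.0  (= H(Y | S(n-1) = S1))
```

Both states give exactly 1 bit, matching the hand computation above.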
Note that the computed values of unity are the labels on the corresponding arrows in the Gallager diagram.

Having determined the conditional entropy H(Y | S(n−1) = Si), the steady-state entropy rate of the output sequence is therefore

R = Σ(i=0 to 1) μi · H(Y | Si) = μ0·1 + μ1·1 = 1 bit

since μ0 + μ1 = 1.
In general, the entropy rate is given by
H′(Y) = R = Σi μi · Σj fij · log2(1 / fij)
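The general formula can be sketched in Python as follows. The function name entropy_rate is illustrative, and the steady-state vector mu = [0.5, 0.5] is an assumed value for demonstration; in this particular example R = 1 bit for any valid mu, since both column entropies equal 1.

```python
from math import log2

def entropy_rate(F, mu):
    """H'(Y) = sum_i mu_i * sum_j f_ij * log2(1/f_ij).

    F's columns hold p(y_j | S_i); mu holds the steady-state
    probabilities mu_i.  Zero entries contribute nothing.
    """
    rate = 0.0
    for i, mu_i in enumerate(mu):
        col = [row[i] for row in F]
        rate += mu_i * sum(p * log2(1.0 / p) for p in col if p > 0)
    return rate

F = [[0.5, 0.0],
     [0.5, 0.5],
     [0.0, 0.5]]

# mu = [0.5, 0.5] is an assumed steady-state distribution; any valid
# mu gives R = 1 here because both conditional entropies equal 1 bit.
print(entropy_rate(F, [0.5, 0.5]))  # 1.0
```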
