Let us consider channels with memory, i.e., channels whose output depends not only on the current input but also on past inputs, so that the symbols are statistically dependent across time: p(x, y) ≠ p(x)p(y). For example, assume a DMS (discrete memoryless source, X = {−1, +1}) sending symbols into a delay element (D) such that the sum of the current and the delayed symbol is the output, y^{(n)} = x^{(n)} + x^{(n−1)},
| x at t = n−1 \ x at t = n | −1 | +1 |
|---|---|---|
| −1 | −2 | 0 |
| +1 | 0 | +2 |
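As a minimal sketch, the duo-binary channel above can be simulated in a few lines (the names `duo_binary` and `x` are illustrative, not from the text; the initial delayed symbol x^{(−1)} is assumed to be 0):

```python
import random

def duo_binary(symbols):
    """Duo-binary channel: y(n) = x(n) + x(n-1), with x(-1) taken as 0."""
    prev = 0
    out = []
    for x in symbols:
        out.append(x + prev)
        prev = x
    return out

# DMS emitting equiprobable symbols from {-1, +1}
x = [random.choice([-1, +1]) for _ in range(10)]
y = duo_binary(x)  # after the first symbol, every y(n) is in {-2, 0, +2}
```

Note that the channel has memory: the same input symbol can produce different outputs depending on the previous symbol.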
❶ Hidden Markov Process
If the DMS and the duo-binary channel together are regarded as a single information source,
then the Markov process that exists within this information source is called a hidden Markov process: the observer sees only the outputs y^{(n)}, not the underlying symbol sequence x^{(n)}.
In steady state, analysis of the system shows that
p(y = −2) = 1/4, p(y = 0) = 1/2, p(y = +2) = 1/4.
(Note that the notation in the equations is x(n) = x^{(n)}. See the figure above for the parametric values used in these computations.)
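These steady-state output probabilities can be checked by enumerating the four equally likely input pairs (x^{(n−1)}, x^{(n)}); a minimal sketch:

```python
from fractions import Fraction
from itertools import product

# Enumerate the four equiprobable input pairs (x(n-1), x(n)),
# each occurring with probability 1/4, and tally the channel output.
counts = {}
for a, b in product([-1, +1], repeat=2):
    y = a + b
    counts[y] = counts.get(y, 0) + Fraction(1, 4)

# counts == {-2: 1/4, 0: 1/2, 2: 1/4}
```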
❷ Drawing a state diagram
Let us say that at time index n the source sends symbol x_{i}, where i ∈ {0, 1}, and that this source has H(X) = 1 bit. Thus p(x_{0}) = p(x_{1}) = 0.5.
At any time index n, if the symbol is x_{i} = −1 (i.e., x_{i} = x_{0}), let us define the 'state' as S = S_{0}. On the other hand, if the symbol is x_{i} = +1 (i.e., x_{i} = x_{1}), let us say the state is S = S_{1}. Then the Markov-process state diagram is
The diagram shows that if the chain is at state S_{0} and the next symbol is x_{i} = x_{0} = −1, the output is y = −2 and the next state is again S_{0}; the arrow is therefore directed back to the same state. On the other hand (again assuming the current state is S_{0}), if the next symbol is x_{i} = x_{1} = +1, the output is y = 0; the arrow is therefore directed from state S_{0} to S_{1}. Since the source symbols are statistically independent of the state, each transition occurs with probability 0.5. From the state diagram one can carry out further analysis.
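The transitions just described can be tabulated mechanically. The state names and the `step` helper below are illustrative, not from the text:

```python
# States: S0 corresponds to symbol -1, S1 to symbol +1.
# From state S_i, emitting next symbol s gives output
# (symbol of S_i) + s and moves the chain to the state of s.
SYMBOL = {"S0": -1, "S1": +1}
STATE = {-1: "S0", +1: "S1"}

def step(state, next_symbol):
    """Return (output y, next state) for one transition."""
    y = SYMBOL[state] + next_symbol
    return y, STATE[next_symbol]

# e.g. step("S0", -1) -> (-2, "S0"); step("S0", +1) -> (0, "S1")
```

Each of the two transitions out of every state occurs with probability 0.5, since the next symbol is independent of the current state.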
Let Π denote the state-transition probability matrix, whose entry π_{ij} is the probability of moving from state S_{i} to state S_{j} in one time step:

π_{ij} = p(S^{(n+1)} = S_{j} | S^{(n)} = S_{i})

Hence for the above example

Π = | 1/2  1/2 |
    | 1/2  1/2 |
Continuing the analysis with the above example, the state-probability vector at steady state (ss) is derived from the limit

p_{ss} = lim_{n→∞} p^{(0)} Π^{n}
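Assuming the 2×2 transition matrix above, this limit can be approximated numerically by repeatedly applying Π to an arbitrary initial distribution (a sketch, not from the text):

```python
# Transition matrix for the example: each transition has probability 1/2,
# because the next symbol is independent of the current state.
P = [[0.5, 0.5],
     [0.5, 0.5]]

def matmul(A, B):
    """Plain-Python matrix product of A (m x k) and B (k x n)."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# p_ss = lim p(0) * P^n: iterate until the distribution stops changing.
p = [[1.0, 0.0]]            # arbitrary initial state distribution
for _ in range(50):
    p = matmul(p, P)

# p converges to [[0.5, 0.5]]
```

Here the rows of Π are identical, so the chain reaches its steady state p_{ss} = (1/2, 1/2) after a single step regardless of the initial distribution.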