## Asymptotic Equipartition Property

Assume an information source $X$ passing symbols $x_i$ into a symbol register $x^n = (x_0, x_1, \ldots, x_{n-1})$. The output can therefore be expressed as a compound symbol $X_C^n$.

• Case 1: If $X$ is a DMS, then the $x_i$ are statistically independent. Therefore

$$p(x^n) = \prod_{i=0}^{n-1} p(x_i)$$

Since $n$ = # of individual symbols in $x^n$, taking the logarithm of both sides and multiplying by $-1/n$:

$$-\frac{1}{n}\log_2 p(x^n) = -\frac{1}{n}\sum_{i=0}^{n-1}\log_2 p(x_i)$$

If $n \to \infty$: we know that, for a collection of $n$ symbols ($n$ trials), the sample average converges to the ensemble average (law of large numbers). Thus

$$-\frac{1}{n}\sum_{i=0}^{n-1}\log_2 p(x_i) \;\to\; E\left[-\log_2 p(X)\right] = H(X)$$

Therefore, when $n \to \infty$,

$$-\frac{1}{n}\log_2 p(x^n) \;\to\; H(X) \quad \text{(in probability)}$$

• Case 2: If $X$ is not a DMS, i.e., $X$ has memory, then its entropy rate may be defined as

$$H(\mathcal{X}) = \lim_{n\to\infty} \frac{1}{n} H(X_0, X_1, \ldots, X_{n-1})$$

Note that if $X$ was a DMS, then $H(X_0, \ldots, X_{n-1}) = nH(X)$, so $H(\mathcal{X}) = H(X)$. Therefore, in general, the AEP limit is the entropy rate:

$$-\frac{1}{n}\log_2 p(x^n) \;\to\; H(\mathcal{X})$$

• Case 3: If $X$ is an ergodic source, then

$$-\frac{1}{n}\log_2 p(x^n) \;\to\; H(\mathcal{X}) \quad \text{with probability } 1$$

This is called the Shannon-McMillan-Breiman Theorem (S-M-B Th.) (Cover & Thomas, 2006, pp. 646-647). Note that the AEP definition above uses convergence in probability, but the S-M-B Th. uses almost-sure convergence (convergence with probability 1).
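The Case 1 convergence can be checked numerically. A minimal sketch, assuming a hypothetical binary DMS with $P(X{=}0) = 0.8$, $P(X{=}1) = 0.2$ (example values, not from the text):

```python
import math
import random

random.seed(0)

# Hypothetical binary DMS: P(0) = 0.8, P(1) = 0.2 (assumed example values)
p = {0: 0.8, 1: 0.2}

# Entropy H(X) = E[-log2 p(X)], in bits/symbol
H = -sum(q * math.log2(q) for q in p.values())

# Draw a long compound symbol x^n from the DMS
n = 100_000
xs = random.choices([0, 1], weights=[p[0], p[1]], k=n)

# Sample value of -(1/n) log2 p(x^n) = -(1/n) sum_i log2 p(x_i)
sample = -sum(math.log2(p[x]) for x in xs) / n

print(f"H(X)              = {H:.4f} bits/symbol")
print(f"-(1/n) log2 p(x^n) = {sample:.4f} bits/symbol")
```

For large $n$ the sample value concentrates around $H(X) \approx 0.722$ bits/symbol, as the AEP predicts.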

## An Ergodic Source

An ergodic source is a source whose
• Probabilities don't change with time, i.e., the source is stationary.
• For any and all of its statistics,

Time average = Ensemble average
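This property can be illustrated numerically. A sketch, assuming a hypothetical two-state stationary Markov source (a source with memory; the transition probabilities are example values, not from the text):

```python
import random

random.seed(1)

# Hypothetical two-state Markov source (stationary and ergodic):
# P(next=1 | current=0) = a, P(next=0 | current=1) = b
a, b = 0.3, 0.6

# Ensemble average: stationary probability of state 1 is a/(a+b) = 1/3
pi1 = a / (a + b)

# Time average: fraction of time one long realization spends in state 1
n = 200_000
x, ones = 0, 0
for _ in range(n):
    x = (random.random() < a) if x == 0 else (random.random() >= b)
    ones += x
time_avg = ones / n

print(f"time average of [X=1]  = {time_avg:.3f}")
print(f"ensemble average pi(1) = {pi1:.3f}")
```

For an ergodic source the two numbers agree as the observation window grows, which is exactly the "time average = ensemble average" statement above.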

What is the practical consequence of the AEP? The Asymptotic Equipartition Property (AEP) makes lossless data compression possible. Lossless data compression is also called data compaction.
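The connection to compression is that the AEP splits the $2^n$ possible sequences into a small "typical set" carrying almost all the probability, so a code need only cover roughly $2^{nH}$ sequences. A sketch, assuming a hypothetical binary DMS with $P(X{=}1) = 0.2$ and example values of $n$ and $\varepsilon$:

```python
import math

# Hypothetical binary DMS: P(1) = 0.2 (assumed example value)
p1 = 0.2
H = -(p1 * math.log2(p1) + (1 - p1) * math.log2(1 - p1))

# Typical set: sequences x^n with | -(1/n) log2 p(x^n) - H | <= eps
n, eps = 16, 0.2
typ_count, typ_prob = 0, 0.0
for k in range(n + 1):  # k = number of ones; such sequences are equiprobable
    prob = p1**k * (1 - p1) ** (n - k)
    rate = -math.log2(prob) / n
    if abs(rate - H) <= eps:
        cnt = math.comb(n, k)
        typ_count += cnt
        typ_prob += cnt * prob

print(f"typical set: {typ_count} of {2**n} sequences "
      f"({typ_count / 2**n:.1%}), total probability {typ_prob:.3f}")
```

Even at this small $n$, a few percent of the sequences already carry most of the probability mass; as $n \to \infty$ the typical set's probability tends to 1 while its size stays near $2^{nH} \ll 2^n$, which is what makes data compaction possible.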

Reference:

Cover, T. M., & Thomas, J. A. (2006). Lemma 16.8.1. In *Elements of Information Theory* (2nd ed., pp. 646-647). Hoboken, NJ: John Wiley & Sons.

Next:

More on AEP (p:2) ➽