For i.i.d. sources, $D(p^n \| q^n) = n\, D(p \| q)$, which means that the divergence increases linearly with $n$. Not only does the divergence increase, but it does so by a constant per symbol. Therefore, based on typical sequence concepts that we have seen, for an $x^n$ generated by $p$, its probability under $q$ vanishes. However, we can construct a distribution whose divergence with both $p^n$ and $q^n$ is small, namely the uniform mixture
$$r(x^n) = \frac{1}{2} p^n(x^n) + \frac{1}{2} q^n(x^n).$$
We now have, for the divergence between $p^n$ and $r$,
$$D(p^n \| r) = \sum_{x^n} p^n(x^n) \log_2 \frac{p^n(x^n)}{r(x^n)}.$$
On the other hand, $r(x^n) \ge \frac{1}{2} p^n(x^n)$ by construction [link], and so
$$D(p^n \| r) \le \sum_{x^n} p^n(x^n) \log_2 \frac{p^n(x^n)}{\frac{1}{2} p^n(x^n)} = \log_2 2 = 1 \text{ bit}.$$
By symmetry, we see that $r$ is also close to $q^n$ in the divergence sense, $D(q^n \| r) \le 1$.
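To make this concrete, here is a minimal Python sketch (our own illustration, not part of the original module) using two hypothetical Bernoulli sources with parameters 0.2 and 0.7. It shows $D(p^n \| q^n)$ growing linearly in $n$ while $D(p^n \| r)$ stays below 1 bit.

import itertools
import math

def seq_prob(x, theta):
    """Probability of binary sequence x under an i.i.d. Bernoulli(theta) source."""
    ones = sum(x)
    return theta**ones * (1 - theta)**(len(x) - ones)

def divergence(P, Q):
    """D(P || Q) in bits; P and Q are dicts mapping sequences to probabilities."""
    return sum(p * math.log2(p / Q[x]) for x, p in P.items() if p > 0)

theta_p, theta_q = 0.2, 0.7   # assumed example sources, not from the text
for n in range(1, 9):
    seqs = list(itertools.product((0, 1), repeat=n))
    pn = {x: seq_prob(x, theta_p) for x in seqs}
    qn = {x: seq_prob(x, theta_q) for x in seqs}
    r = {x: 0.5 * pn[x] + 0.5 * qn[x] for x in seqs}   # uniform mixture
    print(n, divergence(pn, qn), divergence(pn, r))    # first grows linearly, second stays <= 1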
Intuitively, it might seem peculiar that $r$ is close to both $p^n$ and $q^n$ although they are far away from each other (in divergence terms). This intuition stems from the triangle inequality, which holds for all metrics. The contradiction is resolved by realizing that the divergence is not a metric: it is not symmetric, and it does not satisfy the triangle inequality.
Note also that for two i.i.d. distributions $p^n$ and $q^n$, the divergence
$$D(p^n \| q^n) = n\, D(p \| q)$$
is linear in $n$. If $r$ were i.i.d., then $D(p^n \| r)$ would also have to be linear in $n$. But this divergence does not increase linearly in $n$; it is upper bounded by 1. Therefore, we conclude that $r$ is not an i.i.d. distribution. Instead, $r$ is a distribution that contains memory, and there is dependence between the different symbols of $x^n$ in the sense that they are either all drawn from $p$ or all drawn from $q$. To take this one step further, consider $\lambda$ i.i.d. sources $p_1, p_2, \ldots, p_\lambda$ with
the mixture distribution
$$r(x^n) = \frac{1}{\lambda} \sum_{i=1}^{\lambda} p_i^n(x^n);$$
then, in an analogous manner to before, it can be shown that
$$D(p_i^n \| r) \le \log_2 \lambda \quad \text{for every } i \in \{1, \ldots, \lambda\}.$$
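For completeness, the bound follows by the same mixture argument as before (a one-step sketch reconstructed here, not verbatim from the original): since $r(x^n) \ge \frac{1}{\lambda} p_i^n(x^n)$,
$$D(p_i^n \| r) = \sum_{x^n} p_i^n(x^n) \log_2 \frac{p_i^n(x^n)}{r(x^n)} \le \sum_{x^n} p_i^n(x^n) \log_2 \lambda = \log_2 \lambda.$$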
Sources with memory : Instead of the memoryless (i.i.d.) source, let us now put forward a statistical model with memory, in which by the chain rule the probability of a sequence factors as
$$p(x^n) = \prod_{i=1}^{n} p(x_i \mid x_1, \ldots, x_{i-1}),$$
where the conditional distribution of each symbol may depend on the symbols that precede it.
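As a simple illustration (our own example, not from the original text), the Python sketch below samples a hypothetical binary first-order Markov source, where each symbol depends on its predecessor.

import random

# Hypothetical binary first-order Markov source: P(x_i = 1 | x_{i-1}) depends
# on the previous symbol, so the source has memory.
P_ONE_GIVEN = {0: 0.1, 1: 0.9}   # assumed transition probabilities

def sample_markov(n, seed=0):
    """Draw n symbols; each conditional distribution depends on the previous symbol."""
    rng = random.Random(seed)
    x = [rng.random() < 0.5]                      # initial symbol ~ Bernoulli(1/2)
    for _ in range(n - 1):
        x.append(rng.random() < P_ONE_GIVEN[x[-1]])
    return [int(b) for b in x]

print(sample_markov(30))   # long runs of 0s and 1s reveal the memory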
Stationary source : To understand the notion of a stationary source, consider an infinite stream of symbols, $\ldots, x_{-1}, x_0, x_1, x_2, \ldots$. A complete probabilistic description of a stationary distribution is given by the collection of all marginal distributions of the following form, for all $t$ and $n$,
$$\Pr(X_t = a_1, X_{t+1} = a_2, \ldots, X_{t+n-1} = a_n).$$
For a stationary source, this distribution is independent of the time index $t$.
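For instance (an illustrative check of our own, continuing the hypothetical chain above), a Markov chain started from a stationary distribution $\pi$ with $\pi P = \pi$ has marginals that do not depend on $t$.

# Continuing the example: the chain above has transition matrix
# P = [[0.9, 0.1], [0.1, 0.9]] and stationary distribution pi = (0.5, 0.5),
# since pi P = pi. Started from pi, every marginal is shift-invariant.
P = [[0.9, 0.1], [0.1, 0.9]]
pi = [0.5, 0.5]

def marginal_at(t):
    """Distribution of X_t when X_0 ~ pi: pi P^t, which equals pi for all t."""
    dist = pi[:]
    for _ in range(t):
        dist = [sum(dist[a] * P[a][b] for a in range(2)) for b in range(2)]
    return dist

print(marginal_at(0), marginal_at(5), marginal_at(50))   # all equal (0.5, 0.5)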
Entropy rate : We have defined the first order entropy of an i.i.d. random variable [link]; let us now discuss more advanced concepts for sources with memory. Such definitions appear in many standard textbooks, for example that by Gallager [link]. The entropy rate of a source with memory is defined as
$$\bar{H} = \lim_{n \to \infty} \frac{1}{n} H(X_1, \ldots, X_n).$$
The entropy rate also satisfies
$$\bar{H} = \lim_{n \to \infty} H(X_n \mid X_1, \ldots, X_{n-1}).$$
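The sketch below (again our own illustration with the hypothetical chain above) computes both quantities for the binary Markov source and shows them converging to the same value.

import math

def h2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

P = [[0.9, 0.1], [0.1, 0.9]]   # transition matrix from the example above
pi = [0.5, 0.5]                # its stationary distribution

# For a stationary first-order Markov chain, H(X_n | X_1..X_{n-1}) = H(X_2 | X_1)
# for all n >= 2, and H(X_1..X_n) = H(X_1) + (n-1) H(X_2 | X_1).
H1 = h2(pi[1])
H_cond = sum(pi[a] * h2(P[a][1]) for a in range(2))
for n in (1, 2, 5, 20, 100):
    block = (H1 + (n - 1) * H_cond) / n     # (1/n) H(X_1..X_n)
    print(n, block, H_cond)                 # block entropy decreases toward H_cond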
Theorem 3 For a stationary source with bounded first order entropy, $H(X_1) < \infty$, the following hold.
1. $H(X_n \mid X_1, \ldots, X_{n-1})$ is non-increasing in $n$.
2. $H(X_n \mid X_1, \ldots, X_{n-1}) \le \frac{1}{n} H(X_1, \ldots, X_n)$.
3. $\frac{1}{n} H(X_1, \ldots, X_n)$ is non-increasing in $n$.
4. The limits $\lim_{n \to \infty} \frac{1}{n} H(X_1, \ldots, X_n)$ and $\lim_{n \to \infty} H(X_n \mid X_1, \ldots, X_{n-1})$ both exist and are equal.
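To see why the first claim holds (a standard one-step argument included here for the reader's convenience; it is not verbatim from the original), note that conditioning cannot increase entropy and that stationarity shifts time indices:
$$H(X_n \mid X_1, \ldots, X_{n-1}) \le H(X_n \mid X_2, \ldots, X_{n-1}) = H(X_{n-1} \mid X_1, \ldots, X_{n-2}).$$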