The self information gives the information in a single outcome. In most cases, e.g. in data compression, it is much more interesting to know the average information content of a source. This average is given by the expected value of the self information with respect to the source's probability distribution. This average of self information is called the source entropy.
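In symbols, and consistent with the base-2 logarithm used for self information, this definition can be written as (a standard formulation, stated here for reference):

\[
H(X) = E\left[I(X)\right] = -\sum_{i} p(x_i)\,\log_2 p(x_i) \quad \text{bits per source symbol},
\]

where $p(x_i)$ is the probability of the $i$-th outcome and $I(x_i) = -\log_2 p(x_i)$ is its self information.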
In texts you will find that the argument to the entropy function may vary. The two most common are $H(X)$ and $H(p)$. We calculate the entropy of a source $X$, but the entropy is, strictly speaking, a function of the source's probability function $p$. So both notations are justified.
Most calculators do not allow you to directly calculate the logarithm with base 2, so we have to use a logarithm base that most calculators support. Fortunately, it is easy to convert between different bases.
Assume you want to calculate $\log_2 x$, where $x > 0$. Writing $y = \log_2 x$ implies that $2^y = x$. Taking the natural logarithm on both sides we obtain $y \ln 2 = \ln x$, and therefore
\[
\log_2 x = \frac{\ln x}{\ln 2}.
\]
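As a quick sanity check, here is a minimal Python sketch of this conversion (the helper name log2_via_ln is ours, chosen only for illustration):

```python
import math

def log2_via_ln(x: float) -> float:
    """Base-2 logarithm computed from the natural logarithm: log2(x) = ln(x) / ln(2)."""
    return math.log(x) / math.log(2)

# The conversion agrees with a direct base-2 logarithm.
print(log2_via_ln(8))   # 3.0
print(math.log2(8))     # 3.0 (direct computation, for comparison)
```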
When throwing a fair die, one may ask for the average information conveyed in a single throw. Each of the six faces appears with probability $1/6$, so using the formula for entropy we get
\[
H(X) = -\sum_{i=1}^{6} \frac{1}{6}\,\log_2 \frac{1}{6} = \log_2 6 \approx 2.585 \ \text{bits per throw}.
\]
Suppose a source produces binary information with probabilities $p$ and $1-p$. The entropy of the source is then the binary entropy function
\[
H_b(p) = -p\,\log_2 p - (1-p)\,\log_2 (1-p).
\]
An analog source is modeled as a continuous-time random process with power spectral density bandlimited to the band between 0 and 4000 Hz. The signal is sampled at the Nyquist rate of 8000 samples per second. The sequence of random variables obtained by sampling is assumed to be independent. The samples are quantized to 5 levels $x_1, \dots, x_5$, and the probabilities of the samples taking these quantized values are $p_1, \dots, p_5$, respectively. The entropy of each quantized sample is
\[
H(X) = -\sum_{i=1}^{5} p_i\,\log_2 p_i \ \text{bits per sample}.
\]
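A short Python sketch covering the examples above; the binary probability and the five quantizer probabilities below are illustrative placeholders (not taken from the original examples) and can be replaced with the actual values of the source in question:

```python
import math

def entropy(probs):
    """Entropy in bits: H = -sum(p * log2(p)), skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Fair die: six equally likely outcomes.
print(entropy([1/6] * 6))                            # ~2.585 bits per throw

# Binary source with P(0) = p and P(1) = 1 - p; p = 0.25 is just an example value.
p = 0.25
print(entropy([p, 1 - p]))                           # ~0.811 bits per symbol

# Five-level quantizer; these probabilities are illustrative only.
print(entropy([0.5, 0.25, 0.125, 0.0625, 0.0625]))   # 1.875 bits per sample
```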
Entropy is closely tied to source coding. The extent to which a source can be compressed is related to its entropy: by the source coding theorem, the entropy gives the minimum average number of bits per symbol needed to represent the source without loss. There are many interpretations possible for the entropy of a random variable, including the average amount of information gained by observing its outcome, the a priori uncertainty about that outcome, and the minimum average code length just mentioned.