At first glance, this appears paradoxical; source coding is used to remove redundancy, while channel coding is used to add redundancy. But it is not really self-defeating or contradictory because the redundancy that is removed by source coding does not have a structure or pattern that a computer algorithm at the receiver can exploit to detect or correct errors. The redundancy that is added in channel coding is highly structured, and can be exploited by computer programs implementing the appropriate decoding routines. Thus [link] begins with a message, and uses a source code to remove the redundancy. This is then coded again by the channel encoder to add structured redundancy, and the resulting signal provides the input to the transmitter of the previous chapters. One of the triumphs of modern digital communications systems is that, by clever choice of source and channel codes, it is possible to get close to the Shannon limits and to utilize all the capacity of a channel.
Like many common English words, “information” has many meanings. The American Heritage Dictionary catalogs six:
It would clearly be impossible to capture all of these senses in a technical definition that would be useful in transmission systems. The final definition is closest to our needs, though it does not specify exactly how the numerical measure should be calculated. Shannon's does. Shannon's insight was that there is a simple relationship between the amount of information conveyed in a message and the probability of the message being sent. This does not apply directly to “messages” such as sentences, images, or .wav files, but to the symbols of the alphabet that are transmitted.
For instance, suppose that a fair coin has heads on one side and tails on the other. The two outcomes are equally uncertain, and receiving either heads or tails removes the same amount of uncertainty (conveys the same amount of information). But suppose the coin is biased. The extreme case occurs when the probability of heads is 1. Then, when heads is received, no information is conveyed, because heads is the only possible choice! Now suppose that sending heads is much more likely than sending tails. Then, if heads is received, it removes a little uncertainty, but not much; heads is expected, since it usually occurs. But if tails is received, it is somewhat unusual, and hence conveys a lot of information. In general, events that occur with high probability give little information, while events of low probability give considerable information.
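The relationship between probability and information can be made concrete with Shannon's standard measure of self-information, I(x) = −log₂ p(x) bits. The sketch below is illustrative, not drawn from the text: the particular bias values (0.9 for heads, 0.1 for tails) are assumptions chosen only to show that the likely outcome conveys little information while the unlikely outcome conveys much more.

```python
import math

def self_information(p: float) -> float:
    """Shannon self-information in bits: I(x) = -log2(p).
    Events with low probability carry more information."""
    return -math.log2(p)

# Fair coin: each outcome removes the same uncertainty, exactly 1 bit.
print(self_information(0.5))

# Biased coin (illustrative probabilities): the expected outcome (heads,
# p = 0.9) conveys little information; the unusual outcome (tails,
# p = 0.1) conveys considerably more.
print(self_information(0.9))
print(self_information(0.1))
```

Note that as p approaches 1, the self-information approaches 0, matching the extreme case above: a certain outcome conveys no information at all.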
To make this relationship between the probability of events and information more plain, imagine a game in which you must guess a word chosen at random from the dictionary. You are given the starting letter as a hint. If the hint is that the first letter is “t,” then this does not narrow down the possibilities very much, since so many words start with “t.” But if the hint is that the first letter is “x,” then there are far fewer choices. The likely letter (the highly probable “t”) conveys little information, while the unlikely letter (the improbable “x”) conveys a lot more information by narrowing down the choices.