Communication theory has been formulated best for symbolic-valued signals. Claude Shannon published The Mathematical Theory of Communication in 1948, which became the cornerstone of digital communication. He showed the power of probabilistic models for symbolic-valued signals, which allowed him to quantify the information present in a signal. In the simplest signal model, each symbol can occur at index $n$ with a probability $\Pr[a_k]$, $k = 1, \dots, K$. What this model says is that for each signal value a $K$-sided coin is flipped (note that the coin need not be fair). For this model to make sense, the probabilities must be numbers between zero and one and must sum to one: $\sum_{k=1}^{K} \Pr[a_k] = 1$.
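This coin-flipping model is easy to simulate. The sketch below is illustrative only: the particular alphabet, its probabilities, and the use of Python with numpy are assumptions made here, not part of the text. Each signal value is drawn independently according to the symbol probabilities, and the empirical relative frequencies of a long signal approach those probabilities.

```python
import numpy as np

# Hypothetical four-symbol alphabet and probabilities, chosen for illustration.
alphabet = np.array(["a1", "a2", "a3", "a4"])
probs = np.array([0.5, 0.25, 0.125, 0.125])
assert np.isclose(probs.sum(), 1.0)  # the probabilities must sum to one

rng = np.random.default_rng(0)
# Each signal value is an independent flip of a K-sided (possibly unfair) coin.
signal = rng.choice(alphabet, size=10_000, p=probs)

# Empirical relative frequencies approach the model probabilities.
values, counts = np.unique(signal, return_counts=True)
print(dict(zip(values, counts / signal.size)))
```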
Derive the maximum-entropy results, both the numeric aspect (entropy equals $\log_2 K$) and the theoretical one (equally likely symbols maximize entropy). Derive the value of the minimum-entropy alphabet.
Equally likely symbols each have a probability of $\frac{1}{K}$. Thus, $H(A) = -\sum_{k=1}^{K} \frac{1}{K}\log_2\frac{1}{K} = \log_2 K$. To prove that this is the maximum-entropy probability assignment, we must explicitly take into account that the probabilities sum to one. Focus on a particular symbol, say the first. $\Pr[a_1]$ appears twice in the entropy formula: in the term $\Pr[a_1]\log_2\Pr[a_1]$ and, once the constraint is used to write $\Pr[a_K] = 1 - \Pr[a_1] - \dots - \Pr[a_{K-1}]$, in the term $\Pr[a_K]\log_2\Pr[a_K]$. At a maximum, the derivative with respect to this probability (and all the others) must be zero. The derivative equals $\log_2\Pr[a_K] - \log_2\Pr[a_1]$ (the constant terms from differentiating $p\log_2 p$ cancel), and all other derivatives have the same form (just substitute your letter's index). Setting each derivative to zero forces each probability to equal the others, and we are done. For the minimum-entropy answer, one term is $1\cdot\log_2 1 = 0$, and the others are $0\cdot\log_2 0$, which we define to be zero also. The minimum value of entropy is zero.
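A quick numerical check of both results (a sketch; the helper function and the particular example distributions are assumptions made here for illustration): over a $K = 4$ symbol alphabet, the uniform assignment attains $\log_2 K = 2$ bits, a skewed assignment falls below it, and a degenerate assignment in which one symbol is certain gives zero entropy.

```python
import numpy as np

def entropy_bits(probs):
    """H(A) = -sum_k Pr[a_k] log2 Pr[a_k], with 0 * log2(0) taken as 0."""
    p = np.asarray(probs, dtype=float)
    nonzero = p > 0
    return float(-(p[nonzero] * np.log2(p[nonzero])).sum())

K = 4
uniform = np.full(K, 1.0 / K)
print(entropy_bits(uniform), np.log2(K))   # 2.0  2.0  -> maximum equals log2 K

skewed = np.array([0.7, 0.1, 0.1, 0.1])
print(entropy_bits(skewed))                # about 1.357, below log2 K

degenerate = np.array([1.0, 0.0, 0.0, 0.0])
print(entropy_bits(degenerate))            # 0.0 -> minimum entropy
```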
A four-symbol alphabet has the following probabilities: $\Pr[a_1] = \frac{1}{2}$, $\Pr[a_2] = \frac{1}{4}$, $\Pr[a_3] = \frac{1}{8}$, $\Pr[a_4] = \frac{1}{8}$. Note that these probabilities sum to one, as they should. As $\frac{1}{2} = 2^{-1}$, $\log_2\frac{1}{2} = -1$. The entropy of this alphabet equals
$$H(A) = -\left(\tfrac{1}{2}\log_2\tfrac{1}{2} + \tfrac{1}{4}\log_2\tfrac{1}{4} + \tfrac{1}{8}\log_2\tfrac{1}{8} + \tfrac{1}{8}\log_2\tfrac{1}{8}\right) = 1.75\ \text{bits}$$
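The arithmetic can be checked directly; this sketch (again using numpy, an assumption made here for convenience) simply evaluates the entropy sum for the probabilities above.

```python
import numpy as np

# Four-symbol alphabet from the example above.
probs = np.array([1/2, 1/4, 1/8, 1/8])
assert np.isclose(probs.sum(), 1.0)

H = -(probs * np.log2(probs)).sum()
print(H)  # 1.75 bits
```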