As the block length becomes larger, more error correction will be needed. Do codes exist that can correct all errors? Perhaps the crowning achievement of Claude Shannon's creation of information theory answers this question. His result comes in two complementary forms: the Noisy Channel Coding Theorem and its converse.
Let E denote the efficiency of an error-correcting code: the ratio of the number of data bits K to the total number of bits N used to represent them, so that E = K/N. If the efficiency is less than the capacity C of the digital channel (E < C), an error-correcting code exists with the property that, as the length of the code increases, the probability of an error occurring in the decoded block approaches zero.
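As a brief illustration of the efficiency E = K/N, the sketch below computes the rate of a few standard block codes; the (7,4) Hamming code and the repetition code are well-known examples (the specific code list here is chosen for illustration, not taken from this text).

```python
# Efficiency (code rate) E = K/N: K data bits carried in N transmitted bits.
codes = {
    "repetition (3,1)": (1, 3),   # each bit sent three times
    "Hamming (7,4)": (4, 7),      # 4 data bits, 3 parity bits
    "Hamming (15,11)": (11, 15),  # 11 data bits, 4 parity bits
}

for name, (k, n) in codes.items():
    print(f"{name}: E = {k}/{n} = {k / n:.3f}")
```

Longer Hamming codes have higher efficiency, but by themselves correct only a single error per block; the theorem concerns families of codes whose block length grows without bound.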
Conversely, if E > C, the probability of an error in a decoded block must approach one regardless of the code that might be chosen.
This result astounded communication engineers when Shannon published it in 1948. Analog communication always yields a noisy version of the transmitted signal; in digital communication, error correction can be powerful enough to correct all errors as the block length increases. The key for this capability to exist is that the code's efficiency be less than the channel's capacity. For a binary symmetric channel with crossover probability p_e, the capacity is given by

C = 1 + p_e log2(p_e) + (1 - p_e) log2(1 - p_e) bits/transmission.
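A short sketch of the capacity check: the function below evaluates the binary symmetric channel capacity C = 1 + p_e log2(p_e) + (1 - p_e) log2(1 - p_e) and compares it with a code's efficiency. The choice of p_e = 0.01 and the (7,4) Hamming rate are illustrative values, not from the text.

```python
import math

def bsc_capacity(pe):
    """Capacity of a binary symmetric channel with crossover probability pe:
    C = 1 + pe*log2(pe) + (1 - pe)*log2(1 - pe), in bits per transmission."""
    if pe in (0.0, 1.0):
        return 1.0  # noiseless (or deterministically inverted) channel
    return 1.0 + pe * math.log2(pe) + (1.0 - pe) * math.log2(1.0 - pe)

# Reliable communication (block error probability -> 0) is possible
# only when the code's efficiency E is below the capacity C.
E = 4 / 7                 # rate of a (7,4) Hamming code
C = bsc_capacity(0.01)    # channel with 1% bit-error probability
print(f"C = {C:.4f} bits/transmission, E = {E:.4f}, E < C: {E < C}")
```

Note that at p_e = 0.5 the capacity falls to zero, so no code of positive rate can communicate reliably, while as p_e approaches 0 or 1 the capacity approaches one bit per transmission.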