The previous two paragraphs show the tradeoff between signal-to-noise ratio and bandwidth. To maintain the same probability of error, a larger bandwidth allows a smaller SNR; a larger SNR allows the use of a narrower frequency band. Quantifying this tradeoff was one of Shannon's greatest contributions.
While the details of a formal proof of the channel capacity are complex, the result is believable when thought of in terms of the relationship between the distance between the levels in a source alphabet and the average amount of noise that the system can tolerate. A digital signal with $M$ levels has a maximum information rate $C = \frac{\log_2(M)}{T}$, where $T$ is the time interval between transmitted symbols. $C$ is the capacity of the channel, and has units of bits per second. This can be expressed in terms of the bandwidth $B$ of the channel by recalling Nyquist's sampling theorem, which says that a maximum of $2B$ pulses per second can pass through the channel. Thus the capacity can be rewritten

$$C = 2B\log_2(M).$$
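As a quick numerical check of this noiseless-channel formula, the following minimal Python sketch evaluates $C = 2B\log_2(M)$. The 3 kHz bandwidth and four-level alphabet are illustrative values chosen here, not taken from the text.

```python
import math

def capacity_noiseless(bandwidth_hz, num_levels):
    """Maximum bit rate of a noiseless channel with M levels: C = 2*B*log2(M)."""
    return 2 * bandwidth_hz * math.log2(num_levels)

# Illustrative (assumed) numbers: a 3 kHz channel carrying 4-level symbols.
print(capacity_noiseless(3000, 4))  # 12000.0 bits per second
```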
To include the effect of noise, observe that the power of the received signal is $S+N$ (where $S$ is the power of the signal and $N$ is the power of the noise). Accordingly, the average amplitude of the received signal is $\sqrt{S+N}$ and the average amplitude of the noise is $\sqrt{N}$. The average distance $d$ between levels is twice the average amplitude divided by the number of levels (minus one), and so $d = \frac{2\sqrt{S+N}}{M-1}$. Many errors will occur in the transmission unless the signal levels are separated by at least twice the average amplitude of the noise, that is, unless

$$d = \frac{2\sqrt{S+N}}{M-1} \geq 2\sqrt{N}.$$
Rearranging this implies that $M$ must be no larger than $1 + \sqrt{\frac{S+N}{N}}$. The actual bound (as Shannon shows) is that $M = \sqrt{\frac{S+N}{N}} = \sqrt{1+\frac{S}{N}}$, and using this value gives

$$C = 2B\log_2\left(\sqrt{1+\frac{S}{N}}\right) = B\log_2\left(1+\frac{S}{N}\right)$$

bits per second.
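The two forms of the capacity are easy to verify numerically. The sketch below uses an illustrative telephone-grade channel with 3 kHz bandwidth and 30 dB SNR (these figures are assumptions, not values from the text); it computes $B\log_2(1+S/N)$ and confirms that it agrees with $2B\log_2(M)$ when $M=\sqrt{1+S/N}$.

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon capacity C = B*log2(1 + S/N), with the SNR given as a linear ratio."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Assumed example: 3 kHz bandwidth, 30 dB SNR (S/N = 1000 as a linear ratio).
b, snr = 3000, 1000
levels = math.sqrt(1 + snr)          # M = sqrt(1 + S/N)
print(shannon_capacity(b, snr))      # about 29901.7 bits per second
print(2 * b * math.log2(levels))     # same value via C = 2*B*log2(M)
```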
Observe that, if either the bandwidth or the SNR is increased, so is the channel capacity. For white noise, as the bandwidth increases, the power in the noise increases, the SNR decreases, and so the channel capacity does not become infinite. For a fixed channel capacity, it is easy to trade off bandwidth against SNR. For example, suppose a capacity of 1000 bits per second is required. Using a bandwidth of 1 kHz, we find that the signal and the noise can be of equal power. As the allowed bandwidth is decreased, the required ratio $S/N$ increases rapidly, as the short computation below illustrates.
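Here is a minimal sketch of that tradeoff: inverting $C = B\log_2(1+S/N)$ gives the SNR required for a fixed capacity, $S/N = 2^{C/B}-1$. The particular bandwidth values below are illustrative choices, not taken from the text.

```python
import math

def required_snr(capacity_bps, bandwidth_hz):
    """Invert C = B*log2(1 + S/N) to find the S/N needed for a given rate."""
    return 2 ** (capacity_bps / bandwidth_hz) - 1

# Fixed capacity of 1000 bits per second, shrinking bandwidth (assumed values, in Hz).
for b in (1000, 500, 250, 125, 100):
    print(f"B = {b:4d} Hz  ->  S/N = {required_snr(1000, b):.0f}")
# B = 1000 Hz  ->  S/N = 1      (signal and noise of equal power)
# B =  500 Hz  ->  S/N = 3
# B =  250 Hz  ->  S/N = 15
# B =  125 Hz  ->  S/N = 255
# B =  100 Hz  ->  S/N = 1023
```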
Shannon's result can now be stated succinctly. Suppose that there is a source producing information at a rate of $R$ bits per second and a channel of capacity $C$. If $R < C$ (where $C$ is defined as in [link]), then there exists a way to represent (or code) the data so that it can be transmitted with arbitrarily small error. Otherwise, the probability of error is strictly positive.
This is tantalizing and frustrating at the same time. The channel capacity defines the ultimate goal beyond which transmission systems cannot go, yet it provides no recipe for how to achieve the goal. The next sections describe various methods of representing or coding the data that assist in approaching this limit in practice.