
The previous two paragraphs show the tradeoff between signal-to-noise ratio and bandwidth. To maintain the same probability of error, a larger bandwidth allows a smaller SNR; a larger SNR allows the use of a narrower frequency band. Quantifying this tradeoff was one of Shannon's greatest contributions.

While the details of a formal proof of the channel capacity are complex, the result is believable when thought of in terms of the relationship between the distance between the levels in a source alphabet and the average amount of noise that the system can tolerate. A digital signal with $N$ levels has a maximum information rate $C = \frac{\log_2(N)}{T}$, where $T$ is the time interval between transmitted symbols. $C$ is the capacity of the channel, and has units of bits per second. This can be expressed in terms of the bandwidth $B$ of the channel by recalling Nyquist's sampling theorem, which says that a maximum of $2B$ pulses per second can pass through the channel. Thus the capacity can be rewritten

$$C = 2 B \log_2 (N)$$

bits per second.
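As a quick numerical check, this noiseless capacity formula is easy to evaluate. The sketch below (in Python; the function name and the 3 kHz, four-level example are illustrative choices, not values from the text) computes $C = 2B\log_2(N)$:

```python
import math

def capacity_noiseless(bandwidth_hz, num_levels):
    # Capacity in bits per second of an ideal (noise-free) channel:
    # 2*B pulses per second, each carrying log2(N) bits.
    return 2 * bandwidth_hz * math.log2(num_levels)

# Hypothetical example: a 3 kHz channel using 4-level pulses.
print(capacity_noiseless(3000, 4))  # 12000.0 bits per second
```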

To include the effect of noise, observe that the power of the received signal is $S + P$ (where $S$ is the power of the signal and $P$ is the power of the noise). Accordingly, the average amplitude of the received signal is $\sqrt{S+P}$ and the average amplitude of the noise is $\sqrt{P}$. The average distance $d$ between levels is twice the average amplitude divided by the number of levels (minus one), and so $d = \frac{2\sqrt{S+P}}{N-1}$. Many errors will occur in the transmission unless the signal levels are separated by at least twice the average amplitude of the noise, that is, unless

$$d = \frac{2\sqrt{S+P}}{N-1} > 2\sqrt{P}.$$

Rearranging this implies that $N - 1$ must be no larger than $\sqrt{\frac{S+P}{P}}$. The actual bound (as Shannon shows) is that $N \le \sqrt{\frac{S+P}{P}}$, and using this value gives

$$C = 2 B \log_2 \sqrt{\frac{S+P}{P}} = B \log_2 \left( 1 + \frac{S}{P} \right)$$

bits per second.
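A minimal sketch of this capacity formula, assuming Python and illustrative power values (the shannon_capacity name is an assumption; the equal-power example anticipates the 1 kHz case discussed next):

```python
import math

def shannon_capacity(bandwidth_hz, signal_power, noise_power):
    # C = B * log2(1 + S/P) bits per second.
    return bandwidth_hz * math.log2(1 + signal_power / noise_power)

# With a 1 kHz bandwidth and signal power equal to noise power,
# the capacity is 1000 bits per second.
print(shannon_capacity(1000, 1.0, 1.0))  # 1000.0
```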

Observe that, if either the bandwidth or the SNR is increased, so is the channel capacity. For white noise, as the bandwidth increases, the power in the noise increases, the SNR decreases, and so the channel capacity does not become infinite. For a fixed channel capacity, it is easy to trade off bandwidth against SNR. For example, suppose a capacity of 1000 bits per second is required. Using a bandwidth of 1 kHz, we find that the signal and the noise can be of equal power. As the allowed bandwidth is decreased, the ratio $\frac{S}{P}$ increases rapidly:

Bandwidth   S/P
1000 Hz        1
 500 Hz        3
 250 Hz       15
 125 Hz      255
 100 Hz     1023
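The entries in this table follow from solving $C = B \log_2(1 + S/P)$ for the ratio, which gives $S/P = 2^{C/B} - 1$. A short sketch (Python, with the bandwidth list taken from the rows above) that regenerates the table:

```python
# Required signal-to-noise ratio for a fixed capacity of 1000 bits per second,
# obtained by inverting C = B*log2(1 + S/P):  S/P = 2**(C/B) - 1.
C = 1000  # desired capacity in bits per second
for B in (1000, 500, 250, 125, 100):  # bandwidths from the table, in Hz
    snr = 2 ** (C / B) - 1
    print(f"{B:5d} Hz : S/P = {snr:g}")
```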

Shannon's result can now be stated succinctly. Suppose that there is a source producing information at a rate of $R$ bits per second and a channel of capacity $C$. If $R < C$ (where $C$ is defined as above), then there exists a way to represent (or code) the data so that it can be transmitted with arbitrarily small error. Otherwise, the probability of error is strictly positive.

This is tantalizing and frustrating at the same time. The channel capacity defines the ultimate goal beyond which transmission systems cannot go, yet it provides no recipe for how to achieve the goal. The next sections describe various methods of representing or coding the data that assist in approaching this limit in practice.

Source:  OpenStax, Software receiver design. OpenStax CNX. Aug 13, 2013 Download for free at http://cnx.org/content/col11510/1.3