In the previous section, we discussed information sources and quantified information. We also discussed how to represent (and compress) information sources in binary symbols in an efficient manner. In this section, we consider channels and will find out how much information can be sent through a channel reliably.
We will first consider simple channels where both the input and the output are discrete random variables. These discrete channels can model analog channels together with their modulation, demodulation, and detection stages.
Let us denote the input sequence to the channel as $X_1, X_2, \dots, X_n$, where each $X_i$ takes values in a finite discrete alphabet $\mathcal{X}$. The channel output is the sequence $Y_1, Y_2, \dots, Y_n$, where each $Y_i$ takes values in a finite discrete alphabet $\mathcal{Y}$.
The statistical properties of a channel are determined if one finds the conditional probabilities $p_{Y_1,\dots,Y_n \mid X_1,\dots,X_n}(y_1,\dots,y_n \mid x_1,\dots,x_n)$ for all $n$ and for all input and output sequences. A discrete channel is called a discrete memoryless channel (DMC) if

$p_{Y_1,\dots,Y_n \mid X_1,\dots,X_n}(y_1,\dots,y_n \mid x_1,\dots,x_n) = \prod_{i=1}^{n} p_{Y_i \mid X_i}(y_i \mid x_i)$

for all $n$ and for all input and output sequences; that is, each output symbol depends only on the corresponding input symbol.
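As a small illustrative sketch (not from the source), the memoryless factorization can be checked numerically: for a hypothetical transition matrix $p(y \mid x)$, the likelihood of a whole output sequence given an input sequence is just the product of the per-symbol transition probabilities.

```python
import numpy as np

# Hypothetical discrete memoryless channel with alphabets {0, 1, 2}.
# Row x of P holds the per-symbol transition probabilities p(y | x).
P = np.array([
    [0.8, 0.1, 0.1],
    [0.1, 0.8, 0.1],
    [0.1, 0.1, 0.8],
])

def sequence_likelihood(x_seq, y_seq, P):
    """p(y_1..y_n | x_1..x_n) for a DMC: a product of per-symbol terms."""
    return float(np.prod([P[x, y] for x, y in zip(x_seq, y_seq)]))

x = [0, 1, 2, 1]
y = [0, 1, 1, 1]
print(sequence_likelihood(x, y, P))  # 0.8 * 0.8 * 0.1 * 0.8
```

The transition matrix and alphabets here are made up for illustration; any row-stochastic matrix defines a valid DMC.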
A binary symmetric channel (BSC) is a discrete memoryless channel with binary input and binary output ($\mathcal{X} = \{0, 1\}$, $\mathcal{Y} = \{0, 1\}$) and crossover probability $p_{Y \mid X}(0 \mid 1) = p_{Y \mid X}(1 \mid 0) = \epsilon$. As an example, a white Gaussian channel with antipodal signaling and a matched-filter receiver has probability of error $Q\!\left(\sqrt{2 E_s / N_0}\right)$. Since the error is symmetric with respect to the transmitted bit,

$p_{Y \mid X}(1 \mid 0) = p_{Y \mid X}(0 \mid 1) = Q\!\left(\sqrt{2 E_s / N_0}\right) = \epsilon$
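As a quick sketch of this correspondence (the $E_s/N_0$ value below is an assumed example, not from the source), the BSC crossover probability induced by the Gaussian channel can be computed from the Q-function, expressed here via the complementary error function:

```python
import math

def Q(x):
    """Gaussian tail probability Q(x) = P(N(0,1) > x) = erfc(x / sqrt(2)) / 2."""
    return 0.5 * math.erfc(x / math.sqrt(2))

# Assumed example operating point: Es/N0 = 7 dB.
es_over_n0_db = 7.0
es_over_n0 = 10 ** (es_over_n0_db / 10)

# Crossover probability of the equivalent BSC for antipodal signaling
# with a matched-filter receiver over the white Gaussian channel.
eps = Q(math.sqrt(2 * es_over_n0))
print(f"epsilon = {eps:.3e}")
```

Raising $E_s/N_0$ drives $\epsilon$ toward zero, so the "quality" of the equivalent BSC is set entirely by the underlying analog channel and receiver.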
It is interesting to note that every time a BSC is used, one bit is sent across the channel with probability of error $\epsilon$. The question is how much information, or how many bits, can be sent per channel use reliably. Before we consider this question, a few definitions are essential. These are discussed in mutual information.
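The "one bit per use, with error probability $\epsilon$" behavior can be seen in a short simulation, a sketch not taken from the source: pass a random bit sequence through a BSC and measure the empirical flip rate.

```python
import random

def bsc(bits, eps, rng):
    """Binary symmetric channel: flip each bit independently with probability eps."""
    return [b ^ (rng.random() < eps) for b in bits]

rng = random.Random(0)  # fixed seed for reproducibility
eps = 0.1
n = 100_000

tx = [rng.randint(0, 1) for _ in range(n)]   # transmitted bits
rx = bsc(tx, eps, rng)                       # received bits

error_rate = sum(t != r for t, r in zip(tx, rx)) / n
print(error_rate)  # close to eps = 0.1
```

Each channel use carries exactly one raw bit, but because a fraction $\epsilon$ of them arrive flipped, the number of bits that can be conveyed *reliably* per use is strictly less than one; quantifying that number is the subject of the following definitions.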