Application: Quantization In many applications, such as interference in wireless communication and quantization in signal processing, a commonly accepted model of the effect of a source of errors $e_n$ on the signal $x_n$ is the so-called additive white noise model: $y_n = x_n + e_n$.
For a quantization with precision $q$, the error $e_n = y_n - x_n$ is the quantization error, and it is determined by the signal itself. [One could argue that $e_n$ is not random, since it is completely determined once $x_n$ is known. Still, the model is useful, since we can usually not predict $e_n$ from observing $x_n$.]
For quantization using rounding (matlab: round), the error $e_n$ is in this case uniformly distributed on the interval $[-q/2, q/2]$, and it can be shown that the noise power per sample amounts to $\sigma^2 = q^2/12$.
When using one more bit for quantization, the error $e_n$ is half as large, thus the power is 4 times smaller, which amounts to roughly $-6$ dB. In other words, the power of the quantization noise decreases by roughly 6 dB per additional bit used.
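The $q^2/12$ law and the roughly 6-dB-per-bit rule are easy to check numerically. The following is a small sketch in Python/NumPy (as an analogue of the matlab `round` experiment mentioned above); the signal range $[-1, 1)$ and the bit depths are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def quantization_noise_power(bits, n=200_000):
    """Quantize by rounding to a grid of step q = 2^(1-bits) on [-1, 1)
    and return the empirical noise power per sample."""
    q = 2.0 ** (1 - bits)              # precision (step size)
    x = rng.uniform(-1.0, 1.0, n)      # test signal
    y = q * np.round(x / q)            # rounding quantizer
    e = y - x                          # quantization error, in [-q/2, q/2]
    return np.mean(e ** 2)

q8 = 2.0 ** (1 - 8)
p8 = quantization_noise_power(8)
p9 = quantization_noise_power(9)
print(p8, q8 ** 2 / 12)               # empirical power vs. q^2/12
print(10 * np.log10(p9 / p8))         # one extra bit: roughly -6 dB
```

The empirical power matches $q^2/12$ closely, and the ratio between 9 and 8 bits comes out near $-6.02$ dB.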
Using the analog of Parseval's equation and recalling that $|\hat E(f)|^2$ is the power spectrum (the analog of the square of the Fourier transform), we have indeed $\sum_n |e_n|^2 = \frac{1}{N} \sum_k |\hat E_k|^2$.
For $N$ samples of noise taken over a time interval of length $T$ we have, thus, approximately $\frac{1}{N} \sum_{n=1}^{N} |e_n|^2 \approx \sigma^2$.
Note that the FFT increases power by a factor of $N$. The relation [link] becomes exact when taking expected values. The approximation improves the larger $N$ is, since the left side is an estimator of the variance $\sigma^2$, which is $q^2/12$ for quantization with precision $q$.
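A quick numerical sketch of the Parseval relation and of the variance estimate (Python/NumPy; the values of $N$ and $q$ are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
N, q = 4096, 0.01
e = rng.uniform(-q / 2, q / 2, N)        # model of quantization noise

time_power = np.sum(e ** 2)              # total power in the time domain
E = np.fft.fft(e)
freq_power = np.sum(np.abs(E) ** 2) / N  # the FFT increases power by N

print(time_power, freq_power)            # Parseval: equal up to rounding
print(time_power / N, q ** 2 / 12)       # variance estimate vs. q^2/12
```

The two power values agree to machine precision, while the per-sample estimate approaches $q^2/12$ as $N$ grows.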
Application: Interference A further example of a situation where an additive white noise model proves useful is the wireless transmission of a binary signal under interference. Here, bits may flip from 0 to 1 and vice versa since detection is not perfect. The error $e_n$ is here determined by the interfering signal and the configuration of the decoder. Note that the possible value of each $e_n$ is either 0 (no flip) or 1 (flip). Without specific information on the interference, the chance of a flip is independent of the time $n$, and independent of the past occurrence of flips. Thus, white noise is a very reasonable model for $e_n$. Clearly, the probability $P[e_n = 0]$ will be close to 1 if only little interference is present and will decrease the stronger the interference.
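A minimal simulation of this bit-flip channel, modeling the error sequence as Bernoulli white noise; the flip probability `p_flip` is a hypothetical value chosen for illustration (Python/NumPy sketch):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
p_flip = 0.01                             # hypothetical flip probability
bits = rng.integers(0, 2, n)              # transmitted bits
e = (rng.random(n) < p_flip).astype(int)  # Bernoulli white noise: 1 = flip
received = bits ^ e                       # a flip toggles the bit

ber = np.mean(bits != received)           # empirical bit error rate
print(ber)                                # close to p_flip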
Gaussian noise As an important special case we mention Gaussian white noise, where the common distribution of the $e_n$ is Gaussian, or “normal”. This model assumption is standard whenever nothing is known about the distribution. It makes sense, e.g., as a model for an overall error which is composed of several small unknown errors (compare the Central Limit Theorem).
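The Central Limit Theorem argument can be illustrated numerically: summing many small independent errors produces an approximately Gaussian overall error. The number of components (50) below is an arbitrary choice (Python/NumPy sketch):

```python
import numpy as np

rng = np.random.default_rng(3)
k, n = 50, 200_000                       # 50 small errors per overall error
small = rng.uniform(-0.5, 0.5, (n, k))   # small independent uniform errors
total = small.sum(axis=1)                # overall error

z = total / np.sqrt(k / 12.0)            # standardize (each uniform: variance 1/12)
frac_within_1sigma = np.mean(np.abs(z) < 1.0)
print(frac_within_1sigma)                # about 0.683, as for a Gaussian
```

The fraction of samples within one standard deviation matches the Gaussian value $\approx 0.683$, even though each component is uniform, not Gaussian.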
Colored Noise A sequence is called colored noise if its terms are random and possess a relation or dependence between them. Consequently, the power spectrum of colored noise is not flat, but possesses certain prevalent frequencies — hence the name “colored” (recall that the frequencies of light waves correspond to colors).
One of the simplest ways to produce colored noise is to filter white noise. For instance, $g_n = e_n + e_{n-1}$ is colored since the entries are no longer independent: $g_n$ and $g_{n+1}$ contain the same number $e_n$ as an additive term. Similarly, $h_n = e_n - e_{n-1}$ is colored.
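The dependence between successive terms can be checked empirically. Assuming the simple sum filter $g_n = e_n + e_{n-1}$ applied to white noise, a Python/NumPy sketch:

```python
import numpy as np

rng = np.random.default_rng(4)
e = rng.standard_normal(200_000)     # white noise
g = e[1:] + e[:-1]                   # g_n = e_n + e_(n-1)

# g_n and g_(n+1) share the term e_n, so they are correlated.
r1 = np.corrcoef(g[:-1], g[1:])[0, 1]
print(r1)                            # about 0.5, not 0: the noise is colored
```

For white noise the lag-1 correlation would be 0; the value near 0.5 confirms that filtering introduced dependence.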
Adopting a continuous-time notation (for convenience) we write $g(t) = e(t) + e(t-T)$ with Fourier transform $\hat G(f) = \hat E(f)\,(1 + e^{-2\pi i f T})$, and similarly $\hat H(f) = \hat E(f)\,(1 - e^{-2\pi i f T})$, and power spectrum $|\hat G(f)|^2 = |\hat E(f)|^2 \cdot |1 + e^{-2\pi i f T}|^2 = \sigma^2 \cdot 2\,(1 + \cos(2\pi f T))$, where $\sigma^2$ is the total power of the original noise $e$, and $|\hat H(f)|^2 = \sigma^2 \cdot 2\,(1 - \cos(2\pi f T))$ (use that $e^{i\theta} + e^{-i\theta} = 2\cos\theta$). The same formulas for $|\hat G|^2$ and $|\hat H|^2$ could be obtained by computing them as the Fourier transform of the auto-correlation (see [link]); indeed, for $g$: $r_g(0) = 2\sigma^2$, $r_g(\pm 1) = \sigma^2$, and $r_g(k) = 0$ for all other $k$; for $h$ the same except $r_h(\pm 1) = -\sigma^2$. Note that neither $|\hat G|^2$ nor $|\hat H|^2$ are flat. Verification via matlab is easy.
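The verification mentioned above can be sketched as follows, here in Python/NumPy rather than matlab; $T = 1$ and $\sigma^2 = 1$ are arbitrary choices. Averaged periodograms of $g$ and $h$ should follow $2\sigma^2(1 \pm \cos(2\pi f T))$:

```python
import numpy as np

rng = np.random.default_rng(5)
N, trials, sigma2 = 1024, 400, 1.0
Pg = np.zeros(N)
Ph = np.zeros(N)
for _ in range(trials):
    e = np.sqrt(sigma2) * rng.standard_normal(N + 1)   # white noise, power sigma2
    Pg += np.abs(np.fft.fft(e[1:] + e[:-1])) ** 2 / N  # periodogram of g
    Ph += np.abs(np.fft.fft(e[1:] - e[:-1])) ** 2 / N  # periodogram of h
Pg /= trials
Ph /= trials

f = np.arange(N) / N                 # frequency in cycles per sample (T = 1)
theory_g = 2 * sigma2 * (1 + np.cos(2 * np.pi * f))
theory_h = 2 * sigma2 * (1 - np.cos(2 * np.pi * f))
print(np.mean(np.abs(Pg - theory_g)))   # small average deviation from theory
print(np.mean(np.abs(Ph - theory_h)))
```

The averaged spectra are clearly not flat: $|\hat G|^2$ peaks at low frequencies and vanishes near the Nyquist frequency, and $|\hat H|^2$ does the opposite.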