You have already encountered the Moment Generating Function of a pdf in the Part IB probability course. This function was closely related to the Laplace Transform of the pdf.
Now we introduce the Characteristic Function for a random variable, which is closely related to the Fourier Transform of the pdf.
In the same way that Fourier Transforms allow easy manipulation of signals when they are convolved with linear system impulse responses, Characteristic Functions allow easy manipulation of convolved pdfs when they represent sums of random processes.
The Characteristic Function of a pdf is defined as:

$$\Phi_X(u) = E[e^{juX}] = \int_{-\infty}^{\infty} f_X(x)\, e^{jux}\, dx$$

Note that whenever $f_X(x)$ is a valid pdf,

$$\Phi_X(0) = \int_{-\infty}^{\infty} f_X(x)\, dx = 1$$

Properties of Fourier Transforms apply with $-u$ substituted for $\omega$. In particular, convolution of pdfs corresponds to multiplication of their characteristic functions.
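As a quick numerical illustration (not part of the original notes), the definition can be checked by integrating $f_X(x)\,e^{jux}$ on a grid. The sketch below, assuming NumPy is available, uses the unit-rate exponential pdf, whose characteristic function is the standard result $1/(1 - ju)$, and confirms that $\Phi_X(0) = 1$:

```python
import numpy as np

def trapezoid(y, x):
    """Trapezoidal rule, written out so it works on any NumPy version."""
    dx = np.diff(x)
    return np.sum(dx * (y[1:] + y[:-1]) / 2)

def char_func(pdf_vals, x, u):
    """Phi_X(u) = integral of f_X(x) * exp(j*u*x) dx, approximated numerically."""
    return trapezoid(pdf_vals * np.exp(1j * u * x), x)

# Unit-rate exponential pdf f(x) = e^{-x} for x >= 0, with Phi_X(u) = 1/(1 - ju).
# The grid is truncated at x = 50, where the tail is negligible.
x = np.linspace(0.0, 50.0, 200_001)
pdf_vals = np.exp(-x)

for u in (0.0, 0.5, 2.0):
    assert abs(char_func(pdf_vals, x, u) - 1.0 / (1.0 - 1j * u)) < 1e-6

# Phi_X(0) = 1 for any valid pdf, since the pdf integrates to one.
assert abs(char_func(pdf_vals, x, 0.0) - 1.0) < 1e-6
```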
The Gaussian or normal distribution is very important, largely because of the Central Limit Theorem which we shall prove below. Because of this (and as part of the proof of this theorem) we shall show here that a Gaussian pdf has a Gaussian characteristic function too.
A Gaussian distribution with mean $\mu$ and variance $\sigma^2$ has pdf:

$$f_X(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(x-\mu)^2 / 2\sigma^2}$$

Evaluating the characteristic function integral by completing the square in the exponent gives:

$$\Phi_X(u) = e^{ju\mu}\, e^{-\sigma^2 u^2 / 2}$$
Thus the characteristic function of a Gaussian pdf is also Gaussian in magnitude, $e^{-\sigma^2 u^2/2}$, with standard deviation $1/\sigma$, and with a linear phase rotation term, $e^{ju\mu}$, whose rate of rotation equals the mean of the pdf. This coincides with standard results from Fourier analysis of Gaussian waveforms and their spectra (e.g. the Fourier transform of a Gaussian waveform with a time shift).
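This magnitude/phase decomposition can be verified numerically. The sketch below (an illustration, with $\mu$ and $\sigma$ chosen arbitrarily) compares a direct numerical integration of the Gaussian pdf against the closed form $e^{ju\mu}e^{-\sigma^2 u^2/2}$:

```python
import numpy as np

def trapezoid(y, x):
    dx = np.diff(x)
    return np.sum(dx * (y[1:] + y[:-1]) / 2)

mu, sigma = 1.5, 0.8  # arbitrary illustrative values

# Grid wide enough (10 standard deviations) that the tails are negligible
x = np.linspace(mu - 10 * sigma, mu + 10 * sigma, 100_001)
pdf_vals = np.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

def cf_numeric(u):
    """Direct numerical evaluation of Phi_X(u) for the Gaussian pdf above."""
    return trapezoid(pdf_vals * np.exp(1j * u * x), x)

def cf_exact(u):
    """Closed form: linear phase e^{ju*mu} times Gaussian magnitude e^{-sigma^2 u^2 / 2}."""
    return np.exp(1j * u * mu - sigma ** 2 * u ** 2 / 2)

for u in (0.0, 0.7, 2.0):
    assert abs(cf_numeric(u) - cf_exact(u)) < 1e-8
```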
If two independent variables, $X_1$ and $X_2$, with Gaussian pdfs are summed to produce $Y = X_1 + X_2$, their characteristic functions will be multiplied together (equivalent to convolving their pdfs) to give

$$\Phi_Y(u) = \Phi_{X_1}(u)\, \Phi_{X_2}(u) = e^{ju(\mu_1+\mu_2)}\, e^{-(\sigma_1^2+\sigma_2^2) u^2 / 2}$$

which is the characteristic function of a Gaussian pdf with mean $\mu_1 + \mu_2$ and variance $\sigma_1^2 + \sigma_2^2$.
Further Gaussian variables can be added and the pdf will remain Gaussian with further terms added to the above expressions for the combined mean and variance.
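The additivity of means and variances can be checked empirically. The sketch below (an illustration, with parameter values chosen arbitrarily) sums samples from two independent Gaussians and verifies that the sample mean and variance of the sum match $\mu_1+\mu_2$ and $\sigma_1^2+\sigma_2^2$:

```python
import numpy as np

rng = np.random.default_rng(0)

mu1, s1 = 1.0, 2.0    # arbitrary illustrative parameters
mu2, s2 = -0.5, 1.5
n = 1_000_000

# Sum of two independent Gaussian samples
y = rng.normal(mu1, s1, n) + rng.normal(mu2, s2, n)

# The sum is Gaussian with mean mu1 + mu2 and variance s1^2 + s2^2
assert abs(y.mean() - (mu1 + mu2)) < 0.02
assert abs(y.var() - (s1 ** 2 + s2 ** 2)) < 0.05
```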
The central limit theorem states broadly that if a large number $N$ of independent random variables of arbitrary pdf, but with equal variance $\sigma^2$ and zero mean, are summed together and scaled by $1/\sqrt{N}$ to keep the total energy independent of $N$, then the pdf of the resulting variable will tend to a zero-mean Gaussian with variance $\sigma^2$ as $N$ tends to infinity.
This result is obvious from the previous result if the input pdfs are also Gaussian, but it is the fact that it applies for arbitrary input pdfs that is remarkable, and is the reason for the importance of the Gaussian (or normal) pdf. Noise generated in nature is nearly always the result of summing many tiny random processes (e.g. noise from electron energy transitions in a resistor or transistor, or from distant worldwide thunderstorms at a radio antenna) and hence tends to a Gaussian pdf.
Although for simplicity we shall prove the result only for the case when all the summed processes have the same variance and pdfs, the central limit result is more general than this and applies in many cases even when the variances and pdfs are not all the same.
Let $X_i$ ($i = 1$ to $N$) be the independent random processes, each with zero mean and variance $\sigma^2$, which are combined to give

$$Y = \frac{1}{\sqrt{N}} \sum_{i=1}^{N} X_i$$
Since the $X_i$ are independent, the characteristic function of $Y$ is the product of the characteristic functions of the scaled variables $X_i/\sqrt{N}$:

$$\Phi_Y(u) = \left[\Phi_X\!\left(\frac{u}{\sqrt{N}}\right)\right]^N$$

Substituting the moments $E[X_i] = 0$ and $E[X_i^2] = \sigma^2$ into the power series $\Phi_X(u) = 1 + ju\,E[X_i] - \frac{u^2}{2}E[X_i^2] + \ldots$, and using the series expansion $\ln(1+x) = x\, +$ (terms of order $x^2$ or smaller), gives

$$\ln \Phi_Y(u) = N \ln\!\left(1 - \frac{\sigma^2 u^2}{2N} + \ldots\right) = -\frac{\sigma^2 u^2}{2} + \text{(terms of order } N^{-1/2} \text{ or smaller)}$$
Hence we may now infer that, as $N \to \infty$, $\Phi_Y(u) \to e^{-\sigma^2 u^2/2}$ and so the pdf of $Y$ will be given by the zero-mean Gaussian with variance $\sigma^2$:

$$f_Y(y) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-y^2 / 2\sigma^2}$$
Convergence can be illustrated with an example in which the input pdfs are uniform and $N$ is gradually increased. For quite moderate $N$ convergence is already good, and this is how some 'Gaussian' random generator functions operate: by summing a modest number of uncorrelated random numbers with uniform pdfs.
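A minimal sketch of that style of generator is shown below (the choice of 12 summed terms is an illustrative assumption, not from the original notes). Zero-mean uniform variables on $(-\tfrac{1}{2},\tfrac{1}{2})$ have variance $1/12$, so the scaled sum should have variance $\sigma^2 = 1/12$ and excess kurtosis close to the Gaussian value of zero:

```python
import numpy as np

rng = np.random.default_rng(1)

def approx_gaussian(n_terms, n_samples, rng):
    """Sum n_terms zero-mean uniform variables and scale by 1/sqrt(n_terms),
    so the result tends to N(0, 1/12) as n_terms grows (variance of U(-0.5, 0.5) is 1/12)."""
    u = rng.uniform(-0.5, 0.5, size=(n_samples, n_terms))
    return u.sum(axis=1) / np.sqrt(n_terms)

y = approx_gaussian(12, 100_000, rng)

assert abs(y.mean()) < 0.01
assert abs(y.var() - 1.0 / 12.0) < 0.01

# Excess kurtosis of a Gaussian is 0; for the scaled sum of n_terms uniforms
# it is -1.2 / n_terms, already small for n_terms = 12.
kurt = np.mean((y - y.mean()) ** 4) / y.var() ** 2 - 3.0
assert abs(kurt) < 0.15
```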
For some less smooth or more skewed pdfs, convergence can be slower, as is seen for a highly skewed triangular pdf; and pdfs of discrete processes are particularly problematic in this respect.