Before diving into a more complex statistical analysis of random signals and processes, let us quickly review the idea of correlation. Recall that the correlation of two signals or variables is the expected value of the product of those two variables. Since our focus will be to discover more about a random process, a collection of random signals, imagine that we are dealing with two samples of a random process, where each sample is taken at a different point in time. Also recall that the key property of these random processes is that they are functions of time; imagine them as a collection of signals. The expected value of the product of these two variables (or samples) will now depend on how quickly they change with respect to time. For example, if the two variables are taken from almost the same point in time, then we should expect them to have a high correlation. We will now look at a correlation function that relates a pair of random variables from the same process to the time separation between them, where the argument to this correlation function will be the time difference. For the correlation of signals from two different random processes, see the crosscorrelation function.
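As a rough illustration of this idea (not drawn from the original text), the Python sketch below simulates a simple first-order autoregressive process, a convenient stand-in for a random process whose samples change slowly over time, and prints the sample correlation between values separated by different lags; the coefficient, record length, and lags are arbitrary illustrative choices.

```python
import numpy as np

# Illustrative sketch: a first-order autoregressive process
# x[n] = a * x[n-1] + w[n], driven by white Gaussian noise, serves as a
# stand-in for a random process whose samples change slowly over time.
rng = np.random.default_rng(0)
a, N = 0.95, 20_000
w = rng.standard_normal(N)
x = np.zeros(N)
for n in range(1, N):
    x[n] = a * x[n - 1] + w[n]

# Sample correlation between x[n] and x[n + lag]: nearby samples should
# be highly correlated, widely separated samples much less so.
for lag in (1, 5, 25, 100):
    r = np.corrcoef(x[:-lag], x[lag:])[0, 1]
    print(f"lag {lag:4d}: correlation ~ {r:.3f}")
```

The correlation comes out close to one for a lag of one sample and falls toward zero as the separation grows, which is exactly the behavior described above.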
The first of these correlation functions we will discuss is the autocorrelation, where each of the random variables we deal with comes from the same random process.
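For reference, the autocorrelation of a real-valued random process $X(t)$ is conventionally defined as the expected value of the product of the process at two sample times, written here as $t_1$ and $t_2$ (these symbols are supplied for illustration):

```latex
R_{xx}(t_1, t_2) = E\left[ X(t_1)\, X(t_2) \right]
```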
Below we will look at several properties of the autocorrelation function that hold for stationary random processes.
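The properties themselves are not reproduced above; as a brief reference sketch, the standard ones for a wide-sense stationary process, whose autocorrelation depends only on the time difference $\tau$, are:

```latex
R_{xx}(\tau) = R_{xx}(-\tau)               % even symmetry
\left| R_{xx}(\tau) \right| \le R_{xx}(0)  % the maximum magnitude occurs at zero lag
R_{xx}(0) = E\left[ X(t)^2 \right]         % zero lag gives the mean-square value
```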
Sometimes the whole random process is not available to us. In these cases, we would still like to be able to find out some of the characteristics of the stationary random process, even if we just have part of one sample function. In order to do this, we can estimate the autocorrelation from a given interval, 0 to $T$ seconds, of the sample function.
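A common way to form such an estimate is to time-average the product $x(t)\,x(t+\tau)$ over the available record. The Python sketch below is illustrative only and not part of the original module; the function name estimate_autocorrelation and its parameters are hypothetical, and it works with discrete-time samples of the record.

```python
import numpy as np

def estimate_autocorrelation(x, max_lag):
    """Estimate R_xx[k] from a single finite record x[0..N-1].

    Uses the time average (1 / (N - k)) * sum_n x[n] * x[n + k], which
    assumes the underlying process is stationary (and ergodic in
    correlation) so that a time average stands in for the ensemble average.
    """
    x = np.asarray(x, dtype=float)
    N = len(x)
    return np.array([np.dot(x[: N - k], x[k:]) / (N - k)
                     for k in range(max_lag + 1)])

# Example: zero-mean white samples with variance 4 should give an estimate
# near 4 at lag 0 and near 0 at the other lags.
rng = np.random.default_rng(1)
record = 2.0 * rng.standard_normal(50_000)
print(estimate_autocorrelation(record, 5))
```

Dividing by $N-k$ gives an unbiased estimate at each lag; dividing by $N$ instead gives the biased but lower-variance estimator that is also commonly used.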
Below we will look at a variety of examples that use the autocorrelation function. We will begin with a simple example dealing with Gaussian White Noise (GWN) and a few basic statistical properties that will prove very useful in these and future calculations.
We will let $n(t)$ represent our GWN. For this problem, it is important to remember the following fact about the mean of a GWN function:

$$E\left[ n(t) \right] = 0$$
Along with being zero-mean, recall that GWN is always independent. With these two facts, we are now ready to do the short calculations required to find the autocorrelation:

$$R_{nn}(t_1, t_2) = E\left[ n(t_1)\, n(t_2) \right]$$

Since the function $n(t)$ is independent, we can take the product of the individual expected values of both functions when the two sample times differ. Looking at the above equation, we see that we can break it up into two conditions: one when $t_1$ and $t_2$ are equal and one when they are not. When they are equal, we can combine the expected values. We are left with the following piecewise function to solve:

$$R_{nn}(t_1, t_2) = \begin{cases} E\left[ n(t_1) \right] E\left[ n(t_2) \right] & t_1 \neq t_2 \\ E\left[ n^2(t) \right] & t_1 = t_2 \end{cases}$$

We can now solve the two parts of the above equation. The first part is easy to solve, as we have already stated that the expected value of $n(t)$ is zero. For the second part, recall from statistics that the expected value of the square of a zero-mean function equals its variance. Thus we get the following results for the autocorrelation:

$$R_{nn}(t_1, t_2) = \begin{cases} 0 & t_1 \neq t_2 \\ \sigma^2 & t_1 = t_2 \end{cases}$$

Or, more concisely, we can represent the result as

$$R_{nn}(t_1, t_2) = \sigma^2\, \delta(t_1 - t_2)$$
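As a quick numerical check of this result, a minimal Python sketch (not from the original module; the variance value, number of realizations, and time indices are arbitrary illustrative choices) estimates the ensemble average $E[n(t_1)\, n(t_2)]$ for discrete-time Gaussian white noise with variance 4 by averaging the product over many independent realizations:

```python
import numpy as np

# Ensemble-average estimate of E[n(t1) * n(t2)] for Gaussian white noise.
# Each row of `noise` is one independent realization of the process
# sampled at 10 time points; sigma**2 = 4 is the assumed variance.
rng = np.random.default_rng(0)
sigma = 2.0
realizations, points = 200_000, 10
noise = sigma * rng.standard_normal((realizations, points))

t1 = 3
for t2 in (3, 4, 7):
    estimate = np.mean(noise[:, t1] * noise[:, t2])
    print(f"E[n({t1}) n({t2})] ~ {estimate:.3f}")
# Roughly 4.0 when t1 == t2 and roughly 0.0 otherwise, matching
# R_nn(t1, t2) = sigma**2 * delta(t1 - t2).
```

The averaged product comes out near the variance when the two time indices coincide and near zero otherwise, matching the piecewise result above.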