Estimates for identical parameters are heavily dependent on the assumed underlying probability densities. To understand this sensitivity better, consider the following variety of problems, each of which asks for estimates of quantities related to variance. Determine the bias and consistency in each case.
Compute the maximum a posteriori and maximum likelihood estimates of the variance based on statistically independent observations.
Find the maximum likelihood estimate of the variance σ² of L identically distributed, but dependent, Gaussian random variables. Here, the covariance matrix is written K_X = σ²K̃, where the normalized covariance matrix K̃ has trace tr[K̃] = L.
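The dependent-Gaussian case can be probed numerically. In this sketch the particular 4×4 normalized covariance matrix, the true σ², and the trial count are all illustrative assumptions, not part of the problem: maximizing the zero-mean Gaussian likelihood over σ² suggests the quadratic-form estimate (1/L)·xᵀK̃⁻¹x, whose sample average can be compared against the true value.

```python
import numpy as np

rng = np.random.default_rng(0)
L, sigma2, trials = 4, 3.0, 20_000

# Illustrative normalized covariance: correlated neighbors, scaled so trace = L.
K_tilde = np.array([[1.0, 0.5, 0.0, 0.0],
                    [0.5, 1.0, 0.5, 0.0],
                    [0.0, 0.5, 1.0, 0.5],
                    [0.0, 0.0, 0.5, 1.0]])
assert np.isclose(np.trace(K_tilde), L)
K_inv = np.linalg.inv(K_tilde)

# Candidate ML estimate of sigma^2 when K = sigma^2 * K_tilde and the mean is zero:
# sigma2_hat = (1/L) x^T K_tilde^{-1} x, evaluated here for many draws at once.
x = rng.multivariate_normal(np.zeros(L), sigma2 * K_tilde, size=trials)
est = np.einsum('ij,jk,ik->i', x, K_inv, x) / L
print(f"average estimate: {est.mean():.3f} (true sigma^2 = {sigma2})")
```

Since E[xᵀK̃⁻¹x] = σ²·tr[K̃⁻¹K̃] = σ²L, the simulated average should hover near the true σ², hinting at the bias result the problem asks you to establish analytically.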
Imagine yourself idly standing on a corner in a large city when you note the serial number of a passing beer truck. Because you are idle, you wish to estimate (guess may be more accurate here) how many beer trucks the city has from this single observation.
Making appropriate assumptions, the beer truck's number is drawn from a uniform probability density ranging between zero and some unknown upper limit; find the maximum likelihood estimate of the upper limit.
Show that this estimate is biased.
In one of your extraordinarily idle moments, you observe N beer trucks throughout the city. Assuming their serial numbers to be statistically independent observations, what now is the maximum likelihood estimate of the total?
Is this estimate of the total biased? Asymptotically biased? Consistent?
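A Monte Carlo sketch can make the bias questions above concrete. The true limit, sample sizes, and trial count below are illustrative choices, assuming the uniform-density model of the problem:

```python
import random

random.seed(1)
theta = 1000.0      # hypothetical true number of beer trucks
trials = 20_000

# The likelihood of the unknown limit is theta**(-n) on the region where all
# n observed serial numbers fit below it, so it is maximized by the smallest
# admissible limit: the largest serial number seen.
for n in (1, 5, 50):
    avg = sum(max(random.uniform(0.0, theta) for _ in range(n))
              for _ in range(trials)) / trials
    # E[max] = n/(n+1) * theta: biased low (half the limit when n = 1),
    # but the bias shrinks as n grows.
    print(f"n = {n:2d}: average ML estimate {avg:7.1f}")
```

The shrinking gap between the average estimate and the true limit illustrates asymptotic unbiasedness; the concentration of the maximum near the limit is what the consistency argument formalizes.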
We make L observations x_1, …, x_L of a parameter θ corrupted by additive noise (x_l = θ + n_l). The parameter θ is a Gaussian random variable [θ ~ N(0, σ_θ²)], and the n_l are statistically independent Gaussian random variables [n_l ~ N(0, σ_n²)].
Find the MMSE estimate of θ.
Find the maximum a posteriori estimate of θ.
Compute the resulting mean-squared error for each estimate.
Consider an alternate procedure based on the same observations x_l. Using the MMSE criterion, we estimate θ immediately after each observation. This procedure yields the sequence of estimates θ̂_1(x_1), θ̂_2(x_1, x_2), …, θ̂_L(x_1, …, x_L). Express θ̂_l as a function of θ̂_{l−1}, σ²_{l−1}, and x_l. Here, σ²_l denotes the variance of the estimation error of the l-th estimate. Show that 1/σ²_l = 1/σ_θ² + l/σ_n².
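The sequential procedure can be sketched numerically as a scalar Kalman-style update. The symbol values and the zero-mean prior below are illustrative assumptions; the code checks both the reciprocal error-variance recursion and agreement with the batch MMSE estimate:

```python
import random

random.seed(3)
sigma_theta, sigma_n, L = 2.0, 1.0, 10

# One realization of the parameter and its noisy observations,
# assuming a zero-mean Gaussian prior for simplicity.
theta = random.gauss(0.0, sigma_theta)
x = [theta + random.gauss(0.0, sigma_n) for _ in range(L)]

est, var = 0.0, sigma_theta**2        # prior mean and prior error variance
for l, xl in enumerate(x, start=1):
    gain = var / (var + sigma_n**2)   # weight placed on the new observation
    est = est + gain * (xl - est)
    var = var * sigma_n**2 / (var + sigma_n**2)
    # Reciprocal form of the error-variance recursion:
    # 1/var_l = 1/sigma_theta^2 + l/sigma_n^2
    assert abs(1.0 / var - (1.0 / sigma_theta**2 + l / sigma_n**2)) < 1e-9

# The sequential estimate coincides with the batch MMSE estimate
# computed from all L observations at once.
batch = (sigma_theta**2 / (sigma_theta**2 + sigma_n**2 / L)) * (sum(x) / L)
print(f"recursive {est:.6f}  batch {batch:.6f}")
```

Each step corrects the previous estimate by the prediction error, weighted by how uncertain the current estimate still is; once the error variance is small, new observations barely move the estimate.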
Although the maximum likelihood estimation procedure was not clearly defined until early in the 20th century, Gauss showed in 1905 that the Gaussian density was the only density for which the maximum likelihood estimate of the mean equals the sample average.
What equation defines the maximum likelihood estimate m̂_ML of the mean m when the common probability density function of the data has the form p(x − m)?
The sample average is, of course, (1/L) Σ_l x_l. Show that it minimizes the mean-squared error Σ_l (x_l − m)².
Equating the sample average to m̂_ML, combine this equation with the maximum likelihood equation to show that the Gaussian density uniquely satisfies these equations.
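The minimization claim about the sample average admits a quick numerical check; the data values here are arbitrary illustrative numbers:

```python
# Check that the sample average minimizes sum_l (x_l - m)^2 over m.
x = [1.0, 4.0, 4.5, 7.0, 9.0]          # arbitrary illustrative data
xbar = sum(x) / len(x)

def sse(m):
    """Sum of squared deviations of the data about m."""
    return sum((xl - m) ** 2 for xl in x)

# Perturbing m away from the sample average can only increase the criterion,
# since sum_l (x_l - m)^2 = sum_l (x_l - xbar)^2 + L * (xbar - m)^2.
for delta in (-1.0, -0.1, 0.1, 1.0):
    assert sse(xbar + delta) > sse(xbar)
print(f"minimizer: {xbar}")
```

The decomposition in the comment is the one-line algebraic proof the problem asks for: the cross term vanishes because deviations about the sample average sum to zero.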