This module covers probability density estimation of signals, focusing on the first-order amplitude distribution of observed signals. It describes type and histogram estimators as well as verification of the density estimates.

Probability density estimation

Many signal processing algorithms, implicitly or explicitly, assume that the signal and the observation noise are each well described as Gaussian random sequences. Virtually all linear estimation and prediction filters minimize the mean-squared error while not explicitly assuming any form for the amplitude distribution of the signal or noise. In many formal waveform estimation theories where the probability density is, for better or worse, specified, the mean-squared error arises from Gaussian assumptions. A similar situation occurs explicitly in detection theory. The matched filter is provably the optimum detection rule only when the observation noise is Gaussian; when the noise is non-Gaussian, the detector assumes some other form. Much of what has been presented in this chapter is based implicitly on a Gaussian model for both the signal and the noise. When non-Gaussian distributions are assumed, the quantities upon which optimal linear filtering theory is based, covariance functions, no longer suffice to characterize the observations. While the joint amplitude distribution of any zero-mean, stationary Gaussian stochastic process is entirely characterized by its covariance function, non-Gaussian processes require more. Optimal linear filtering results can be applied in non-Gaussian problems, but we should realize that other informative aspects of the process are being ignored.

This discussion would seem to be leading to a formulation of optimal filtering in a non-Gaussian setting. Would that such theories were easy to use; virtually all of them require knowledge of process characteristics that are difficult to measure, and the resulting filters are typically nonlinear [Lipster and Shiryayev: Chapter 8]. Rather than present preliminary results, we take the tack that knowledge is better than ignorance: at least the first-order amplitude distribution of the observed signals should be considered during the signal processing design. If the signal is found to be Gaussian, then linear filtering results can be applied with the knowledge that no other filtering strategy will yield better results. If non-Gaussian, linear filtering can still be used, and the engineer must be aware that future systems might yield "better" results. Note that linear filtering optimizes the mean-squared error whether the signals involved are Gaussian or not. Other error criteria might better capture unexpected changes in signal characteristics, and non-Gaussian processes contain internal statistical structure beyond that described by the covariance function.

Types

When the observations are discrete-valued or made so by analog-to-digital converters, estimating the probability mass function is straightforward: count the relative number of times each value occurs. Let

$$r_0, \ldots, r_{L-1}$$
denote a sequence of observations, each of which takes on a value from the set $\mathcal{A} = \{a_1, \ldots, a_N\}$. This set is known as an alphabet and each $a_n$ is a letter in that alphabet. We estimate the probability that an observation equals one of the letters according to
$$\hat{P}_r(a_n) = \frac{1}{L} \sum_{l=0}^{L-1} I(r_l = a_n)$$
where $I(\cdot)$ is the indicator function, equaling one if its argument is true and zero otherwise. This kind of estimate is known in information theory as a type [Cover and Thomas: Chapter 12], and types have remarkable properties. For example, if the observations are statistically independent, the probability that a given sequence $\mathbf{r} = (r_0, \ldots, r_{L-1})$ occurs equals
$$P(\mathbf{r}) = \prod_{l=0}^{L-1} P_r(r_l)$$
Evaluating the logarithm, we find that
$$\log P(\mathbf{r}) = \sum_{l=0}^{L-1} \log P_r(r_l)$$
Converting to a sum over letters reveals
$$\log P(\mathbf{r}) = \sum_{n=0}^{N-1} L \hat{P}_r(a_n) \log P_r(a_n) = L \sum_{n=0}^{N-1} \hat{P}_r(a_n) \left[ \log P_r(a_n) - \log \hat{P}_r(a_n) + \log \hat{P}_r(a_n) \right]$$
which yields
$$\log P(\mathbf{r}) = -L \left( \mathcal{H}(\hat{P}_r) + \mathcal{D}(\hat{P}_r \,\|\, P_r) \right)$$
Here we introduce the entropy [Cover and Thomas: §2.1] and the Kullback-Leibler distance [see Stein's Lemma]:
$$\mathcal{H}(P) = -\sum_{n=0}^{N-1} P(a_n) \log P(a_n), \qquad \mathcal{D}(P_1 \,\|\, P_0) = \sum_{n=0}^{N-1} P_1(a_n) \log \frac{P_1(a_n)}{P_0(a_n)}$$
Because the Kullback-Leibler distance is non-negative, equaling zero only when the two probability distributions equal each other, we maximize $\log P(\mathbf{r})$ with respect to $P_r$ by choosing $P_r = \hat{P}_r$: the type estimator is the maximum likelihood estimator of $P_r$.
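The type estimator and the log-probability identity above can be sketched in a few lines of Python. The alphabet, the observation sequence, and the "true" distribution below are hypothetical values chosen purely for illustration; the functions implement the counting estimate, the entropy, and the Kullback-Leibler distance exactly as defined in the text.

```python
import math
from collections import Counter

def type_estimate(r, alphabet):
    """Type (empirical probability) of each letter: relative count in r."""
    counts = Counter(r)
    L = len(r)
    return {a: counts[a] / L for a in alphabet}

def entropy(P):
    """Entropy H(P) = -sum P(a) log P(a), skipping zero-probability letters."""
    return -sum(p * math.log(p) for p in P.values() if p > 0)

def kl_distance(P1, P0):
    """Kullback-Leibler distance D(P1 || P0)."""
    return sum(p1 * math.log(p1 / P0[a]) for a, p1 in P1.items() if p1 > 0)

# Hypothetical example: L = 8 observations on a three-letter alphabet.
alphabet = ['a', 'b', 'c']
r = ['a', 'b', 'a', 'c', 'a', 'b', 'a', 'a']
L = len(r)
P_hat = type_estimate(r, alphabet)   # {'a': 0.625, 'b': 0.25, 'c': 0.125}

# Assumed "true" letter probabilities (illustrative only).
P_true = {'a': 0.5, 'b': 0.3, 'c': 0.2}

# Probability of the sequence under P_true, computed directly...
log_prob = sum(math.log(P_true[x]) for x in r)

# ...and via the identity log P(r) = -L ( H(P_hat) + D(P_hat || P_true) ).
log_prob_identity = -L * (entropy(P_hat) + kl_distance(P_hat, P_true))
```

Since the two log-probability computations agree, the derivation can be checked numerically; and because the identity is maximized when the distance term vanishes, setting `P_true = P_hat` would make the sequence most probable, which is the maximum-likelihood property of the type.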




Source:  OpenStax, Statistical signal processing. OpenStax CNX. Dec 05, 2011 Download for free at http://cnx.org/content/col11382/1.1
