Determining a sufficient statistic directly from the definition can be a tedious process. The following result can simplify this process by allowing one to spot a sufficient statistic directly from the functional form of the density or mass function.
Fisher-Neyman factorization theorem: Let $f_{\theta}(x)$ be the density or mass function for the random vector $X$, parametrized by the vector $\theta$. The statistic $T = t(X)$ is sufficient for $\theta$ if and only if there exist functions $a(x)$ (not depending on $\theta$) and $b_{\theta}(t)$ such that
$$f_{\theta}(x) = a(x)\, b_{\theta}(t(x))$$
for all possible values of $x$.
Suppose $X_1, \ldots, X_N$ are IID, $X_n \sim \mathrm{Bernoulli}(\theta)$. Denote $k = \sum_{n=1}^{N} x_n$. Then
$$f_{\theta}(x) = \prod_{n=1}^{N} \theta^{x_n}(1-\theta)^{1-x_n} = \theta^{k}(1-\theta)^{N-k}.$$
Taking $a(x) = 1$ and $b_{\theta}(k) = \theta^{k}(1-\theta)^{N-k}$, the factorization theorem shows that $T = \sum_{n=1}^{N} X_n$ is sufficient for $\theta$.
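As a minimal numerical sketch of this example (not part of the original module; the model and helper name joint_pmf are chosen here for illustration), the following Python snippet conditions on a fixed value of $T = \sum X_n$ and shows that every arrangement of the sample is equally likely, regardless of $\theta$:

# Illustrative check: conditioned on T = sum(x), the Bernoulli sample has a
# uniform distribution over arrangements, independent of theta.
import itertools
import math

def joint_pmf(x, theta):
    # f_theta(x) = theta^k (1 - theta)^(N - k), with k = sum(x)
    k = sum(x)
    return theta**k * (1 - theta)**(len(x) - k)

N, t = 5, 2  # sample size and a fixed value of the statistic T = sum(X)
for theta in (0.2, 0.7):
    seqs = [x for x in itertools.product((0, 1), repeat=N) if sum(x) == t]
    total = sum(joint_pmf(x, theta) for x in seqs)
    cond = [joint_pmf(x, theta) / total for x in seqs]
    # Every conditional probability equals 1 / C(N, t), no matter which theta was used.
    print(theta, cond[0], 1 / math.comb(N, t))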
The next example illustrates the application of the theorem to a continuous random variable.
Consider a normally distributed random sample $X_1, \ldots, X_N \sim \mathcal{N}(\theta, \sigma^2)$, IID, where the variance $\sigma^2$ is known and the mean $\theta$ is unknown. The joint pdf of $X = (X_1, \ldots, X_N)$ is
$$f_{\theta}(x) = \prod_{n=1}^{N} \frac{1}{\sqrt{2\pi\sigma^2}} \exp\!\left(-\frac{(x_n-\theta)^2}{2\sigma^2}\right) = (2\pi\sigma^2)^{-N/2} \exp\!\left(-\frac{1}{2\sigma^2}\sum_{n=1}^{N}(x_n-\theta)^2\right).$$
We would like to rewrite $f_{\theta}(x)$ in the form $a(x)\, b_{\theta}(t)$, where $t = t(x)$. At this point we require a trick, one that is commonly used when manipulating normal densities and worth remembering. Define $\bar{x} = \frac{1}{N}\sum_{n=1}^{N} x_n$, the sample mean. Then
$$\sum_{n=1}^{N}(x_n-\theta)^2 = \sum_{n=1}^{N}(x_n-\bar{x}+\bar{x}-\theta)^2 = \sum_{n=1}^{N}(x_n-\bar{x})^2 + N(\bar{x}-\theta)^2,$$
since the cross term $2(\bar{x}-\theta)\sum_{n=1}^{N}(x_n-\bar{x})$ vanishes. Therefore
$$f_{\theta}(x) = \underbrace{(2\pi\sigma^2)^{-N/2} \exp\!\left(-\frac{1}{2\sigma^2}\sum_{n=1}^{N}(x_n-\bar{x})^2\right)}_{a(x)}\; \underbrace{\exp\!\left(-\frac{N(\bar{x}-\theta)^2}{2\sigma^2}\right)}_{b_{\theta}(\bar{x})},$$
and by the factorization theorem the sample mean $T = \bar{X}$ is a sufficient statistic for $\theta$.
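The sum-of-squares decomposition and the resulting factorization can be verified numerically. The short Python sketch below (illustrative only; the sample size, mean, and variance values are arbitrary choices) draws a normal sample and confirms both identities:

# Numerical check of sum (x_n - theta)^2 = sum (x_n - xbar)^2 + N (xbar - theta)^2
# and of the factorization f_theta(x) = a(x) * b_theta(xbar).
import numpy as np

rng = np.random.default_rng(0)
N, theta, sigma2 = 10, 1.5, 2.0
x = rng.normal(theta, np.sqrt(sigma2), size=N)
xbar = x.mean()

lhs = np.sum((x - theta)**2)
rhs = np.sum((x - xbar)**2) + N * (xbar - theta)**2
print(np.isclose(lhs, rhs))  # True: the cross term vanishes

joint = (2 * np.pi * sigma2)**(-N / 2) * np.exp(-lhs / (2 * sigma2))
a_x = (2 * np.pi * sigma2)**(-N / 2) * np.exp(-np.sum((x - xbar)**2) / (2 * sigma2))
b_theta = np.exp(-N * (xbar - theta)**2 / (2 * sigma2))
print(np.isclose(joint, a_x * b_theta))  # True: the factorization holds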
We now sketch a proof of the theorem. First, suppose $T = t(X)$ is sufficient for $\theta$. By definition, the conditional distribution of $X$ given $T$ is independent of $\theta$. Let $f_{\theta}(x, t)$ denote the joint density or mass function for $X$ and $T$. Observe that $f_{\theta}(x) = f_{\theta}(x, t(x))$, since the event $\{X = x\}$ determines $T = t(x)$. Then
$$f_{\theta}(x) = f_{\theta}(x, t(x)) = f(x \mid t(x))\, f_{\theta}(t(x)) = a(x)\, b_{\theta}(t(x)),$$
where $a(x) = f(x \mid t(x))$ does not depend on $\theta$ by sufficiency, and $b_{\theta}(t) = f_{\theta}(t)$ is the density or mass function of $T$.
For the converse, suppose the probability mass function for $X$ can be written $f_{\theta}(x) = a(x)\, b_{\theta}(t(x))$, where $t = t(x)$. The probability mass function for $T$ is obtained by summing over all $x$ such that $t(x) = t$:
$$f_{\theta}(t) = \sum_{x : t(x) = t} a(x)\, b_{\theta}(t(x)) = b_{\theta}(t) \sum_{x : t(x) = t} a(x).$$
The conditional mass function of $X$ given $T = t(x)$ is therefore
$$\Pr(X = x \mid T = t(x)) = \frac{f_{\theta}(x)}{f_{\theta}(t(x))} = \frac{a(x)\, b_{\theta}(t(x))}{b_{\theta}(t(x)) \sum_{x' : t(x') = t(x)} a(x')} = \frac{a(x)}{\sum_{x' : t(x') = t(x)} a(x')},$$
which does not depend on $\theta$, so $T$ is sufficient for $\theta$.
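The converse computation can also be checked numerically. The sketch below is illustrative only; the IID $\mathrm{Binomial}(m, \theta)$ model is chosen here (it is not one of the module's examples) because its factor $a(x) = \prod_n \binom{m}{x_n}$ is nontrivial. Summing $a(x)\, b_{\theta}(t)$ over the level set $\{x : t(x) = t\}$ recovers the pmf of $T$, and the conditional pmf $a(x) / \sum_{x'} a(x')$ involves no $\theta$:

# Illustrative check of the converse argument with X_n ~ Binomial(m, theta) IID,
# a(x) = prod C(m, x_n), b_theta(t) = theta^t (1 - theta)^(N*m - t), t(x) = sum(x).
import itertools
from math import comb

N, m, theta, t = 3, 2, 0.3, 4

def a(x):
    p = 1
    for xn in x:
        p *= comb(m, xn)
    return p

def b(t, theta):
    return theta**t * (1 - theta)**(N * m - t)

level_set = [x for x in itertools.product(range(m + 1), repeat=N) if sum(x) == t]

# pmf of T by summing over the level set; it matches the Binomial(N*m, theta) pmf
pmf_T = b(t, theta) * sum(a(x) for x in level_set)
print(pmf_T, comb(N * m, t) * theta**t * (1 - theta)**(N * m - t))

# conditional pmf of X given T = t is a(x) / sum a(x'), free of theta
total_a = sum(a(x) for x in level_set)
print([a(x) / total_a for x in level_set[:3]])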
The following exercises provide additional examples where the Fisher-Neyman factorization may be used to identify sufficient statistics.
Suppose $X_1, \ldots, X_N$ are independent and uniformly distributed on the interval $[0, \theta]$. Find a sufficient statistic for $\theta$.
Suppose $X_1, \ldots, X_N$ are independent measurements of a Poisson random variable with intensity parameter $\lambda$:
$$\Pr(X_n = k) = \frac{\lambda^{k} e^{-\lambda}}{k!}, \qquad k = 0, 1, 2, \ldots$$
Find a sufficient statistic $T$ for $\lambda$.
What is the conditional probability mass function of $X = (X_1, \ldots, X_N)$, given $T = t$, where $T = \sum_{n=1}^{N} X_n$?
Consider $X_1, \ldots, X_N \sim \mathcal{N}(\theta, \sigma^2)$, IID, where $\theta$ and $\sigma^2$ are both unknown. Find a sufficient statistic for $(\theta, \sigma^2)$.