The integral transform character of these entities implies that there is essentially a
one-to-one relationship between the transform and the distribution.
Moments
The name and some of the importance of the moment generating function arise from the
fact that the derivatives of $M_X$ evaluated at $s = 0$ are the moments about the origin. Specifically

$$M_X^{(k)}(0) = E[X^k], \qquad k = 1, 2, \ldots$$
Since expectation is an integral and because of the regularity of the integrand,
we may differentiate inside the integral with respect to the parameter.
Thus

$$M_X'(s) = \frac{d}{ds} E\left[e^{sX}\right] = E\left[\frac{d}{ds}\, e^{sX}\right] = E\left[X e^{sX}\right]$$

Upon setting $s = 0$, we have $M_X'(0) = E[X]$. Repeated differentiation gives
the general result. The corresponding result for the characteristic function is
$\varphi_X^{(k)}(0) = i^k E[X^k]$.
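As an illustration (not part of the original text), the following sympy sketch differentiates the
moment generating function of an indicator (Bernoulli) variable, $M_X(s) = q + pe^s$, and recovers
its first two moments; the example and variable names are our own.

    import sympy as sp

    s, p = sp.symbols('s p', positive=True)
    q = 1 - p
    M = q + p * sp.exp(s)                 # MGF of an indicator with P(E) = p

    EX  = sp.diff(M, s, 1).subs(s, 0)     # E[X]   = M'(0)  = p
    EX2 = sp.diff(M, s, 2).subs(s, 0)     # E[X^2] = M''(0) = p
    print(EX, EX2, sp.simplify(EX2 - EX**2))   # variance p*q

The same pattern applies to any moment generating function whose derivatives at $s = 0$ exist.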
The generating function does not lend itself readily to computing moments, except that

$$g_X'(s) = E\left[X s^{X-1}\right], \qquad \text{so that} \qquad g_X'(1) = E[X]$$

For higher order moments, we may convert the generating function to the moment generating function
by replacing $s$ with $e^s$ (since $M_X(s) = E\left[(e^s)^X\right] = g_X(e^s)$), then work with
$M_X$ and its derivatives.
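A small sketch of the $g_X'(1) = E[X]$ observation and of the substitution $s \to e^s$, again using
the indicator generating function $g_X(s) = q + ps$ derived below (an example of our choosing):

    import sympy as sp

    s, p = sp.symbols('s p', positive=True)
    g = (1 - p) + p * s                   # generating function of an indicator

    print(sp.diff(g, s).subs(s, 1))       # g'(1) = p = E[X]
    print(g.subs(s, sp.exp(s)))           # M_X(s) = g_X(e^s) = q + p*e^s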
The Poisson ($\mu$) Distribution
Suppose $X \sim$ Poisson $(\mu)$, so that $P(X = k) = e^{-\mu} \dfrac{\mu^k}{k!}$, $k \ge 0$. Then

$$g_X(s) = E\left[s^X\right] = e^{-\mu} \sum_{k=0}^{\infty} \frac{(\mu s)^k}{k!} = e^{-\mu} e^{\mu s} = e^{\mu(s - 1)}$$

We convert to $M_X$ by replacing $s$ with $e^s$ to get $M_X(s) = e^{\mu(e^s - 1)}$. Then

$$M_X'(s) = \mu e^s e^{\mu(e^s - 1)} \qquad \text{and} \qquad M_X''(s) = \mu e^s e^{\mu(e^s - 1)} \left(1 + \mu e^s\right)$$

so that

$$E[X] = M_X'(0) = \mu, \qquad E[X^2] = M_X''(0) = \mu(1 + \mu), \qquad \mathrm{Var}[X] = E[X^2] - E^2[X] = \mu$$
These results agree, of course, with those found by direct computation with the distribution.
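As a check on these computations (our sketch, using sympy), convert $g_X$ to $M_X$ and differentiate:

    import sympy as sp

    s, mu = sp.symbols('s mu', positive=True)
    g = sp.exp(mu * (s - 1))                  # Poisson generating function
    M = g.subs(s, sp.exp(s))                  # M_X(s) = g_X(e^s) = exp(mu*(e^s - 1))

    EX  = sp.simplify(sp.diff(M, s, 1).subs(s, 0))   # mu
    EX2 = sp.simplify(sp.diff(M, s, 2).subs(s, 0))   # mu*(mu + 1)
    print(EX, EX2, sp.simplify(EX2 - EX**2))         # variance mu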
We refer to the following as operational properties.
(T1) If $Z = aX + b$, then

$$M_Z(s) = e^{bs} M_X(as), \qquad g_Z(s) = s^b g_X(s^a), \qquad \varphi_Z(t) = e^{ibt} \varphi_X(at)$$

For the moment generating function, this pattern follows from

$$E\left[e^{s(aX + b)}\right] = e^{bs} E\left[e^{(as)X}\right] = e^{bs} M_X(as)$$

Similar arguments hold for the other two.
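A hedged illustration of (T1) for the moment generating function, using the indicator MGF
$q + pe^s$ (the example and names are ours): computing $E[e^{s(aX+b)}]$ directly from the
distribution agrees with $e^{bs} M_X(as)$.

    import sympy as sp

    s, a, b, p = sp.symbols('s a b p', positive=True)
    q = 1 - p
    M_X = lambda u: q + p * sp.exp(u)             # MGF of an indicator, P(E) = p

    # direct computation: Z = a*X + b takes value b with prob q, a + b with prob p
    M_Z_direct = q * sp.exp(s * b) + p * sp.exp(s * (a + b))
    M_Z_T1     = sp.exp(b * s) * M_X(a * s)       # e^{bs} M_X(as)

    print(sp.powsimp(sp.expand(M_Z_direct - M_Z_T1)))   # 0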
(T2) If the pair $\{X, Y\}$ is independent, then

$$M_{X+Y}(s) = M_X(s) M_Y(s), \qquad g_{X+Y}(s) = g_X(s) g_Y(s), \qquad \varphi_{X+Y}(t) = \varphi_X(t) \varphi_Y(t)$$

For the moment generating function, $e^{sX}$ and $e^{sY}$ form an independent pair for
each value of the parameter $s$. By the product rule for expectation,

$$E\left[e^{s(X+Y)}\right] = E\left[e^{sX} e^{sY}\right] = E\left[e^{sX}\right] E\left[e^{sY}\right] = M_X(s) M_Y(s)$$

Similar arguments are used for the other two transforms.
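A sketch of (T2) for two independent indicator variables with parameters $p$ and $r$ (our choice of
example): the MGF of the sum, computed directly from the distribution of $X + Y$, equals the product
of the individual MGFs.

    import sympy as sp

    s, p, r = sp.symbols('s p r', positive=True)

    # MGF of X + Y from its distribution:
    # P(0) = (1-p)(1-r), P(1) = p(1-r) + (1-p)r, P(2) = p*r
    M_sum  = (1-p)*(1-r) + (p*(1-r) + (1-p)*r) * sp.exp(s) + p*r * sp.exp(2*s)

    # product of the individual MGFs
    M_prod = ((1-p) + p*sp.exp(s)) * ((1-r) + r*sp.exp(s))

    print(sp.expand(M_sum - M_prod))          # 0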
(T3) A partial converse for (T2) is as follows:
If $M_{X+Y}(s) = M_X(s) M_Y(s)$, then the pair $\{X, Y\}$ is uncorrelated.
To show this, we obtain two expressions for $E[(X+Y)^2]$, one by direct expansion and
use of linearity, and the other by taking the second derivative of the moment generating function:

$$E\left[(X+Y)^2\right] = E[X^2] + 2 E[XY] + E[Y^2]$$

$$M_{X+Y}''(s) = \left[M_X(s) M_Y(s)\right]'' = M_X''(s) M_Y(s) + 2 M_X'(s) M_Y'(s) + M_X(s) M_Y''(s)$$

On setting $s = 0$ and using the fact that $M_X(0) = M_Y(0) = 1$, we have

$$E\left[(X+Y)^2\right] = E[X^2] + 2 E[X] E[Y] + E[Y^2]$$

which implies the equality $E[XY] = E[X] E[Y]$.
Note that we have not shown that being uncorrelated implies the product rule.
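The second-derivative step can be checked symbolically (a sketch with generic functions; the names
are ours): differentiating the product $M_X(s) M_Y(s)$ twice produces the three-term pattern used above.

    import sympy as sp

    s = sp.symbols('s')
    MX, MY = sp.Function('M_X'), sp.Function('M_Y')

    second = sp.expand(sp.diff(MX(s) * MY(s), s, 2))
    print(second)
    # M_X''(s) M_Y(s) + 2 M_X'(s) M_Y'(s) + M_X(s) M_Y''(s); at s = 0, with
    # M_X(0) = M_Y(0) = 1, this is E[X^2] + 2 E[X] E[Y] + E[Y^2]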
We utilize these properties in determining the moment generating and generating functions
for several of our common distributions.
Some discrete distributions
Indicator function $X = I_E$, $P(E) = p$, $q = 1 - p$:
$$g_X(s) = q + ps \qquad M_X(s) = q + p e^s$$
Simple random variable $X = \sum_{i=1}^{n} t_i I_{A_i}$ (primitive form):
$$M_X(s) = \sum_{i=1}^{n} e^{s t_i} P(A_i)$$
Binomial $(n, p)$. $X = \sum_{i=1}^{n} I_{E_i}$ with $\{I_{E_i} : 1 \le i \le n\}$ iid, $P(E_i) = p$.
We use the product rule for sums of independent random variables and the generating
function for the indicator function to get
$$g_X(s) = (q + ps)^n \qquad M_X(s) = (q + p e^s)^n$$
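A sketch (our example, with $n = 3$) confirming that the coefficients of $(q + ps)^n$ reproduce the
binomial probabilities $C(n, k) p^k q^{n-k}$:

    import sympy as sp

    s, p = sp.symbols('s p', positive=True)
    q, n = 1 - p, 3

    g = (q + p * s)**n
    coeffs = [sp.expand(g).coeff(s, k) for k in range(n + 1)]         # coefficients of s^k
    pmf    = [sp.binomial(n, k) * p**k * q**(n - k) for k in range(n + 1)]
    print([sp.simplify(c - f) for c, f in zip(coeffs, pmf)])          # [0, 0, 0, 0]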
Geometric $(p)$. $P(X = k) = p q^k$ for all $k \ge 0$, $E[X] = q/p$.
We use the formula for the geometric series to get
$$g_X(s) = \sum_{k=0}^{\infty} p q^k s^k = p \sum_{k=0}^{\infty} (qs)^k = \frac{p}{1 - qs} \qquad M_X(s) = \frac{p}{1 - q e^s}$$
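A quick series check (our sketch): the power series of $p/(1 - qs)$ about $s = 0$ has coefficients
$p q^k$, matching the geometric probabilities.

    import sympy as sp

    s, p, q = sp.symbols('s p q', positive=True)
    g = p / (1 - q * s)

    print(sp.series(g, s, 0, 4))      # p + p*q*s + p*q**2*s**2 + p*q**3*s**3 + O(s**4)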
Negative binomial $(m, p)$. If $Y_m$ is the number of the trial in a Bernoulli sequence on which the
$m$th success occurs, and $X_m = Y_m - m$ is the number of failures before the $m$th success, then

$$P(X_m = k) = P(Y_m = m + k) = C(m + k - 1, m - 1)\, p^m q^k \qquad \text{for all } k \ge 0$$

The power series expansion about $t = 0$ shows that

$$(1 - t)^{-m} = 1 + C(m, 1)\, t + C(m + 1, 2)\, t^2 + \cdots \qquad \text{for } -1 < t < 1$$

Hence

$$M_{X_m}(s) = p^m \sum_{k=0}^{\infty} C(m + k - 1, k)\, (q e^s)^k = \left[\frac{p}{1 - q e^s}\right]^m$$

Comparison with the moment generating function for the geometric distribution shows that
$X_m = Y_m - m$ has the same distribution as the sum of $m$ iid random variables, each geometric
$(p)$. This suggests that the sequence is characterized by independent, successive
waiting times to success. This also shows that the expectation and variance of $X_m$ are
$m$ times the expectation and variance for the geometric. Thus

$$E[X_m] = m q/p \qquad \mathrm{Var}[X_m] = m q/p^2$$
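As a check on the last claim (our sketch), differentiate $M_{X_m}(s) = [p/(1 - qe^s)]^m$ at $s = 0$
to recover $E[X_m] = mq/p$ and $\mathrm{Var}[X_m] = mq/p^2$:

    import sympy as sp

    s, p, m = sp.symbols('s p m', positive=True)
    q = 1 - p
    M = (p / (1 - q * sp.exp(s)))**m          # negative binomial MGF

    EX  = sp.simplify(sp.diff(M, s, 1).subs(s, 0))   # m*q/p
    EX2 = sp.simplify(sp.diff(M, s, 2).subs(s, 0))
    print(EX, sp.simplify(EX2 - EX**2))              # variance m*q/p**2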
Poisson $(\mu)$. In [link], above, we establish $g_X(s) = e^{\mu(s - 1)}$ and $M_X(s) = e^{\mu(e^s - 1)}$.
If $\{X, Y\}$ is an independent pair, with $X \sim$ Poisson $(\lambda)$ and $Y \sim$ Poisson $(\mu)$,
then $Z = X + Y \sim$ Poisson $(\lambda + \mu)$. This follows from (T2) and the product of exponentials.
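A one-line symbolic check of this claim (our sketch): the product of the two generating functions,
$e^{\lambda(s-1)} e^{\mu(s-1)} = e^{(\lambda+\mu)(s-1)}$, is the generating function of a
Poisson ($\lambda + \mu$) variable.

    import sympy as sp

    s, lam, mu = sp.symbols('s lambda mu', positive=True)
    gX, gY = sp.exp(lam * (s - 1)), sp.exp(mu * (s - 1))
    gZ = sp.exp((lam + mu) * (s - 1))            # Poisson(lambda + mu) generating function

    print(sp.powsimp(sp.expand(gX * gY - gZ)))   # 0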