
Random variable X has moment generating function

M_X(s) = exp(3(e^s - 1)) · 1/(1 - 5s) · exp(16s^2/2 + 3s)

By recognizing forms and using rules of combinations, determine E[X] and Var[X].

X = X_1 + X_2 + X_3, with X_1 ~ Poisson(3), X_2 ~ exponential(1/5), X_3 ~ N(3, 16).

E[X] = 3 + 5 + 3 = 11    Var[X] = 3 + 25 + 16 = 44
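The recognition argument can be checked numerically by differentiating the mgf at s = 0, since E[X] = M'(0) and Var[X] = M''(0) - M'(0)^2. A minimal sketch using Python's sympy (not the text's MATLAB toolbox):

```python
import sympy as sp

s = sp.symbols('s')
# The given mgf: Poisson(3), exponential(1/5), and N(3, 16) factors
M = sp.exp(3*(sp.exp(s) - 1)) / (1 - 5*s) * sp.exp(16*s**2/2 + 3*s)

EX = sp.simplify(sp.diff(M, s).subs(s, 0))      # first moment
EX2 = sp.simplify(sp.diff(M, s, 2).subs(s, 0))  # second moment
Var = sp.simplify(EX2 - EX**2)
print(EX, Var)  # 11 44
```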

Suppose the class { A , B , C } of events is independent, with respective probabilities 0.3, 0.5, 0.2. Consider

X = -3 I_A + 2 I_B + 4 I_C
  1. Determine the moment generating functions for I_A, I_B, I_C and use properties of moment generating functions to determine the moment generating function for X.
  2. Use the moment generating function to determine the distribution for X.
  3. Use canonic to determine the distribution. Compare with the result of part (b).
  4. Use the distributions for the separate terms to determine the distribution for the sum with mgsum3. Compare with the result of part (b).
M_X(s) = (0.7 + 0.3e^(-3s))(0.5 + 0.5e^(2s))(0.8 + 0.2e^(4s)) =
0.12e^(-3s) + 0.12e^(-s) + 0.28 + 0.03e^s + 0.28e^(2s) + 0.03e^(3s) + 0.07e^(4s) + 0.07e^(6s)

The distribution is

X =  [-3    -1     0     1     2     3     4     6]
PX = [0.12  0.12  0.28  0.03  0.28  0.03  0.07  0.07]
c = [-3 2 4 0];
P = 0.1*[3 5 2];
canonic
 Enter row vector of coefficients  c
 Enter row vector of minterm probabilities  minprob(P)
Use row matrices X and PX for calculations
Call for XDBN to view the distribution
P1 = [0.7 0.3]; P2 = [0.5 0.5]; P3 = [0.8 0.2];
X1 = [0 -3];    X2 = [0 2];     X3 = [0 4];
[x,px] = mgsum3(X1,X2,X3,P1,P2,P3);
disp([X;PX;x;px]')
   -3.0000    0.1200   -3.0000    0.1200
   -1.0000    0.1200   -1.0000    0.1200
         0    0.2800         0    0.2800
    1.0000    0.0300    1.0000    0.0300
    2.0000    0.2800    2.0000    0.2800
    3.0000    0.0300    3.0000    0.0300
    4.0000    0.0700    4.0000    0.0700
    6.0000    0.0700    6.0000    0.0700
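The convolution that mgsum3 performs can be sketched in plain Python. The mgsum function below is a hypothetical stand-in for the toolbox routine, built by enumerating value combinations of the independent terms:

```python
import math
from collections import defaultdict
from itertools import product

def mgsum(*pairs):
    """Distribution of a sum of independent simple random variables;
    each pair is (values, probabilities)."""
    dist = defaultdict(float)
    for combo in product(*(zip(v, p) for v, p in pairs)):
        val = sum(v for v, _ in combo)              # sum of chosen values
        dist[val] += math.prod(q for _, q in combo)  # product of probabilities
    return sorted(dist.items())

# X = -3*I_A + 2*I_B + 4*I_C as a sum of three two-point terms
dist = mgsum(([0, -3], [0.7, 0.3]),
             ([0, 2], [0.5, 0.5]),
             ([0, 4], [0.8, 0.2]))
for v, p in dist:
    print(v, round(p, 4))
```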

Suppose the pair { X , Y } is independent, with both X and Y binomial. Use generating functions to show under what condition, if any, X + Y is binomial.

Binomial iff both have same p , as shown below.

g_{X+Y}(s) = (q_1 + p_1 s)^n (q_2 + p_2 s)^m = (q + ps)^(n+m) iff p_1 = p_2
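A numerical illustration of the same fact (a sketch, not from the text): convolving the pmfs of binomial(4, 0.3) and binomial(6, 0.3) reproduces binomial(10, 0.3).

```python
from math import comb

def binom_pmf(n, p):
    return [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

def convolve(a, b):
    # pmf of the sum of two independent integer-valued variables
    out = [0.0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            out[i + j] += x * y
    return out

same_p = convolve(binom_pmf(4, 0.3), binom_pmf(6, 0.3))
err = max(abs(a - b) for a, b in zip(same_p, binom_pmf(10, 0.3)))
print(err)  # essentially zero
```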

Suppose the pair { X , Y } is independent, with both X and Y Poisson.

  1. Use generating functions to show under what condition X + Y is Poisson.
  2. What about X - Y ? Justify your answer.

Always Poisson, as the argument below shows.

g_{X+Y}(s) = e^(μ(s-1)) e^(ν(s-1)) = e^((μ+ν)(s-1))

However, X - Y can take negative values, so it cannot be Poisson.
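The identity for the sum can also be checked numerically. A small sketch convolving truncated Poisson pmfs (values beyond K are dropped, which does not affect the terms actually computed):

```python
from math import exp, factorial

def pois_pmf(lam, kmax):
    return [exp(-lam) * lam**k / factorial(k) for k in range(kmax + 1)]

mu, nu, K = 2.0, 3.0, 15
pa, pb = pois_pmf(mu, K), pois_pmf(nu, K)
# conv[k] only needs indices up to k, so truncation is harmless here
conv = [sum(pa[i] * pb[k - i] for i in range(k + 1)) for k in range(K + 1)]
err = max(abs(a - b) for a, b in zip(conv, pois_pmf(mu + nu, K)))
print(err)  # essentially zero
```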


Suppose the pair { X , Y } is independent, Y is nonnegative integer-valued, X is Poisson and X + Y is Poisson. Use the generating functions to show that Y is Poisson.

E[X + Y] = μ + ν, where ν = E[Y] > 0. Since g_X(s) = e^(μ(s-1)) and g_{X+Y}(s) = g_X(s) g_Y(s) = e^((μ+ν)(s-1)), division by g_X(s) gives g_Y(s) = e^(ν(s-1)).


Suppose the pair {X, Y} is iid, binomial(6, 0.51). By the result of [link], X + Y is binomial. Use mgsum to obtain the distribution for Z = 2X + 4Y. Does Z have the binomial distribution? Is the result surprising? Examine the first few possible values for Z. Write the generating function for Z; does it have the form for the binomial distribution?

x = 0:6;
px = ibinom(6,0.51,x);
[Z,PZ] = mgsum(2*x,4*x,px,px);
disp([Z(1:5);PZ(1:5)]')
        0    0.0002   % Cannot be binomial, since odd values are missing
   2.0000    0.0012
   4.0000    0.0043
   6.0000    0.0118
   8.0000    0.0259
g_X(s) = g_Y(s) = (0.49 + 0.51s)^6    g_Z(s) = (0.49 + 0.51s^2)^6 (0.49 + 0.51s^4)^6
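A Python sketch of the same computation (ibinom and mgsum are toolbox functions; here their effect is reproduced directly):

```python
from math import comb
from collections import defaultdict

# pmf of binomial(6, 0.51), the toolbox's ibinom(6, 0.51, 0:6)
px = [comb(6, k) * 0.51**k * 0.49**(6 - k) for k in range(7)]

# distribution of Z = 2X + 4Y for X, Y iid with pmf px
dist = defaultdict(float)
for i in range(7):
    for j in range(7):
        dist[2 * i + 4 * j] += px[i] * px[j]

Z = sorted(dist)
print(Z[:5])                       # [0, 2, 4, 6, 8]
print(all(z % 2 == 0 for z in Z))  # True: odd values missing, so not binomial
```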

Suppose the pair {X, Y} is independent, with X ~ binomial(5, 0.33) and Y ~ binomial(7, 0.47).

Let G = g(X) = 3X^2 - 2X and H = h(Y) = 2Y^2 + Y + 3.

  1. Use mgsum to obtain the distribution for G + H.
  2. Use icalc and csort to obtain the distribution for G + H and compare with the result of part (a).
X = 0:5; Y = 0:7;
PX = ibinom(5,0.33,X); PY = ibinom(7,0.47,Y);
G = 3*X.^2 - 2*X; H = 2*Y.^2 + Y + 3;
[Z,PZ] = mgsum(G,H,PX,PY);
icalc
 Enter row matrix of X-values  X
 Enter row matrix of Y-values  Y
 Enter X probabilities  PX
 Enter Y probabilities  PY
Use array operations on matrices X, Y, PX, PY, t, u, and P
M = 3*t.^2 - 2*t + 2*u.^2 + u + 3;
[z,pz] = csort(M,P);
e = max(abs(pz - PZ))   % Comparison of p values
e = 0

Suppose the pair {X, Y} is independent, with X ~ binomial(8, 0.39) and Y uniform on {-1.3, -0.5, 1.3, 2.2, 3.5}. Let

U = 3X^2 - 2X + 1 and V = Y^3 + 2Y - 3
  1. Use mgsum to obtain the distribution for U + V .
  2. Use icalc and csort to obtain the distribution for U + V and compare with the result of part (a).
X = 0:8; Y = [-1.3 -0.5 1.3 2.2 3.5];
PX = ibinom(8,0.39,X); PY = (1/5)*ones(1,5);
U = 3*X.^2 - 2*X + 1; V = Y.^3 + 2*Y - 3;
[Z,PZ] = mgsum(U,V,PX,PY);
icalc
 Enter row matrix of X-values  X
 Enter row matrix of Y-values  Y
 Enter X probabilities  PX
 Enter Y probabilities  PY
Use array operations on matrices X, Y, PX, PY, t, u, and P
M = 3*t.^2 - 2*t + 1 + u.^3 + 2*u - 3;
[z,pz] = csort(M,P);
e = max(abs(pz - PZ))
e = 0

If X is a nonnegative integer-valued random variable, express the generating function as a power series.

  1. Show that the kth derivative at s = 1 is
     g_X^(k)(1) = E[X(X-1)(X-2)···(X-k+1)]
  2. Use this to show that Var[X] = g_X''(1) + g_X'(1) - [g_X'(1)]^2.

Since a power series may be differentiated term by term,

g_X^(n)(s) = Σ_{k=n}^∞ k(k-1)···(k-n+1) p_k s^(k-n)   so that

g_X^(n)(1) = Σ_{k=n}^∞ k(k-1)···(k-n+1) p_k = E[X(X-1)···(X-n+1)]

Var[X] = E[X^2] - E^2[X] = E[X(X-1)] + E[X] - E^2[X] = g_X''(1) + g_X'(1) - [g_X'(1)]^2
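Part (b) can be illustrated on a concrete case with sympy (an illustration, not part of the original exercise): for Poisson(λ), g(s) = e^(λ(s-1)), and the formula should return the known variance λ.

```python
import sympy as sp

s, lam = sp.symbols('s lam', positive=True)
g = sp.exp(lam * (s - 1))          # generating function of Poisson(lam)
g1 = sp.diff(g, s).subs(s, 1)      # g'(1) = E[X]
g2 = sp.diff(g, s, 2).subs(s, 1)   # g''(1) = E[X(X-1)]
var = sp.simplify(g2 + g1 - g1**2)
print(var)  # lam
```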

Let M_X(·) be the moment generating function for X.

  1. Show that Var[X] is the second derivative of e^(-sμ) M_X(s) evaluated at s = 0.
  2. Use this fact to show that if X ~ N(μ, σ^2), then Var[X] = σ^2.
f(s) = e^(-sμ) M_X(s)    f''(s) = e^(-sμ)[-μM_X'(s) + μ^2 M_X(s) + M_X''(s) - μM_X'(s)]

Setting s = 0 and using the result on moments gives

f''(0) = -μ^2 + μ^2 + E[X^2] - μ^2 = Var[X]

Use derivatives of M_X(s) to obtain the mean and variance of the negative binomial (m, p) distribution.

To simplify the writing, use f(s) for M_X(s).

f(s) = p^m (1 - qe^s)^(-m)    f'(s) = m p^m q e^s (1 - qe^s)^(-(m+1))
f''(s) = m p^m q e^s (1 - qe^s)^(-(m+1)) + m(m+1) p^m q^2 e^(2s) (1 - qe^s)^(-(m+2))

E[X] = m p^m q (1 - q)^(-(m+1)) = mq/p    E[X^2] = mq/p + m(m+1) p^m q^2 (1 - q)^(-(m+2)) = mq/p + m(m+1) q^2/p^2

Var[X] = mq/p + m(m+1) q^2/p^2 - m^2 q^2/p^2 = mq/p^2
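The same derivatives can be checked symbolically; a sympy sketch of the computation above, with q = 1 - p:

```python
import sympy as sp

s, m, p = sp.symbols('s m p', positive=True)
q = 1 - p
f = p**m / (1 - q * sp.exp(s))**m      # negative binomial (m, p) mgf

EX = sp.simplify(sp.diff(f, s).subs(s, 0))       # mean
EX2 = sp.simplify(sp.diff(f, s, 2).subs(s, 0))   # second moment
Var = sp.simplify(EX2 - EX**2)

print(sp.simplify(EX - m * q / p))        # 0, so E[X] = mq/p
print(sp.simplify(Var - m * q / p**2))    # 0, so Var[X] = mq/p^2
```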

Use moment generating functions to show that variances add for the sum or difference of independent random variables.

To simplify the writing, set f(s) = M_X(s), g(s) = M_Y(s), and h(s) = M_X(s) M_Y(s). Then

h'(s) = f'(s)g(s) + f(s)g'(s)    h''(s) = f''(s)g(s) + 2f'(s)g'(s) + f(s)g''(s)

Setting s = 0 yields

E[X + Y] = E[X] + E[Y]    E[(X + Y)^2] = E[X^2] + 2E[X]E[Y] + E[Y^2]
E^2[X + Y] = E^2[X] + 2E[X]E[Y] + E^2[Y]

Taking the difference gives Var[X + Y] = Var[X] + Var[Y]. A similar treatment with g(s) replaced by g(-s) shows Var[X - Y] = Var[X] + Var[Y].
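A quick numerical sanity check of the additivity result, using two small independent discrete distributions with arbitrarily chosen values and probabilities (a sketch, not from the text):

```python
def mean_var(vals, probs):
    m = sum(v * p for v, p in zip(vals, probs))
    return m, sum(v * v * p for v, p in zip(vals, probs)) - m * m

xv, xp = [0, 1, 4], [0.2, 0.5, 0.3]
yv, yp = [-2, 3], [0.6, 0.4]
target = mean_var(xv, xp)[1] + mean_var(yv, yp)[1]   # Var[X] + Var[Y]

variances = []
for sign in (+1, -1):        # X + Y, then X - Y
    vals = [a + sign * b for a in xv for b in yv]
    probs = [pa * pb for pa in xp for pb in yp]      # independence
    variances.append(mean_var(vals, probs)[1])

print(target, variances)  # all three values agree
```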


The pair {X, Y} is iid N(3, 5). Use the moment generating function to show that Z = 3X - 2Y + 3 is normal (see Example 3 from "Transform Methods" for the general result).

M_{3X}(s) = M_X(3s) = exp(9·5 s^2/2 + 3·3 s)    M_{-2Y}(s) = M_Y(-2s) = exp(4·5 s^2/2 - 2·3 s)

M_Z(s) = e^(3s) exp((45 + 20) s^2/2 + (9 - 6) s) = exp(65 s^2/2 + 6 s)
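The combination of the exponents can be verified with sympy (an illustrative check, not part of the original solution):

```python
import sympy as sp

s = sp.symbols('s')
M3X = sp.exp(9 * 5 * s**2 / 2 + 9 * s)    # M_X(3s) for X ~ N(3, 5)
Mm2Y = sp.exp(4 * 5 * s**2 / 2 - 6 * s)   # M_Y(-2s) for Y ~ N(3, 5)
MZ = sp.exp(3 * s) * M3X * Mm2Y           # Z = 3X - 2Y + 3
# difference from the N(6, 65) mgf should simplify to zero
diff = sp.simplify(MZ - sp.exp(sp.Rational(65, 2) * s**2 + 6 * s))
print(diff)  # 0
```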

Use the central limit theorem to show that for large enough sample size (usually 20 or more), the sample average

A_n = (1/n) Σ_{i=1}^n X_i

is approximately N ( μ , σ 2 / n ) for any reasonable population distribution having mean value μ and variance σ 2 .

E[A_n] = (1/n) Σ_{i=1}^n μ = μ    Var[A_n] = (1/n^2) Σ_{i=1}^n σ^2 = σ^2/n

By the central limit theorem, A n is approximately normal, with the mean and variance above.
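A quick simulation sketch (not part of the original text): averages of n = 30 draws from an exponential population with mean 1 and variance 1 should have mean ≈ 1 and variance ≈ 1/30 ≈ 0.0333.

```python
import random

random.seed(1)
n, reps = 30, 5000
# sample averages of n exponential(1) draws, repeated reps times
avgs = [sum(random.expovariate(1.0) for _ in range(n)) / n
        for _ in range(reps)]
m = sum(avgs) / reps
v = sum((a - m) ** 2 for a in avgs) / reps
print(round(m, 3), round(v, 4))  # close to 1 and 0.0333
```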


A population has standard deviation approximately three. It is desired to determine the sample size n needed to ensure that with probability 0.95 the sample average will be within 0.5 of the mean value.

  1. Use the Chebyshev inequality to estimate the needed sample size.
  2. Use the normal approximation to estimate n (see Example 1 from "Simple Random Samples and Statistics").
  • P(|A_n - μ| ≥ 0.5) ≤ σ^2/(n · 0.5^2) = 3^2/(n · 0.5^2) ≤ 0.05 implies n ≥ 720
  • Use of the table in Example 1 from "Simple Random Samples and Statistics" shows
    n ≥ (3/0.5)^2 · 3.84 = 138.24, so n = 139
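Both bounds are direct arithmetic; a sketch (1.96 is the two-sided 95% normal quantile, so 1.96^2 ≈ 3.84):

```python
import math

sigma, eps, alpha = 3.0, 0.5, 0.05

# (a) Chebyshev: sigma^2 / (n * eps^2) <= alpha
n_cheb = math.ceil(sigma**2 / (eps**2 * alpha))

# (b) Normal approximation: n >= (sigma/eps)^2 * 1.96^2
n_norm = math.ceil((sigma / eps)**2 * 1.96**2)

print(n_cheb, n_norm)  # 720 139
```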

Source:  OpenStax, Applied probability. OpenStax CNX. Aug 31, 2009 Download for free at http://cnx.org/content/col10708/1.6