
Nonnegative random variables

For any nonnegative random variable X there is a nondecreasing sequence of nonnegative simple random variables $X_n$ converging to X. Monotonicity implies the corresponding integrals form a nondecreasing sequence of real numbers, which must either have a finite limit or increase without bound (in which case we say the limit is infinite). We define $E[X] = \lim_n E[X_n]$.

Two questions arise.

  1. Is the limit unique? The approximating sequences for a random variable are not unique, although their limit X is the same.
  2. Is the definition consistent? If the limit random variable X is simple, does the new definition coincide with the old?

The fundamental lemma and monotone convergence may be used to show that the answer to both questions is affirmative, so that the definition is reasonable. Also, the six fundamental properties survive the passage to the limit.
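
To make the limiting construction concrete, here is a minimal numerical sketch (an illustration added here, not part of the original text) using the standard dyadic approximation $X_n = \min(\lfloor 2^n X \rfloor / 2^n,\, n)$; the exponential distribution and the NumPy calls are assumptions chosen purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative nonnegative X: exponential with mean 2 (an arbitrary choice).
x = rng.exponential(scale=2.0, size=100_000)

# Dyadic "staircase" approximation: X_n = min(floor(2^n X)/2^n, n) is simple,
# nondecreasing in n, and converges up to X at every sample point.
for n in range(1, 7):
    x_n = np.minimum(np.floor(2.0**n * x) / 2.0**n, n)
    print(n, x_n.mean())      # sample E[X_n] increases toward E[X] = 2
```

The printed sample means increase with n, mirroring the monotone limit that defines E[X].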

As a simple application of these ideas, consider discrete random variables such as the geometric $(p)$ or Poisson $(\mu)$, which are integer-valued but unbounded.

Unbounded, nonnegative, integer-valued random variables

The random variable X may be expressed

$$X = \sum_{k=0}^{\infty} k I_{E_k}, \quad \text{where } E_k = \{X = k\} \text{ with } P(E_k) = p_k$$

Let

$$X_n = \sum_{k=0}^{n-1} k I_{E_k} + n I_{B_n}, \quad \text{where } B_n = \{X \ge n\}$$

Then each $X_n$ is a simple random variable with $X_n \le X_{n+1}$. If $X(\omega) = k$, then $X_n(\omega) = k = X(\omega)$ for all $n \ge k + 1$. Hence, $X_n(\omega) \to X(\omega)$ for all $\omega$. By monotone convergence, $E[X_n] \to E[X]$. Now

$$E[X_n] = \sum_{k=1}^{n-1} k P(E_k) + n P(B_n)$$

If $\sum_{k=0}^{\infty} k P(E_k) < \infty$, then

$$0 \le n P(B_n) = n \sum_{k=n}^{\infty} P(E_k) \le \sum_{k=n}^{\infty} k P(E_k) \to 0 \text{ as } n \to \infty$$

Hence

$$E[X] = \lim_n E[X_n] = \sum_{k=0}^{\infty} k P(E_k)$$
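
As a check on this formula, the following sketch (an added illustration, not from the original text) evaluates $E[X_n] = \sum_{k=0}^{n-1} k P(E_k) + n P(B_n)$ for a Poisson($\mu$) variable with an assumed value of $\mu$, and watches it converge to $\sum_k k P(E_k) = \mu$.

```python
import math

mu = 3.0

def p(k):
    # Poisson(mu) probabilities p_k = P(X = k)
    return math.exp(-mu) * mu**k / math.factorial(k)

for n in (5, 10, 20, 40):
    tail = 1.0 - sum(p(k) for k in range(n))             # P(B_n) = P(X >= n)
    e_xn = sum(k * p(k) for k in range(n)) + n * tail    # E[X_n]
    print(n, e_xn)   # increases toward E[X] = mu = 3
```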

We may use this result to establish the expectation for the geometric and Poisson distributions.

$X \sim$ geometric $(p)$

We have $p_k = P(X = k) = q^k p$, $0 \le k$. By the result of [link],

$$E[X] = \sum_{k=0}^{\infty} k p q^k = p q \sum_{k=1}^{\infty} k q^{k-1} = \frac{p q}{(1 - q)^2} = q/p$$

For $Y - 1 \sim$ geometric $(p)$, $p_k = p q^{k-1}$, so that $E[Y] = \frac{1}{q} E[X] = 1/p$.
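
A quick simulation check of both expectations (an added illustration; the NumPy sampler and parameter value are assumptions). Note that NumPy's geometric generator returns the trial number of the first success, i.e. the variable Y above, so X = Y - 1.

```python
import numpy as np

rng = np.random.default_rng(1)
p, size = 0.3, 200_000
q = 1 - p

y = rng.geometric(p, size=size)   # support 1, 2, ... (trial of first success)
x = y - 1                         # failures before the first success

print(x.mean(), q / p)   # E[X] = q/p ~ 2.333
print(y.mean(), 1 / p)   # E[Y] = 1/p ~ 3.333
```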


$X \sim$ Poisson $(\mu)$

We have $p_k = e^{-\mu} \dfrac{\mu^k}{k!}$. By the result of [link],

$$E[X] = e^{-\mu} \sum_{k=0}^{\infty} k \frac{\mu^k}{k!} = \mu e^{-\mu} \sum_{k=1}^{\infty} \frac{\mu^{k-1}}{(k-1)!} = \mu e^{-\mu} e^{\mu} = \mu$$
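
A corresponding simulation check for the Poisson case (an added illustration with an assumed parameter value):

```python
import numpy as np

rng = np.random.default_rng(2)
mu = 2.5

x = rng.poisson(lam=mu, size=200_000)
print(x.mean(), mu)   # the sample mean of X is close to E[X] = mu
```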

The general case

We make use of the fact that $X = X^+ - X^-$, where both $X^+$ and $X^-$ are nonnegative. Then

$$E[X] = E[X^+] - E[X^-] \quad \text{provided at least one of } E[X^+],\ E[X^-] \text{ is finite}$$

Definition. If both $E[X^+]$ and $E[X^-]$ are finite, X is said to be integrable.

The term integrable comes from the relation of expectation to the abstract Lebesgue integral of measure theory.
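
The decomposition into positive and negative parts is easy to mirror numerically. Below is a minimal sketch (an added illustration; the shifted-normal example is an assumption) computing $E[X^+] - E[X^-]$ and comparing it with the plain sample mean.

```python
import numpy as np

rng = np.random.default_rng(3)

# X = Z - 1 with Z standard normal: both parts have finite expectation,
# so X is integrable and E[X] = E[X^+] - E[X^-] = -1.
x = rng.standard_normal(500_000) - 1.0
x_plus = np.maximum(x, 0.0)     # X^+ = max(X, 0)
x_minus = np.maximum(-x, 0.0)   # X^- = max(-X, 0)

print(x_plus.mean() - x_minus.mean(), x.mean())   # both ~ -1
```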

Again, the basic properties survive the extension. The property (E0) is subsumed in a more general uniqueness property noted in the list of properties discussed below.

Theoretical note

The development of expectation sketched above is exactly the development of the Lebesgue integral of the random variable X as a measurable function on the basic probability space ( Ω , F , P ) , so that

$$E[X] = \int_\Omega X \, dP$$

As a consequence, we may utilize the properties of the general Lebesgue integral. In its abstract form, it is not particularly useful for actual calculations. A careful use of the mapping of probability mass to the real line by random variable X produces a corresponding mapping of the integral on the basic space to an integral on the real line. Although this integral is also a Lebesgue integral, it agrees with the ordinary Riemann integral of calculus when the latter exists, so that ordinary integrals may be used to compute expectations.
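
For instance, when X has a density, the induced integral on the real line is an ordinary integral of $t f_X(t)$, which can be evaluated by elementary numerical quadrature. A minimal sketch (an added illustration; the exponential density and truncation point are assumptions):

```python
import numpy as np

# X ~ exponential(1): E[X] = integral over [0, inf) of t * e^{-t} dt = 1.
# A plain Riemann sum over a truncated range recovers this ordinary integral.
dt = 0.001
t = np.arange(0.0, 50.0, dt)
print(np.sum(t * np.exp(-t)) * dt)   # ~ 1.0
```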

Source: OpenStax, Applied Probability. OpenStax CNX. Aug 31, 2009. Download for free at http://cnx.org/content/col10708/1.6