Once the generation of the uniform random variable is established, it can be used to generate other types of random variables.
THEOREM I
Let X have a continuous distribution function F, so that the inverse F⁻¹(u) exists (and, one hopes, is computable) for 0 < u < 1. Then the random variable X = F⁻¹(U) has distribution function F, where U is uniformly distributed on (0, 1).
PROOF
Because F is monotone,
P(X ≤ x) = P(F⁻¹(U) ≤ x) = P(U ≤ F(x)) = F(x).
The last step follows because U is uniformly distributed on (0, 1). Diagrammatically, we have that F⁻¹(U) ≤ x if and only if U ≤ F(x), an event of probability F(x).
As long as we can invert the distribution function F to get the inverse distribution function F⁻¹, the theorem assures us we can start with a pseudo-random uniform variable U and turn it into a random variable X = F⁻¹(U), which has the required distribution F.
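The theorem translates directly into code. The following Python sketch (the function name is my own, not from the text) draws variates from any distribution for which an inverse CDF is supplied:

```python
import math
import random

def inverse_transform_sample(inv_cdf, n):
    """Draw n variates from the distribution whose inverse CDF is inv_cdf."""
    return [inv_cdf(random.random()) for _ in range(n)]

# Example: F(x) = x**2 on (0, 1) has inverse F^-1(u) = sqrt(u).
samples = inverse_transform_sample(math.sqrt, 5)
```

The same function is reused below by passing in the inverse CDF of each distribution as it is derived.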
The Exponential Distribution
Consider the exponential distribution defined by
F(x) = 1 − e^(−λx),  x ≥ 0, λ > 0.
Then for the inverse distribution function we have
x = F⁻¹(u) = −(1/λ) ln(1 − u).
Thus if U is uniformly distributed on (0, 1), then X = −(1/λ) ln(1 − U) has the distribution of an exponential random variable with parameter λ. (Since 1 − U is itself uniform on (0, 1), X = −(1/λ) ln U works equally well.) We say, for convenience, that X is exponential (λ).
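As a minimal Python sketch of this recipe (the function name is my own):

```python
import math
import random

def exponential_variate(lam):
    # Inverse probability integral transform: X = -(1/lam) * ln(1 - U)
    u = random.random()
    return -math.log(1.0 - u) / lam
```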
Normal and Gamma Distributions
For both these cases there is no simple functional form for the inverse distribution function F⁻¹, but because of the importance of the Normal and Gamma distribution models, a great deal of effort has been expended in deriving good approximations.
The Normal distribution is defined through its density,
f(x) = (1/(σ√(2π))) e^(−(x − μ)²/(2σ²)),  −∞ < x < ∞,
so that
F(x) = ∫ f(t) dt, the integral taken from −∞ to x.
The normal distribution function is also often denoted Φ(x) when the parameters μ and σ are set to 0 and 1, respectively. The distribution has no closed-form inverse, Φ⁻¹, but the inverse is needed so often that Φ⁻¹, like logarithms or exponentials, is a system function.
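In Python, for instance, Φ⁻¹ is available in the standard library (as `NormalDist.inv_cdf`, Python 3.8+), so the inverse transform method applies directly; a minimal sketch (the wrapper name is mine):

```python
import random
from statistics import NormalDist

def normal_variate(mu=0.0, sigma=1.0):
    # Phi-inverse is a "system function" here: the standard library supplies it
    u = random.random()
    return NormalDist(mu, sigma).inv_cdf(u)
```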
The inverse of the Gamma distribution function, which is given by
F(x) = (λ^k / Γ(k)) ∫₀^x t^(k−1) e^(−λt) dt,  x ≥ 0,
is more difficult to compute because its shape changes radically with the value of k. It is, however, available on most computers as a numerically reliable function.
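Where no library routine is at hand, the inverse can be computed numerically. The following self-contained Python sketch (names, tolerances, and the bracketing cutoff are my own choices, not from the text) evaluates the Gamma CDF by the standard series for the incomplete gamma function and inverts it by bisection:

```python
import math

def gamma_cdf(x, k, lam=1.0):
    """Regularized incomplete gamma P(k, lam*x), via the standard series expansion."""
    t = lam * x
    if t <= 0.0:
        return 0.0
    if t > k + 40.0 * math.sqrt(k) + 40.0:
        return 1.0  # far right tail; avoids overflow in the series below
    term = 1.0 / k
    total = term
    n = 1
    while term >= 1e-12 * total:
        term *= t / (k + n)
        total += term
        n += 1
    return total * math.exp(k * math.log(t) - t - math.lgamma(k))

def gamma_inv_cdf(u, k, lam=1.0):
    """Invert the Gamma CDF by bisection -- reliable, though not the fastest method."""
    lo, hi = 0.0, (k + 40.0 * math.sqrt(k) + 40.0) / lam  # bracket holds all but negligible mass
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if gamma_cdf(mid, k, lam) < u:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For k = 1 the Gamma distribution reduces to the exponential, which gives a simple check: F⁻¹(0.5) should equal ln 2.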
The Logistic Distribution
A commonly used symmetric distribution, which has a shape very much like that of the Normal distribution, is the standardized logistic distribution,
F(x) = 1/(1 + e^(−x)),  −∞ < x < ∞,
with probability density function
f(x) = e^(−x)/(1 + e^(−x))²,  −∞ < x < ∞.
The inverse is obtained by setting u = 1/(1 + e^(−x)). Then u(1 + e^(−x)) = 1, or e^(−x) = (1 − u)/u.
Therefore,
x = F⁻¹(u) = ln u − ln(1 − u),
and the random variable is generated, using the inverse probability integral transform method, as
X = ln U − ln(1 − U).
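A minimal Python sketch of this generator (the function name is my own):

```python
import math
import random

def logistic_variate():
    # X = ln U - ln(1 - U), the logistic inverse CDF applied to a uniform
    u = random.random()
    return math.log(u) - math.log(1.0 - u)
```

Since the distribution is symmetric about zero, roughly half of the generated values should be negative.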
Discrete Distributions
Let X have a discrete distribution F; that is, F jumps at the points x_k, k = 0, 1, 2, …. Usually we have the case that x_k = k, so that X is integer-valued.
Let the probability function be denoted by
p(k) = P(X = x_k),  k = 0, 1, 2, ….
The probability distribution function is then
F(x_k) = P(X ≤ x_k) = Σ_{j ≤ k} p(j),
and the reliability or survivor function is
R(x_k) = P(X ≥ x_k) = Σ_{j ≥ k} p(j).
The survivor function is sometimes easier to work with than the distribution function, and in fields such as reliability, it is habitually used. The inverse probability integral transform method of generating discrete random variables is based on the following theorem.
THEOREM
Let U be uniformly distributed in the interval (0, 1). Set X = x_k whenever F(x_{k−1}) < U ≤ F(x_k), for k = 0, 1, 2, …, with F(x_{−1}) = 0. Then X has probability function p(k).
PROOF
By definition of the procedure,
X = x_k if and only if F(x_{k−1}) < U ≤ F(x_k).
Therefore,
P(X = x_k) = P(F(x_{k−1}) < U ≤ F(x_k)) = F(x_k) − F(x_{k−1}) = p(k),
by the definition of the distribution function of a uniform (0, 1) random variable.
Thus the inverse probability integral transform algorithm for generating X is to find x_k such that F(x_{k−1}) < U and U ≤ F(x_k), and then set X = x_k.
In the discrete case, there is never any problem of numerically computing the inverse distribution function, but the search to find the values F(x_{k−1}) and F(x_k) between which U lies can be time-consuming; generally, sophisticated search procedures are required. In implementing this procedure, we try to minimize the number of times one compares U to F(x_k). If we want to generate many values of X, and F(x_k) is not easily computable, we may also want to store F(x_k) for all k rather than recompute it. Then we have to worry about minimizing the total memory to store the values of F(x_k).
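Both ideas, storing the cumulative values and using an efficient search, can be sketched in Python as follows (the function names are my own; the search here is a binary search over the stored CDF):

```python
import bisect
import random

def make_discrete_sampler(pmf):
    """Return a sampler for the distribution with probabilities pmf[k] = p(k).

    The cumulative values F(x_k) are computed once and stored, so each
    draw costs only a binary search for the interval containing U.
    """
    cdf = []
    total = 0.0
    for p in pmf:
        total += p
        cdf.append(total)

    def sample():
        u = random.random()
        # smallest k with U <= F(x_k), i.e. F(x_{k-1}) < U <= F(x_k)
        return bisect.bisect_left(cdf, u)

    return sample

sample = make_discrete_sampler([0.2, 0.5, 0.3])
```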
The Binary Random Variable
To generate a binary-valued random variable X that is 1 with probability p and 0 with probability 1 − p, the algorithm is: generate U; if U ≤ p, set X = 1; otherwise set X = 0.
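In Python this one-comparison algorithm is (the function name is my own):

```python
import random

def binary_variate(p):
    # X = 1 with probability p, else 0
    return 1 if random.random() <= p else 0
```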
The Discrete Uniform Random Variable
Let X take on integer values between and including the integers a and b, where a < b, with equal probabilities. Since there are b − a + 1 distinct values for X, the probability of getting any one of these values is, by definition, 1/(b − a + 1). If we start with a continuous uniform (0, 1) random number U, then the discrete inverse probability integral transform shows that
X = a + integer part of ((b − a + 1) U).
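As a minimal Python sketch (the function name is my own), with a die roll as the usage example:

```python
import math
import random

def discrete_uniform_variate(a, b):
    # X = a + floor((b - a + 1) * U), equally likely on {a, ..., b}
    u = random.random()
    return a + int(math.floor((b - a + 1) * u))

roll = discrete_uniform_variate(1, 6)  # a fair six-sided die
```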
The Geometric Distribution
Let X take values on zero and the positive integers with a geometric distribution. Thus,
p(k) = P(X = k) = p(1 − p)^k,  k = 0, 1, 2, …, 0 < p < 1,
and
F(k) = P(X ≤ k) = 1 − (1 − p)^(k+1),  k = 0, 1, 2, ….
To generate geometrically distributed random variables, then, one can proceed successively according to the following algorithm: generate U; set k = 0; while U > F(k), increase k by 1; then deliver X = k.
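The sequential search above can be sketched in Python as (the function name is my own):

```python
import random

def geometric_variate(p):
    """Sequential-search inverse transform for the geometric distribution.

    Compares U with F(k) = 1 - (1 - p)**(k + 1), increasing k until U <= F(k).
    """
    u = random.random()
    k = 0
    tail = 1.0 - p    # (1 - p)**(k + 1)
    cdf = p           # F(0) = p
    while u > cdf:
        k += 1
        tail *= 1.0 - p
        cdf = 1.0 - tail
    return k
```

Because F here has a closed-form inverse, the search can also be avoided entirely: X = integer part of (ln(1 − U) / ln(1 − p)) has the same distribution.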