We note that for each $i$ and for each $j$

$$\sum_j P(A_i B_j) = P(A_i) \qquad \text{and} \qquad \sum_i P(A_i B_j) = P(B_j)$$

Hence, we may write

$$E[X + Y] = \sum_i t_i \sum_j P(A_i B_j) + \sum_j u_j \sum_i P(A_i B_j) = \sum_i t_i P(A_i) + \sum_j u_j P(B_j) = E[X] + E[Y]$$

Now $aX$ and $bY$ are simple if $X$ and $Y$ are, so that with the aid of [link] we have

$$E[aX + bY] = E[aX] + E[bY] = a E[X] + b E[Y]$$

If $X,\; Y,\; Z$ are simple, then so are $aX + bY$ and $cZ$. It follows that

$$E[aX + bY + cZ] = E[aX + bY] + c E[Z] = a E[X] + b E[Y] + c E[Z]$$
By an inductive argument, this pattern may be extended to a linear combination of any finite number of simple random variables. Thus we may assert
Linearity. The expectation of a linear combination of a finite number of simple random variables is that linear combination of the expectations of the individual random variables:

$$E\left[\sum_{k=1}^{n} a_k X_k\right] = \sum_{k=1}^{n} a_k E[X_k]$$
Expectation of a simple random variable in affine form
As a direct consequence of linearity, whenever simple random variable $X$ is in affine form, then

$$E[X] = E\left[c_0 + \sum_{j=1}^{n} c_j I_{E_j}\right] = c_0 + \sum_{j=1}^{n} c_j P(E_j)$$

Thus, the defining expression holds for any affine combination of indicator functions, whether in canonical form or not.
This random variable appears as the number of successes in $n$ Bernoulli trials with probability $p$ of success on each component trial. It is naturally expressed in affine form

$$X = \sum_{i=1}^{n} I_{E_i} \quad \text{so that} \quad E[X] = \sum_{i=1}^{n} P(E_i) = np$$

Alternately, in canonical form

$$X = \sum_{k=0}^{n} k\, I_{A_k}, \quad \text{with} \quad P(A_k) = C(n, k)\, p^k q^{n-k}, \quad q = 1 - p$$

so that

$$E[X] = \sum_{k=0}^{n} k\, C(n, k)\, p^k q^{n-k}$$

Some algebraic tricks may be used to show that the second form sums to $np$, but there is no need of that. The computation for the affine form is much simpler.
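The agreement of the two forms is easy to check numerically. The following Python sketch (with illustrative values $n = 10$, $p = 0.3$, not taken from the text) computes the expectation both ways:

```python
from math import comb

# Illustrative parameters (hypothetical, chosen for the demonstration)
n, p = 10, 0.3

# Affine form: X = I_E1 + ... + I_En with P(E_i) = p, so E[X] = n*p
E_affine = n * p

# Canonical form: E[X] = sum_k k * C(n,k) * p^k * (1-p)^(n-k)
E_canonical = sum(k * comb(n, k) * p**k * (1 - p)**(n - k)
                  for k in range(n + 1))

print(E_affine, E_canonical)  # both approximately 3.0
```

As the text notes, the affine form requires no algebraic tricks at all.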
A bettor places three bets at $2.00 each. The first bet pays $10.00 with probability 0.15, the second pays $8.00 with probability 0.20, and the third pays $20.00 with probability 0.10. What is the expected gain?
SOLUTION
The net gain may be expressed

$$X = 10 I_A + 8 I_B + 20 I_C - 6, \quad \text{with} \quad P(A) = 0.15,\; P(B) = 0.20,\; P(C) = 0.10$$

Then

$$E[X] = 10 \cdot 0.15 + 8 \cdot 0.20 + 20 \cdot 0.10 - 6 = -0.90$$
These calculations may be done in MATLAB as follows:
c = [10 8 20 -6];
p = [0.15 0.20 0.10 1.00];   % Constant -6 = -6*I_(Omega), with P(Omega) = 1
E = c*p'
E = -0.9000
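The same dot product can be mirrored in plain Python (a sketch using only the standard library):

```python
# Net gain X = 10*I_A + 8*I_B + 20*I_C - 6 in affine form;
# the constant -6 is treated as -6*I_Omega with P(Omega) = 1.
c = [10, 8, 20, -6]
p = [0.15, 0.20, 0.10, 1.00]

E = sum(ci * pi for ci, pi in zip(c, p))
print(E)  # approximately -0.9
```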
Functions of simple random variables
If $X = \sum_{j=1}^{m} c_j I_{C_j}$ is in a primitive form (including canonical form) and $g$ is a real function defined on the range of $X$, then

$$Z = g(X) = \sum_{j=1}^{m} g(c_j) I_{C_j} \quad \text{(in primitive form)}$$

so that

$$E[Z] = E[g(X)] = \sum_{j=1}^{m} g(c_j) P(C_j)$$
Alternately, we may use csort to determine the distribution for Z and work with that distribution.
Caution. If $X$ is in affine form (but not a primitive form)

$$X = c_0 + \sum_{j=1}^{n} c_j I_{E_j} \quad \text{then} \quad g(X) \ne g(c_0) + \sum_{j=1}^{n} g(c_j) I_{E_j}$$

so that

$$E[g(X)] \ne g(c_0) + \sum_{j=1}^{n} g(c_j) P(E_j)$$
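The caution can be made concrete with a small numerical illustration (the numbers here are hypothetical, not from the text). Take $X = 1 + I_E$ with $P(E) = 0.4$ and $g(t) = t^2$. Since $I_E^2 = I_E$, in fact $g(X) = 1 + 3 I_E$, so applying $g$ to the affine coefficients term by term gives the wrong expectation:

```python
pE = 0.4                 # hypothetical P(E)
g = lambda t: t * t      # g(t) = t^2

# Correct: work from the distribution of X = 1 + I_E.
# X = 1 with probability 0.6 and X = 2 with probability 0.4.
E_correct = g(1) * (1 - pE) + g(2) * pE     # 1*0.6 + 4*0.4

# Incorrect: apply g to the affine coefficients term by term,
# i.e., g(c0) + g(c1)*P(E).
E_wrong = g(1) + g(1) * pE

print(E_correct, E_wrong)  # 2.2 versus 1.4 -- they disagree
```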
Suppose $X$ in a primitive form is

$$X = -3 I_{C_1} - I_{C_2} + 2 I_{C_3} - 3 I_{C_4} + 4 I_{C_5} - I_{C_6} + I_{C_7} + 2 I_{C_8} + 3 I_{C_9} + 2 I_{C_{10}}$$

with probabilities $P(C_j) = 0.08,\ 0.11,\ 0.06,\ 0.13,\ 0.05,\ 0.08,\ 0.12,\ 0.07,\ 0.14,\ 0.16$, respectively.

Let $g(t) = t^2 + 2t$. Determine $E[g(X)]$.
c = [-3 -1 2 -3 4 -1 1 2 3 2];          % Original coefficients
pc = 0.01*[8 11 6 13 5 8 12 7 14 16];   % Probabilities for C_j
G = c.^2 + 2*c                          % g(c_j)
G = 3 -1 8 3 24 -1 3 8 15 8
EG = G*pc'                              % Direct computation
EG = 6.4200
[Z,PZ] = csort(G,pc);                   % Distribution for Z = g(X)
disp([Z;PZ]')                           % Optional display
   -1.0000    0.1900
    3.0000    0.3300
    8.0000    0.2900
   15.0000    0.1400
   24.0000    0.0500
EZ = Z*PZ'                              % E[Z] from distribution for Z
EZ = 6.4200
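For readers without the textbook's MATLAB toolbox, both approaches can be mirrored in plain Python; here a dictionary of accumulated probabilities plays the role of csort's consolidation of repeated values:

```python
from collections import defaultdict

c = [-3, -1, 2, -3, 4, -1, 1, 2, 3, 2]                 # coefficients c_j
pc = [0.08, 0.11, 0.06, 0.13, 0.05, 0.08, 0.12, 0.07, 0.14, 0.16]
g = lambda t: t**2 + 2*t                               # g(t) = t^2 + 2t

# (1) Direct computation: E[g(X)] = sum of g(c_j) * P(C_j)
EG = sum(g(cj) * pj for cj, pj in zip(c, pc))

# (2) csort-like step: consolidate probability over equal values of Z = g(X)
dist = defaultdict(float)
for cj, pj in zip(c, pc):
    dist[g(cj)] += pj
EZ = sum(z * pz for z, pz in dist.items())

print(round(EG, 4), round(EZ, 4))  # both 6.42
```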
A similar approach can be made to a function of a pair of simple random variables, provided the joint distribution is available. Suppose $X = \sum_{i=1}^{n} t_i I_{A_i}$ and $Y = \sum_{j=1}^{m} u_j I_{B_j}$ (both in canonical form). Then

$$Z = g(X, Y) = \sum_{i=1}^{n} \sum_{j=1}^{m} g(t_i, u_j) I_{A_i B_j}$$

The $A_i B_j$ form a partition, so $Z$ is in a primitive form. We have the same two alternative possibilities: (1) direct calculation from the values $g(t_i, u_j)$ and corresponding probabilities $P(A_i B_j) = P(X = t_i,\, Y = u_j)$, or (2) use of csort to obtain the distribution for $Z$.
We use the joint distribution in file jdemo1.m and let $g(t, u) = t^2 + 2tu - 3u$. To set up for calculations, we use jcalc.
% file jdemo1.m
X = [-2.37 -1.93 -0.47 -0.11 0 0.57 1.22 2.15 2.97 3.74];
Y = [-3.06 -1.44 -1.21 0.07 0.88 1.77 2.01 2.84];
P = 0.0001*[ 53   8 167 170 184  18  67 122  18  12;
             11  13 143 221 241 153  87 125 122 185;
            165 129 226 185  89 215  40  77  93 187;
            165 163 205  64  60  66 118 239  67 201;
            227   2 128  12 238 106 218 120 222  30;
             93  93  22 179 175 186 221  65 129   4;
            126  16 159  80 183 116  15  22 113 167;
            198 101 101 154 158  58 220 230 228 211];
jdemo1 % Call for data
jcalc                      % Set up
Enter JOINT PROBABILITIES (as on the plane)  P
Enter row matrix of VALUES of X  X
Enter row matrix of VALUES of Y  Y
Use array operations on matrices X, Y, PX, PY, t, u, and P
G = t.^2 + 2*t.*u - 3*u;   % Calculation of matrix of [g(t_i, u_j)]
EG = total(G.*P)           % Direct calculation of expectation
EG = 3.2529
[Z,PZ]= csort(G,P); % Determination of distribution for Z
EZ = Z*PZ'                 % E[Z] from distribution
EZ = 3.2529
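The same two computations can be mirrored in plain Python without the toolbox functions. This sketch assumes jcalc's "as on the plane" convention, i.e., that the first row of P pairs with the largest value of Y:

```python
from collections import defaultdict

# Data transcribed from jdemo1.m
X = [-2.37, -1.93, -0.47, -0.11, 0, 0.57, 1.22, 2.15, 2.97, 3.74]
Y = [-3.06, -1.44, -1.21, 0.07, 0.88, 1.77, 2.01, 2.84]
P = [[0.0001 * v for v in row] for row in [
    [ 53,   8, 167, 170, 184,  18,  67, 122,  18,  12],
    [ 11,  13, 143, 221, 241, 153,  87, 125, 122, 185],
    [165, 129, 226, 185,  89, 215,  40,  77,  93, 187],
    [165, 163, 205,  64,  60,  66, 118, 239,  67, 201],
    [227,   2, 128,  12, 238, 106, 218, 120, 222,  30],
    [ 93,  93,  22, 179, 175, 186, 221,  65, 129,   4],
    [126,  16, 159,  80, 183, 116,  15,  22, 113, 167],
    [198, 101, 101, 154, 158,  58, 220, 230, 228, 211],
]]
g = lambda t, u: t**2 + 2*t*u - 3*u

# "As on the plane": row i of P is assumed to pair with the i-th LARGEST Y
Yrev = list(reversed(Y))

# (1) Direct calculation: E[Z] = sum over i,j of g(t_j, u_i) * P(A_j B_i)
EG = sum(g(t, u) * P[i][j]
         for i, u in enumerate(Yrev)
         for j, t in enumerate(X))

# (2) Via the distribution of Z (csort-like consolidation of equal values)
dist = defaultdict(float)
for i, u in enumerate(Yrev):
    for j, t in enumerate(X):
        dist[g(t, u)] += P[i][j]
EZ = sum(z * pz for z, pz in dist.items())

print(EG, EZ)  # the two methods agree
```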