This module introduces sums of random variables.
Consider the random variable $Z$ formed as the sum of two independent random variables $X$ and $Y$:
$$Z = X + Y$$
where $X$ has pdf $f_X(x)$ and $Y$ has pdf $f_Y(y)$.
We can write the joint pdf for $z$ and $x$ by rewriting the conditional probability formula:
$$f(z, x) = f(z \mid x)\, f_X(x)$$
It is clear that the event '$Z$ takes the value $z$, conditional upon $X = x$' is equivalent to $Y$ taking the value $z - x$ (since $Y = Z - X$). Hence
$$f(z \mid x) = f_Y(z - x)$$
Now $f_Z(z)$ may be obtained using the Marginal Probability formula (see the corresponding equation in the discussion of probability density functions). Hence
$$f_Z(z) = \int_{-\infty}^{\infty} f_Y(z - x)\, f_X(x)\, dx = f_X(z) * f_Y(z)$$
That is, the pdf of the sum is the convolution of the two input pdfs.
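As a quick numerical sketch of this convolution integral (my own illustration, not from the text), take $X$ and $Y$ both uniform on $[0, 1]$; the convolution of the two rectangular pdfs is the triangular pdf on $[0, 2]$, peaking at $z = 1$. The function names `f_uniform` and `f_sum` are mine.

```python
# Riemann-sum approximation of f_Z(z) = integral f_Y(z - x) f_X(x) dx
# for X, Y ~ Uniform(0, 1); the exact result is triangular on [0, 2].

def f_uniform(u):
    # pdf of Uniform(0, 1)
    return 1.0 if 0.0 <= u <= 1.0 else 0.0

def f_sum(z, dx=0.001):
    # approximate the convolution integral by a left Riemann sum over [0, 1],
    # the support of f_X
    n = int(1.0 / dx)
    return sum(f_uniform(z - i * dx) * f_uniform(i * dx) for i in range(n)) * dx

print(f_sum(0.5))  # ~0.5 on the rising edge of the triangle
print(f_sum(1.0))  # ~1.0 at the peak
print(f_sum(2.5))  # 0.0 outside the support [0, 2]
```

The step size `dx` trades accuracy for speed; the approximation error here is on the order of `dx`.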
This result may be extended to sums of three or more random variables by repeated application of the above arguments for each new variable in turn. Since convolution is a commutative operation, for $N$ independent variables we get:
$$f_Z(z) = f_{X_1}(z) * f_{X_2}(z) * \cdots * f_{X_N}(z)$$
An example of this effect occurs when multiple dice are thrown and the scores are added together. In the 2-dice example of subfigures (a), (b) and (c) of this figure in the discussion of probability distributions, we saw how the pmf approximated a triangular shape. This is just the convolution of two uniform 6-point pmfs, one for each of the two dice.
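The dice case can be checked directly with discrete convolution (a sketch of my own; the helper `convolve_pmf` is not from the text). Convolving two uniform 6-point pmfs yields the familiar triangular pmf for the total score on two fair dice.

```python
# Convolve two uniform 6-point pmfs (fair dice) to obtain the
# triangular pmf of the total score Z = X + Y.

from fractions import Fraction

die = {k: Fraction(1, 6) for k in range(1, 7)}  # uniform 6-point pmf

def convolve_pmf(p, q):
    # discrete convolution: P(Z = a + b) accumulates P(X = a) P(Y = b)
    out = {}
    for a, pa in p.items():
        for b, qb in q.items():
            out[a + b] = out.get(a + b, 0) + pa * qb
    return out

two_dice = convolve_pmf(die, die)
print(two_dice[7])   # 1/6, the peak of the triangle
print(two_dice[2])   # 1/36, same as two_dice[12]
```

Using exact fractions makes the triangular shape easy to verify: the probabilities rise linearly from 1/36 at a total of 2 up to 6/36 at 7, then fall back to 1/36 at 12.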
Similarly, if two variables with Gaussian pdfs are added together, we shall show in the discussion of the summation of two or more Gaussian random variables that this produces another Gaussian pdf whose variance is the sum of the two input variances.
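Ahead of that derivation, the Gaussian case can be checked numerically (a sketch under my own choice of parameters: zero means, variances 1 and 4). Convolving the two Gaussian pdfs should reproduce, pointwise, a Gaussian pdf with variance $1 + 4 = 5$.

```python
# Numerical check: the convolution of two zero-mean Gaussian pdfs with
# variances 1 and 4 matches a Gaussian pdf with variance 5.

import math

def gauss(x, var):
    # zero-mean Gaussian pdf with variance var
    return math.exp(-x * x / (2 * var)) / math.sqrt(2 * math.pi * var)

def conv_at(z, var1, var2, dx=0.01, lim=20.0):
    # Riemann-sum approximation of integral gauss(z - x, var2) gauss(x, var1) dx,
    # truncated to [-lim, lim] where the integrand is non-negligible
    n = int(2 * lim / dx)
    return sum(gauss(z - (-lim + i * dx), var2) * gauss(-lim + i * dx, var1)
               for i in range(n)) * dx

for z in (0.0, 1.0, 3.0):
    # the two columns agree to several decimal places
    print(round(conv_at(z, 1.0, 4.0), 4), round(gauss(z, 5.0), 4))
```

This only spot-checks the result at a few points, of course; the general proof is the subject of the later discussion.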