Often we have a random variable X whose pdf and/or cdf is known, and we define a second random variable Y = g(X) whose pdf and/or cdf we wish to determine. To demonstrate how this is done, let us consider the case where g(X) is monotonically increasing or monotonically decreasing as X goes from −∞ to ∞. This is shown in Figure 2.3 for a monotonically increasing function.
From Figure 2.3, we see that if x − dx < X < x then y − dy < Y < y. Therefore P(x − dx < X < x) = P(y − dy < Y < y). If dx and dy are infinitesimally small, then this means that
$f_X(x)\,dx = f_Y(y)\,dy$  (2.41)

$f_Y(y) = f_X(x)\,\frac{dx}{dy}$  (2.42)
Keeping in mind that dx/dy is negative when g(X) is monotonically decreasing, f_Y(y) is given by:

$f_Y(y) = f_X(x)\left|\frac{dx}{dy}\right|_{x = g^{-1}(y)}$  (2.43)
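To make (2.43) concrete, here is a minimal Monte Carlo sketch in Python (an added illustration, not from the source text; NumPy and SciPy are assumed available). It takes X ~ N(0, 1) and the monotonically increasing g(x) = eˣ, so that x = g⁻¹(y) = ln y and |dx/dy| = 1/y, and checks a histogram of samples of Y against the transformed pdf.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)
y = np.exp(x)                                # Y = g(X) = e^X, monotone increasing

# Transformed pdf from (2.43): f_Y(y) = f_X(ln y) * (1/y)
grid = np.linspace(0.1, 5.0, 50)
f_y = stats.norm.pdf(np.log(grid)) / grid

# Empirical pdf of Y from the samples
hist, edges = np.histogram(y, bins=800, range=(0.0, 20.0), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

# Largest deviation between histogram and (2.43); should be small (~1e-2)
print(np.max(np.abs(np.interp(grid, centers, hist) - f_y)))
```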
Example 2.4
The pdf of X and the transformation Y = g(X) are given, and the pdf of Y is found by applying (2.43): substitute x = g⁻¹(y) into f_X and multiply by |dx/dy|. [The specific piecewise pdf, the transformation, and the resulting f_Y(y) are not legible in the source.]
When there are multiple solutions to x = g⁻¹(y), as seen in Figure 2.4, the contributions of all roots are summed:

$f_Y(y) = \sum_k f_X(x_k)\left|\frac{dx_k}{dy}\right|_{x_k = g_k^{-1}(y)}$  (2.44)
Example 2.5
Let Y = X², where X has pdf f_X(x) for −∞ < x < ∞. Find the pdf of Y.
Solution
Here we see that although −∞ < X < ∞, the random variable Y takes only nonnegative values. Also, there are two solutions to x = g⁻¹(y): g₁⁻¹(y) = −√y and g₂⁻¹(y) = √y, each with |dx/dy| = 1/(2√y). Therefore

$f_Y(y) = f_X(-\sqrt{y})\,\frac{1}{2\sqrt{y}} + f_X(\sqrt{y})\,\frac{1}{2\sqrt{y}}, \qquad y \ge 0$  (2.45)
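The two-root case of Example 2.5 can be checked numerically. In the sketch below (an added illustration; the choice X ~ N(0, 1) is an assumption, and any pdf supported on the whole real line would do), Y = X² and a histogram of Y is compared against f_X(−√y)/(2√y) + f_X(√y)/(2√y).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
y_samples = rng.standard_normal(1_000_000) ** 2   # Y = X^2, X ~ N(0,1)

# Sum over both roots ±sqrt(y), each weighted by |dx/dy| = 1/(2 sqrt(y))
y = np.linspace(0.05, 4.0, 40)
root = np.sqrt(y)
f_y = (stats.norm.pdf(-root) + stats.norm.pdf(root)) / (2.0 * root)

hist, edges = np.histogram(y_samples, bins=1600, range=(0.0, 16.0), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
print(np.max(np.abs(np.interp(y, centers, hist) - f_y)))   # small deviation
```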
We can extend this to multiple random variables that are transformations of multiple random variables. For example, if we know f_XY(x,y) and we are given U = g₁(X,Y) and V = g₂(X,Y), then

$f_{UV}(u,v) = f_{XY}(x,y)\,|J|\Big|_{x = g_1^{-1}(u,v),\; y = g_2^{-1}(u,v)}$  (2.46)

where g₁⁻¹(u,v) is the function that returns x from u and v, g₂⁻¹(u,v) is the function that returns y from u and v, and |J| is the Jacobian of the transformation:

$|J| = \left|\det\begin{bmatrix}\dfrac{\partial x}{\partial u} & \dfrac{\partial x}{\partial v}\\[4pt] \dfrac{\partial y}{\partial u} & \dfrac{\partial y}{\partial v}\end{bmatrix}\right|$  (2.47)
Example 2.6
Let X and Y be independent, zero-mean Gaussian random variables, each with variance σ². Define

$R = \sqrt{X^2 + Y^2}$  (2.48)

and

$\Theta = \tan^{-1}(Y/X)$  (2.49)

What is the joint pdf of X and Y, and what is the joint pdf of R and Θ? Find the marginal pdfs of R and Θ as well.
Solution

$f_{XY}(x,y) = \frac{1}{2\pi\sigma^2}\, e^{-(x^2+y^2)/2\sigma^2}$  (2.50)

With x = r cos θ and y = r sin θ, the Jacobian of the transformation is

$|J| = \left|\det\begin{bmatrix}\dfrac{\partial x}{\partial r} & \dfrac{\partial x}{\partial \theta}\\[4pt] \dfrac{\partial y}{\partial r} & \dfrac{\partial y}{\partial \theta}\end{bmatrix}\right| = \left|\det\begin{bmatrix}\cos\theta & -r\sin\theta\\ \sin\theta & r\cos\theta\end{bmatrix}\right| = r$

so that

$f_{R\Theta}(r,\theta) = \frac{r}{2\pi\sigma^2}\, e^{-r^2/2\sigma^2}, \qquad r \ge 0,\; 0 \le \theta < 2\pi$  (2.51)
Then

$f_R(r) = \int_0^{2\pi} f_{R\Theta}(r,\theta)\,d\theta = \frac{r}{\sigma^2}\, e^{-r^2/2\sigma^2}, \qquad r \ge 0$  (2.52)

and

$f_\Theta(\theta) = \int_0^{\infty} f_{R\Theta}(r,\theta)\,dr = \frac{1}{2\pi}, \qquad 0 \le \theta < 2\pi$  (2.53)
We can see that R and Θ are independent, and that R has a Rayleigh distribution while Θ is uniformly distributed on the interval 0 to 2π. This is the model used to describe the amplitude gain and phase response of a signal that encounters frequency-nonselective Rayleigh fading in a wireless channel.
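A short simulation sketch (an added illustration, assuming NumPy and SciPy) confirms the result of Example 2.6: the magnitude of a pair of i.i.d. zero-mean Gaussians follows a Rayleigh distribution, and the phase is uniform on [0, 2π).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
sigma = 1.5
x = sigma * rng.standard_normal(1_000_000)
y = sigma * rng.standard_normal(1_000_000)

r = np.hypot(x, y)                             # R = sqrt(X^2 + Y^2)
theta = np.mod(np.arctan2(y, x), 2 * np.pi)    # Theta folded into [0, 2*pi)

# Kolmogorov-Smirnov statistics against the predicted marginals (2.52), (2.53);
# both should be near zero (~1e-3 for a million samples)
print(stats.kstest(r, 'rayleigh', args=(0, sigma)).statistic)
print(stats.kstest(theta, 'uniform', args=(0, 2 * np.pi)).statistic)
```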
The pdf and the cdf provide us with the information needed to statistically describe a random variable or a set of random variables. Often, however, we need to describe a random variable by a set of statistical averages, or mean values. Averages of the random variable or of its square, for example, can be estimated by observing a large number of outcomes of the random experiment and averaging over the series of outcomes. Alternatively, we can determine the expected values of these averages using the random variable's pdf (or the joint pdf in the case of a set of random variables).
The statistical average, or expected value, of a discrete random variable X is given by:

$E[X] = \sum_i x_i P(x_i)$  (2.54)

while for a continuous random variable it is given by:

$E[X] = \int_{-\infty}^{\infty} x f_X(x)\,dx$  (2.55)
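As a small numeric illustration of (2.54) and (2.55) (an added sketch, assuming NumPy and SciPy; the die and uniform examples are not from the source):

```python
import numpy as np
from scipy import integrate

# Discrete case (2.54): a fair six-sided die, E[X] = sum of x_i * P(x_i)
values = np.arange(1, 7)
probs = np.full(6, 1 / 6)
print(np.sum(values * probs))                    # 3.5

# Continuous case (2.55): X ~ Uniform(0,1), so f_X(x) = 1 on (0,1)
mean, _ = integrate.quad(lambda x: x * 1.0, 0.0, 1.0)
print(mean)                                      # 0.5
```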
2.9.2 The Expected Value of a Function of a Random Variable
If we wish to find the mean of Y = g(X), we could find the pdf of Y using (2.44) and then find the average of Y by:

$E[Y] = \int_{-\infty}^{\infty} y f_Y(y)\,dy$  (2.56)

However, substituting (2.43) into (2.56) and changing the variable of integration shows that the pdf of Y is not needed:

$E[Y] = E[g(X)] = \int_{-\infty}^{\infty} y\, f_X\!\left(g^{-1}(y)\right)\left|\frac{dx}{dy}\right| dy = \int_{-\infty}^{\infty} g(x) f_X(x)\,dx$  (2.57)
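The practical content of (2.57) is that the pdf of Y never has to be derived. A minimal sketch (an added illustration, assuming NumPy and SciPy; g(x) = x² and X ~ N(0, 1) are illustrative choices) computes E[g(X)] both by integrating against f_X and by direct sample averaging:

```python
import numpy as np
from scipy import integrate, stats

g = lambda x: x ** 2                       # illustrative choice of g

# E[g(X)] via (2.57): integrate g(x) f_X(x) over the real line
val, _ = integrate.quad(lambda x: g(x) * stats.norm.pdf(x), -np.inf, np.inf)
print(val)                                 # 1.0, the second moment of N(0,1)

# Monte Carlo cross-check: average g over raw samples of X
rng = np.random.default_rng(3)
print(np.mean(g(rng.standard_normal(1_000_000))))   # ~1.0
```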
Example 2.7
For a random variable X with a given pdf f_X(x), find the mean E[X] and the mean square value E[X²]. [The specific piecewise pdf and the resulting numerical values are not legible in the source.]
Solution
Applying (2.55) and (2.57),

$E[X] = \int_{-\infty}^{\infty} x f_X(x)\,dx$  (2.58)

and

$E[X^2] = \int_{-\infty}^{\infty} x^2 f_X(x)\,dx$  (2.59)
The expectation of a function of two random variables, g(X,Y), is found using the joint pdf of the two variables:

$E[g(X,Y)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(x,y)\, f_{XY}(x,y)\,dx\,dy$  (2.60)

The generalization to the case when there are more than two variables should be obvious.
Example 2.8
The joint pdf f_XY(x,y) of X and Y is given as a piecewise function. Find E[XY]. [The specific joint pdf and the resulting value are not legible in the source.]
Solution
Applying (2.60),

$E[XY] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} xy\, f_{XY}(x,y)\,dx\,dy$
The variance of a random variable is a measure of how much the random variable varies around its mean. It is given by:

$\sigma_X^2 = E\left[(X - E[X])^2\right]$  (2.61)

$= E\left[X^2 - 2X E[X] + (E[X])^2\right]$  (2.62)

$= E[X^2] - 2E[X]E[X] + (E[X])^2 = E[X^2] - (E[X])^2$  (2.63)
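The identity (2.63) is easy to verify on data. A minimal sketch (an added illustration, assuming NumPy; the exponential distribution is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.exponential(scale=2.0, size=1_000_000)   # true variance is 4.0

lhs = np.mean((x - np.mean(x)) ** 2)             # E[(X - E[X])^2], as in (2.61)
rhs = np.mean(x ** 2) - np.mean(x) ** 2          # E[X^2] - (E[X])^2, as in (2.63)
print(lhs, rhs)                                  # both ~4.0
```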
Example 2.9
Find the variance of the random variable X of Example 2.7.
Solution
Applying (2.63), with E[X] from (2.58) and E[X²] from (2.59),

$\sigma_X^2 = E[X^2] - (E[X])^2$  (2.65)
Let X and Y be two random variables with joint pdf f_XY(x,y). Let Z = aX + bY, where a and b are constants. Then E[Z] is given by:

$E[Z] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} (ax + by)\, f_{XY}(x,y)\,dx\,dy = a\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x f_{XY}(x,y)\,dx\,dy + b\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} y f_{XY}(x,y)\,dx\,dy$  (2.66)

$= a\int_{-\infty}^{\infty} x\left(\int_{-\infty}^{\infty} f_{XY}(x,y)\,dy\right)dx + b\int_{-\infty}^{\infty} y\left(\int_{-\infty}^{\infty} f_{XY}(x,y)\,dx\right)dy$

The inner integrals are the marginal pdfs:

$\int_{-\infty}^{\infty} f_{XY}(x,y)\,dy = f_X(x)$  (2.67)

and

$\int_{-\infty}^{\infty} f_{XY}(x,y)\,dx = f_Y(y)$  (2.68)

so that

$E[Z] = a\int_{-\infty}^{\infty} x f_X(x)\,dx + b\int_{-\infty}^{\infty} y f_Y(y)\,dy = aE[X] + bE[Y]$  (2.69)
This result can be extended to a linear combination of more than two random variables as well. Therefore the mean of a linear combination of random variables, whether they are independent or not, is a linear combination of the means of the individual random variables.
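The point that independence is not required can be seen in a short sketch (an added illustration, assuming NumPy), where Y is a deterministic function of X and the linearity of the mean still holds:

```python
import numpy as np

rng = np.random.default_rng(5)
a, b = 2.0, -3.0
x = rng.standard_normal(1_000_000) + 1.0   # E[X] = 1
y = x ** 2                                 # fully dependent on X; E[Y] = 2

z = a * x + b * y
print(np.mean(z))                          # ~ -4.0
print(a * np.mean(x) + b * np.mean(y))     # ~ -4.0, matching (2.69)
```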
2.9.6 The Variance of a Linear Combination of Independent Random Variables
Let Z = aX + bY, where X and Y are independent random variables and a and b are constants. The mean and variance of X are μ_X and σ_X², while the mean and variance of Y are μ_Y and σ_Y². The variance of Z, σ_Z², is given by:

$\sigma_Z^2 = E\left[(Z - E[Z])^2\right] = E\left[(aX + bY - a\mu_X - b\mu_Y)^2\right]$  (2.70)

$= a^2 E\left[(X - \mu_X)^2\right] + b^2 E\left[(Y - \mu_Y)^2\right] + 2ab\, E\left[(X - \mu_X)(Y - \mu_Y)\right]$  (2.71)

Expanding the cross term,

$E\left[(X - \mu_X)(Y - \mu_Y)\right] = E[XY] - \mu_X\mu_Y$  (2.72)

and since X and Y are independent, E[XY] = E[X]E[Y] = μ_X μ_Y, so the cross term is zero. Therefore (2.71) becomes

$\sigma_Z^2 = a^2\sigma_X^2 + b^2\sigma_Y^2$  (2.73)
Again, we can demonstrate that this result can be extended to more than two random variables.
However, this result is not true if X and Y are dependent.
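A sketch of both claims (an added illustration, assuming NumPy): for independent X and Y the variances add as in (2.73), while for dependent variables the formula fails.

```python
import numpy as np

rng = np.random.default_rng(6)
a, b = 2.0, 3.0
x = rng.standard_normal(1_000_000)          # variance 1
y = rng.standard_normal(1_000_000)          # variance 1, independent of x

print(np.var(a * x + b * y), a**2 + b**2)   # both ~13.0, as (2.73) predicts

y_dep = -x                                  # a fully dependent choice of Y
print(np.var(a * x + b * y_dep))            # ~1.0, not a^2 + b^2 = 13.0
```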
Exercise 2.6
[The exercise statement, including its pdf, is not legible in the source.]
2.10 The Characteristic Function
The characteristic function of a random variable X is the mean of g(X) = e^{jvX}. It is denoted M_X(jv) and is given by:

$M_X(jv) = E\left[e^{jvX}\right] = \int_{-\infty}^{\infty} f_X(x)\, e^{jvx}\,dx$  (2.74)

Comparing (2.74) to the Fourier transform, we see that M_X(jv) is essentially the Fourier transform of f_X(x) with 2πf replaced by −v.
Conversely, the pdf of a random variable can be found from its characteristic function by taking the appropriate inverse transform:

$f_X(x) = \frac{1}{2\pi}\int_{-\infty}^{\infty} M_X(jv)\, e^{-jvx}\,dv$  (2.75)
Equation (2.75) is useful in cases where it is simpler to obtain the characteristic function of a random variable first and then take the inverse Fourier transform to find the pdf of that random variable.
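As a numerical sketch of (2.75) (an added illustration, assuming NumPy; the standard Gaussian, whose characteristic function is e^{−v²/2}, is an illustrative choice), the pdf can be recovered by a direct Riemann sum over v:

```python
import numpy as np

v = np.linspace(-40.0, 40.0, 20001)
dv = v[1] - v[0]
M = np.exp(-v**2 / 2)                       # characteristic function of N(0,1)

for x0 in (0.0, 1.0, 2.0):
    # (2.75): f_X(x0) = (1/2pi) * integral of M_X(jv) e^{-jvx0} dv
    f = (np.sum(M * np.exp(-1j * v * x0)) * dv).real / (2 * np.pi)
    exact = np.exp(-x0**2 / 2) / np.sqrt(2 * np.pi)
    print(x0, f, exact)                     # recovered and exact pdfs agree
```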
Another use of the characteristic function is to generate the nth moment of the random variable X, defined as E[Xⁿ]. This is achieved by differentiating the characteristic function with respect to v:

$\frac{dM_X(jv)}{dv} = \int_{-\infty}^{\infty} jx\, f_X(x)\, e^{jvx}\,dx$  (2.76)

Evaluating the derivative at v = 0,

$\left.\frac{dM_X(jv)}{dv}\right|_{v=0} = j\int_{-\infty}^{\infty} x f_X(x)\,dx = jE[X]$  (2.77)

Repeated differentiation gives the general result:

$E[X^n] = (-j)^n \left.\frac{d^n M_X(jv)}{dv^n}\right|_{v=0}$  (2.78)
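A symbolic sketch of (2.78) (an added illustration, assuming SymPy; the standard Gaussian is an illustrative choice) recovers the first four moments of N(0, 1) by differentiating its characteristic function at v = 0:

```python
import sympy as sp

v = sp.symbols('v', real=True)
M = sp.exp(-v**2 / 2)              # characteristic function of N(0,1)

for n in range(1, 5):
    # (2.78): E[X^n] = (-j)^n * d^n M / dv^n evaluated at v = 0
    moment = sp.simplify((-sp.I) ** n * sp.diff(M, v, n).subs(v, 0))
    print(n, moment)                # 0, 1, 0, 3: the moments of N(0,1)
```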
Exercise 2.7
The random variable X has pdf f_X(x) = e^{−x}u(x). Find its characteristic function and use it to find E[X], E[X²], and E[X³].
2.11 The PDF of the Sum of Two Independent Random Variables
Let X and Y be two independent random variables, and let Z = X + Y. The characteristic function of Z is E[e^{jv(X+Y)}], which is given by:

$M_Z(jv) = E\left[e^{jv(X+Y)}\right] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} e^{jv(x+y)}\, f_{XY}(x,y)\,dx\,dy$  (2.79)
In (2.79), e^{jv(x+y)} = e^{jvx}e^{jvy}, and f_XY(x,y) = f_X(x)f_Y(y) since X and Y are independent. Therefore (2.79) becomes:

$M_Z(jv) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} e^{jvx} e^{jvy}\, f_X(x) f_Y(y)\,dx\,dy = \int_{-\infty}^{\infty} e^{jvx} f_X(x)\,dx \int_{-\infty}^{\infty} e^{jvy} f_Y(y)\,dy = M_X(jv)\,M_Y(jv)$  (2.80)
Since the characteristic function of the sum of independent random variables is the product of the characteristic functions of the individual random variables, and the characteristic function is the Fourier transform of the pdf, it follows from (2.80) that the pdf of Z is the convolution of the individual pdfs:

$f_Z(z) = \int_{-\infty}^{\infty} f_X(\tau)\, f_Y(z - \tau)\,d\tau = f_X(z) * f_Y(z)$  (2.81)
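A closing sketch of (2.81) (an added illustration, assuming NumPy): the sum of two independent Uniform(0, 1) variables has the triangular pdf on (0, 2), obtained here by numerically convolving the uniform pdf with itself and comparing against a histogram of sampled sums.

```python
import numpy as np

dx = 0.001
x = np.arange(0.0, 1.0, dx)
f_x = np.ones_like(x)                       # Uniform(0,1) pdf on its support
f_z = np.convolve(f_x, f_x) * dx            # numerical convolution, as in (2.81)
z = np.arange(len(f_z)) * dx                # support of Z = X + Y is (0, 2)

rng = np.random.default_rng(7)
samples = rng.random(1_000_000) + rng.random(1_000_000)
hist, edges = np.histogram(samples, bins=100, range=(0.0, 2.0), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
print(np.max(np.abs(np.interp(centers, z, f_z) - hist)))   # small deviation
```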
Exercise 2.8