

Lecture 5: Mathematical Expectation

Assist. Prof. Dr. Emel YAVUZ DUMAN

MCB1007 Introduction to Probability and Statistics


İstanbul Kültür University

Outline

1 Introduction

2 The Expected Value of a Random Variable

3 Moments

4 Chebyshev’s Theorem

5 Moment Generating Functions

6 Product Moments

7 Conditional Expectation

Introduction

A very important concept in probability and statistics is that of the
mathematical expectation, expected value, or briefly the
expectation, of a random variable. Originally, the concept of a
mathematical expectation arose in connection with games of
chance, and in its simplest form it is the product of the amount a
player stands to win and the probability that he or she will win. For
instance, if we hold one of 10,000 tickets in a raffle for which the
grand prize is a trip worth $4,800, our mathematical expectation is

4{,}800 \cdot \frac{1}{10{,}000} = \$0.48.

This amount has to be interpreted in the sense of an average:
altogether the 10,000 tickets pay $4,800, or on the average
$4,800/10,000 = $0.48 per ticket.

The Expected Value of a Random Variable

Definition 1
If X is a discrete random variable and f(x) is the value of its
probability distribution at x, the expected value of X is

E(X) = \sum_x x f(x).

Correspondingly, if X is a continuous random variable and f(x) is
the value of its probability density at x, the expected value of X is

E(X) = \int_{-\infty}^{\infty} x f(x)\,dx.

In this definition it is assumed, of course, that the sum or the
integral exists; otherwise, the mathematical expectation is
undefined.
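For the discrete case, Definition 1 is just a weighted sum, and it is
easy to check small examples in code. The Python sketch below is
illustrative only (the function name is our own choice); it reproduces
the raffle expectation from the introduction.

def expected_value_discrete(values, probs):
    # E(X) = sum over x of x * f(x)
    return sum(x * p for x, p in zip(values, probs))

# The raffle: one ticket in 10,000 wins a prize worth $4,800.
print(expected_value_discrete([4800, 0], [1/10000, 9999/10000]))  # 0.48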

Example 2
Imagine a game in which, on any play, a player has a 20% chance
of winning $3 and an 80% chance of losing $1. The probability
distribution function of the random variable X, the amount won or
lost on a single play, is

x      $3    −$1
f(x)   0.2   0.8

and so the average amount won (actually lost, since it is negative)
in the long run is

E(X) = ($3)(0.2) + (−$1)(0.8) = −$0.20.

What does “in the long run” mean? If you play, are you
guaranteed to lose no more than 20 cents?

Solution. If you play and lose, you are guaranteed to lose $1. An
expected loss of 20 cents means that if you played the game over
and over again, the average of your $3 winnings and your $1 losses
would be a 20 cent loss. "In the long run" means that you can't
draw conclusions about one or two plays, but rather thousands and
thousands of plays.
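A quick simulation makes this concrete. This is a sketch, not part
of the original example; the fixed seed is only for reproducibility.

import random

random.seed(1)

def play():
    # win $3 with probability 0.2, otherwise lose $1
    return 3 if random.random() < 0.2 else -1

n = 100_000
print(sum(play() for _ in range(n)) / n)  # close to -0.20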

Example 3
A lot of 12 television sets includes 2 with white cords. If three of
the sets are chosen at random for shipment to a hotel, how many
sets with white cords can the shipper expect to send to the hotel?

Solution. Since x of the two sets with white cords and 3 − x of
the 10 other sets can be chosen in \binom{2}{x}\binom{10}{3-x} ways,
three of the 12 sets can be chosen in \binom{12}{3} ways, and these
\binom{12}{3} possibilities are presumably equiprobable, we find that
the probability distribution of X, the number of sets with white
cords shipped to the hotel, is given by

f(x) = \frac{\binom{2}{x}\binom{10}{3-x}}{\binom{12}{3}} \quad \text{for } x = 0, 1, 2

or, in tabular form,

x      0      1      2
f(x)   6/11   9/22   1/22

Now,

E(X) = \sum_{x=0}^{2} x f(x) = 0 \cdot \frac{6}{11} + 1 \cdot \frac{9}{22} + 2 \cdot \frac{1}{22} = \frac{1}{2}

and since half a set cannot possibly be shipped, it should be clear
that the term "expect" is not used in its colloquial sense. Indeed,
it should be interpreted as an average pertaining to repeated
shipments made under the given conditions.
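The distribution and its mean can be verified with exact
combinatorics; math.comb is in the Python standard library (3.8+),
and this sketch uses only quantities from the example.

from math import comb

def f(x):
    # choose x of the 2 white-cord sets and 3 - x of the other 10
    return comb(2, x) * comb(10, 3 - x) / comb(12, 3)

print([f(x) for x in range(3)])         # 6/11, 9/22, 1/22 as decimals
print(sum(x * f(x) for x in range(3)))  # 0.5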

Example 4
A saleswoman has been offered a new job with a fixed salary of
$290. Her records from her present job show that her weekly
commissions have the following probabilities:

commission 0 $100 $200 $300 $400


probability 0.05 0.15 0.25 0.45 0.1

Should she change jobs?

Solution. The expected value (expected income) from her present
job is

($0 × 0.05) + ($100 × 0.15) + ($200 × 0.25) + ($300 × 0.45) + ($400 × 0.1) = $240.

If she is making a decision only on the amount of money earned,
then clearly she ought to change jobs.

Example 5
Dan, Eric, and Nick decide to play a fun game. Each player puts
$2 on the table and secretly writes down either heads or tails.
Then one of them tosses a fair coin. The $6 on the table is divided
evenly among the players who correctly predicted the outcome of
the coin toss. If everyone guessed incorrectly, then everyone takes
their money back. After many repetitions of this game, Dan has
lost a lot of money - more than can be explained by bad luck.
What’s going on?

Solution. The eight possible outcomes, their probabilities, and
Dan's payoffs are worked out in the table below, under the
assumptions that everyone guesses correctly with probability 1/2
and everyone is correct independently.

Dan right?   Eric right?   Nick right?   Dan's payoff   Probability
Y            Y             Y             $0             1/8
Y            Y             N             $1             1/8
Y            N             Y             $1             1/8
Y            N             N             $4             1/8
N            Y             Y             −$2            1/8
N            Y             N             −$2            1/8
N            N             Y             −$2            1/8
N            N             N             $0             1/8

In the payoff column, we are accounting for the fact that Dan has
to put in $2 just to play. So, for example, if he guesses correctly
and Eric and Nick are wrong, then he takes all $6 on the table, but
his net profit is only $4. Working from the table, Dan's expected
payoff is:

E(\text{payoff}) = 0 \cdot \frac{1}{8} + 1 \cdot \frac{1}{8} + 1 \cdot \frac{1}{8} + 4 \cdot \frac{1}{8} + (−2) \cdot \frac{1}{8} + (−2) \cdot \frac{1}{8} + (−2) \cdot \frac{1}{8} + 0 \cdot \frac{1}{8} = 0.

So the game is perfectly fair! Over time, he should neither win nor
lose money.

A game is called fair if the expected value of the game is zero.

Let us say that Nick and Eric are collaborating; in particular, they
always make opposite guesses. So our assumption that everyone is
correct independently is wrong; actually, the events that Nick is
correct and that Eric is correct are mutually exclusive! As a result,
Dan can never win all the money on the table. When he guesses
correctly, he always has to split his winnings with someone else.
This lowers his overall expectation, as the corrected table below
shows:

Dan right?   Eric right?   Nick right?   Dan's payoff   Probability
Y            Y             Y             $0             0
Y            Y             N             $1             1/4
Y            N             Y             $1             1/4
Y            N             N             $4             0
N            Y             Y             −$2            0
N            Y             N             −$2            1/4
N            N             Y             −$2            1/4
N            N             N             $0             0

From this revised table, we can work out Dan's actual expected
payoff:

E(\text{payoff}) = 0 \cdot 0 + 1 \cdot \frac{1}{4} + 1 \cdot \frac{1}{4} + 4 \cdot 0 + (−2) \cdot 0 + (−2) \cdot \frac{1}{4} + (−2) \cdot \frac{1}{4} + 0 \cdot 0 = −\frac{1}{2}.

So he loses an average of a half-dollar per game!
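A simulation sketch confirms the half-dollar loss; the only
assumptions (taken from the text) are that Dan and Eric guess
uniformly at random and that Nick always takes the side opposite
to Eric.

import random

random.seed(1)

def one_game():
    coin = random.choice("HT")
    dan  = random.choice("HT")
    eric = random.choice("HT")
    nick = "T" if eric == "H" else "H"    # Nick always opposes Eric
    winners = [g == coin for g in (dan, eric, nick)]
    if not any(winners):
        return 0                          # everyone takes their $2 back
    share = 6 / sum(winners)              # pot split among correct guessers
    return (share if dan == coin else 0) - 2

n = 200_000
print(sum(one_game() for _ in range(n)) / n)  # close to -0.5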

Example 6
Consider the following game in which two players each spin a fair
wheel in turn. Player A's wheel has three equal sectors, labeled 2,
4, and 12; Player B's wheel has two equal sectors, labeled 5 and 6.

The player that spins the higher number wins, and the loser has to
pay the winner the difference of the two numbers (in dollars). Who
would win more often? What is that player's expected winnings?

Solution. Assuming 3 equally likely outcomes for spinning A's
wheel, and 2 equally likely outcomes for B's wheel, there are then
3 × 2 = 6 possible equally likely outcomes for this game:

Sample Space
B \ A    2         4         12
5        (2, 5)    (4, 5)    (12, 5)
6        (2, 6)    (4, 6)    (12, 6)

We see here that player A wins in 2/6 games, and player B wins in
4/6 games, so player B wins the most often. But let's compute
the expected winnings for player B:

Winnings, X, for Player B
B \ A    2     4     12
5        $3    $1    −$7
6        $4    $2    −$6

Each of these possible amounts won has the same probability
(1/6). The expected winnings for player B is then simply

E(X) = \$3 \cdot \frac{1}{6} + \$4 \cdot \frac{1}{6} + \$1 \cdot \frac{1}{6} + \$2 \cdot \frac{1}{6} + (−\$7) \cdot \frac{1}{6} + (−\$6) \cdot \frac{1}{6} = −\$0.50.

Hence, we see that even though player B is more likely to win on a
single game, his expected winnings are actually negative. If he
keeps on playing the game, in the long run he will lose money!

Example 7
Find the expected value of the random variable X whose
probability density is given by

f(x) = \begin{cases} x & \text{for } 0 < x < 1, \\ 2 − x & \text{for } 1 \le x < 2, \\ 0 & \text{elsewhere.} \end{cases}

Solution.

E(X) = \int_{-\infty}^{\infty} x f(x)\,dx = \int_{-\infty}^{0} x \cdot 0\,dx + \int_{0}^{1} x \cdot x\,dx + \int_{1}^{2} x(2 − x)\,dx + \int_{2}^{\infty} x \cdot 0\,dx

= \left[\frac{x^3}{3}\right]_0^1 + \left[x^2 − \frac{x^3}{3}\right]_1^2 = \frac{1}{3} + \left(4 − \frac{8}{3}\right) − \left(1 − \frac{1}{3}\right) = 1.
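As a numerical cross-check of this integral, here is a small sketch
assuming scipy is available; quad integrates x·f(x) over each piece
of the density separately.

from scipy.integrate import quad

left,  _ = quad(lambda x: x * x,       0, 1)  # integrand x * f(x) on (0, 1)
right, _ = quad(lambda x: x * (2 - x), 1, 2)  # integrand x * f(x) on (1, 2)
print(left + right)  # 1.0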

Example 8

Certain coded measurements of the pitch diameter of threads of a
fitting have the probability density

f(x) = \begin{cases} \dfrac{4}{\pi(1 + x^2)} & \text{for } 0 < x < 1, \\ 0 & \text{elsewhere.} \end{cases}

Find the expected value of this random variable.

Solution.

E(X) = \int_{-\infty}^{\infty} x f(x)\,dx = \int_{0}^{1} \frac{4x}{\pi(1 + x^2)}\,dx = \frac{2}{\pi} \int_{0}^{1} \frac{2x}{1 + x^2}\,dx

= \frac{2}{\pi} \left[\ln(x^2 + 1)\right]_0^1 = \frac{2}{\pi}(\ln 2 − \ln 1) = \frac{\ln 4}{\pi} \approx 0.4413.

Theorem 9
If X is a discrete random variable and f(x) is the value of its
probability distribution at x, the expected value of g(X) is given by

E[g(X)] = \sum_x g(x) f(x).

Correspondingly, if X is a continuous random variable and f(x) is
the value of its probability density at x, the expected value of
g(X) is given by

E[g(X)] = \int_{-\infty}^{\infty} g(x) f(x)\,dx.

Example 10
Roulette, introduced in Paris in 1765, is a very popular casino
game. Suppose that Hannah’s House of Gambling has a roulette
wheel containing 38 numbers: zero (0), double zero (00), and the
numbers 1, 2, 3, · · · , 36. Let X denote the number on which the
ball lands and u(x) denote the amount of money paid to the
gambler, such that:

u(x) = \begin{cases} \$5 & \text{if } x = 0, \\ \$10 & \text{if } x = 00, \\ \$1 & \text{if } x \text{ is odd}, \\ \$2 & \text{if } x \text{ is even}. \end{cases}

How much would Hannah's House of Gambling have to charge each
gambler to play in order to ensure that the House made some
money?

Solution. Assuming that the ball has an equally likely chance of
landing on each number, the probability distribution of X is

f(x) = 1/38

for each x = 0, 00, 1, 2, 3, ···, 36. Therefore, the expected value
of u(X) is

E[u(X)] = \sum_x u(x) f(x) = \$5 \cdot \frac{1}{38} + \$10 \cdot \frac{1}{38} + (\$1 \times 18) \cdot \frac{1}{38} + (\$2 \times 18) \cdot \frac{1}{38} \approx \$1.82.

Note that the 18 multiplied by the $1 and the $2 appears because
there are 18 odd and 18 even numbers on the wheel. Our
calculation tells us that, in the long run, Hannah's House of
Gambling would expect to pay out $1.82 for each spin of the
roulette wheel. Therefore, in order to ensure that the House made
money, the House would have to charge at least $1.82 per play.
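A short sketch of the same computation; the wheel labels and
payouts are exactly those given in the example.

def u(x):
    if x == "0":  return 5
    if x == "00": return 10
    return 1 if int(x) % 2 == 1 else 2

wheel = ["0", "00"] + [str(n) for n in range(1, 37)]
print(sum(u(x) / 38 for x in wheel))  # 69/38, about 1.82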

Example 11
If X is the number of points rolled with a balanced die, find the
expected value of g(X) = 2X^2 + 1.

Solution. Since each possible outcome has the probability 1/6, we
get

E[g(X)] = E[2X^2 + 1] = \sum_{x=1}^{6} (2x^2 + 1)\frac{1}{6} = (2 \cdot 1^2 + 1)\frac{1}{6} + (2 \cdot 2^2 + 1)\frac{1}{6} + \cdots + (2 \cdot 6^2 + 1)\frac{1}{6} = \frac{94}{3}.

Example 12
If X has the probability density

f(x) = \begin{cases} e^{−x} & \text{for } x > 0, \\ 0 & \text{elsewhere,} \end{cases}

find the expected value of g(X) = e^{3X/4}.

Solution.

E[g(X)] = E[e^{3X/4}] = \int_{-\infty}^{\infty} g(x) f(x)\,dx = \lim_{c \to \infty} \int_{0}^{c} e^{3x/4} e^{−x}\,dx = \lim_{c \to \infty} \int_{0}^{c} e^{−x/4}\,dx

= −4 \lim_{c \to \infty} \left[e^{−x/4}\right]_0^c = −4 \lim_{c \to \infty} (e^{−c/4} − e^0) = 4.

Some Theorems on Expectation

The determination of mathematical expectations can often be
simplified by using the following theorems. Since the steps are
essentially the same, some proofs will be given either for the
discrete case or the continuous case; others are left for the reader
as exercises.

Theorem 13
If a and b are any constants, then

E (aX + b) = aE (X ) + b.

Proof.
Using Theorem 9 with g(X) = aX + b, we get

E(aX + b) = \int_{-\infty}^{\infty} (ax + b) f(x)\,dx = a \underbrace{\int_{-\infty}^{\infty} x f(x)\,dx}_{E(X)} + b \underbrace{\int_{-\infty}^{\infty} f(x)\,dx}_{1} = aE(X) + b.

If we set b = 0 or a = 0, it follows from Theorem 13 that:

Corollary 14
If a is a constant, then

E(aX) = aE(X).

Corollary 15
If b is a constant, then

E(b) = b.

Theorem 16
If c_1, c_2, \ldots, and c_n are constants, then

E[c_1 g_1(X) + c_2 g_2(X) + \cdots + c_n g_n(X)] = c_1 E[g_1(X)] + c_2 E[g_2(X)] + \cdots + c_n E[g_n(X)].

The concept of a mathematical expectation can easily be extended
to situations involving more than one random variable. For
instance, if Z is the random variable whose values are related to
those of the two random variables X and Y by means of the
equation z = g(x, y), it can be shown that

Theorem 17
If X and Y are discrete random variables and f(x, y) is the value
of their joint probability distribution at (x, y), the expected value
of g(X, Y) is

E[g(X, Y)] = \sum_x \sum_y g(x, y) \cdot f(x, y).

Theorem 18
If X and Y are any random variables, then

E (X + Y ) = E (X ) + E (Y ).

Proof. Let f(x, y) be the joint probability distribution of X and Y,
assumed discrete. Then

E(X + Y) = \sum_x \sum_y (x + y) f(x, y) = \sum_x \sum_y x f(x, y) + \sum_x \sum_y y f(x, y) = E(X) + E(Y).

Note that the theorem is true whether or not X and Y are
independent.

Theorem 19
If X and Y are independent random variables, then

E (X · Y ) = E (X ) · E (Y ).

Proof. Let f(x, y) be the joint probability distribution of X and Y,
assumed discrete. If the variables X and Y are independent, we
have f(x, y) = g(x) · h(y). Therefore,

E(X \cdot Y) = \sum_x \sum_y (x \cdot y) f(x, y) = \sum_x \sum_y (x \cdot y) g(x) \cdot h(y)
= \sum_x \left( x \cdot g(x) \sum_y y \cdot h(y) \right)
= \sum_x (x \cdot g(x) E(Y)) = E(X) \cdot E(Y).

Note that the validity of this theorem hinges on whether f(x, y)
can be expressed as a function of x multiplied by a function of y,
for all x and y, i.e., on whether X and Y are independent. For
dependent variables it is not true in general.

Example 20
Suppose the probability distribution of the discrete random variable
X is
x 0 1 2 3
f (x) 0.2 0.1 0.4 0.3

Find E(X) and E(X^2). Knowing these, what are E(4X^2) and
E(3X + 2X^2)?

Solution.

E(X) = \sum_{x=0}^{3} x f(x) = 0 \cdot 0.2 + 1 \cdot 0.1 + 2 \cdot 0.4 + 3 \cdot 0.3 = 1.8,
E(X^2) = \sum_{x=0}^{3} x^2 f(x) = 0^2 \cdot 0.2 + 1^2 \cdot 0.1 + 2^2 \cdot 0.4 + 3^2 \cdot 0.3 = 4.4,
E(4X^2) = 4E(X^2) = 4 \cdot 4.4 = 17.6,
E(3X + 2X^2) = 3E(X) + 2E(X^2) = 3 \cdot 1.8 + 2 \cdot 4.4 = 14.2.
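These numbers, and the linearity rules they rely on, are easy to
verify with a short sketch:

xs = [0, 1, 2, 3]
ps = [0.2, 0.1, 0.4, 0.3]

E = lambda g: sum(g(x) * p for x, p in zip(xs, ps))
print(E(lambda x: x))             # E(X) = 1.8
print(E(lambda x: x**2))          # E(X^2) = 4.4
print(E(lambda x: 3*x + 2*x**2))  # equals 3E(X) + 2E(X^2) = 14.2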

Example 21
If the probability density of X is given by

f(x) = \begin{cases} 2(1 − x) & \text{for } 0 < x < 1, \\ 0 & \text{elsewhere,} \end{cases}

(a) show that

E(X^r) = \frac{2}{(r + 1)(r + 2)},

(b) and use this result to evaluate

E[(2X + 1)^2].


f(x) = \begin{cases} 2(1 − x) & \text{for } 0 < x < 1, \\ 0 & \text{elsewhere,} \end{cases}

Solution.
(a)

E(X^r) = \int_{-\infty}^{\infty} x^r f(x)\,dx = \int_{0}^{1} x^r \cdot 2(1 − x)\,dx = 2 \int_{0}^{1} (x^r − x^{r+1})\,dx

= 2\left[\frac{x^{r+1}}{r + 1} − \frac{x^{r+2}}{r + 2}\right]_0^1 = 2\left(\frac{1}{r + 1} − \frac{1}{r + 2}\right) = \frac{2}{(r + 1)(r + 2)}.

E(X^r) = \frac{2}{(r + 1)(r + 2)}

(b) Since

E[(2X + 1)^2] = E[4X^2 + 4X + 1] = 4E(X^2) + 4E(X) + 1,

and substitution of r = 1 and r = 2 into the preceding formula
yields

E(X) = \frac{2}{(2)(3)} = \frac{1}{3} \quad \text{and} \quad E(X^2) = \frac{2}{(3)(4)} = \frac{1}{6},

we get

E[(2X + 1)^2] = 4 \cdot \frac{1}{6} + 4 \cdot \frac{1}{3} + 1 = 3.

Moments

Definition 22
The rth moment about the origin of a random variable X,
denoted by \mu'_r, is the expected value of X^r; symbolically,

\mu'_r = E(X^r) = \sum_x x^r f(x)

for r = 0, 1, 2, ··· when X is discrete, and

\mu'_r = E(X^r) = \int_{-\infty}^{\infty} x^r f(x)\,dx

when X is continuous.

1 When r = 0, we have \mu'_0 = E(X^0) = E(1) = 1.
2 When r = 1, we have \mu'_1 = E(X^1) = E(X), which is just the
expected value of the random variable X, and we give it a
special symbol and a special name.

Definition 23
\mu'_1 is called the mean of the distribution of X, or simply the
mean of X, and it is denoted by μ.

The special moments we shall define next are of importance in
statistics because they serve to describe the shape of the
distribution of a random variable, that is, the shape of the graph of
its probability distribution or probability density.

Definition 24
The rth moment about the mean of a random variable X,
denoted by \mu_r, is the expected value of (X − μ)^r; symbolically,

\mu_r = E[(X − \mu)^r] = \sum_x (x − \mu)^r f(x)

for r = 0, 1, 2, ··· when X is discrete, and

\mu_r = E[(X − \mu)^r] = \int_{-\infty}^{\infty} (x − \mu)^r f(x)\,dx

when X is continuous.

1 When r = 0, we have \mu_0 = E[(X − \mu)^0] = 1.
2 When r = 1, we have
\mu_1 = E[(X − \mu)^1] = E(X − \mu) = E(X) − E(\mu) = \mu − \mu = 0
for any random variable for which μ exists.
The second moment about the mean is of special importance in
statistics because it is indicative of the spread or dispersion of the
distribution of a random variable; thus, it is given a special symbol
and a special name.
Definition 25
μ2 is called the variance of the distribution of X , or simply the
variance of X , and it is denoted by σ 2 , var(X ), or V (X ); σ, the
positive square root of the variance, is called the standard
deviation.

The variance reflects the spread or dispersion of the distribution of
a random variable: histograms of the probability distributions of
four random variables with the same mean μ = 5 but variances
equaling 5.26, 3.18, 1.66, and 0.88 show progressively tighter
concentration about the mean.

A large standard deviation indicates that the data points are far
from the mean and a small standard deviation indicates that they
are clustered closely around the mean. For example, each of the
three populations {0, 0, 14, 14}, {0, 6, 8, 14} and {6, 6, 8, 8} has a
mean of 7. Their standard deviations are 7, 5, and 1, respectively.
The third population has a much smaller standard deviation than
the other two because its values are all close to 7. The standard
deviation has the same units as the data points themselves. If, for
instance, the data set {0, 6, 8, 14} represents the ages of a
population of four siblings in years, the standard deviation is 5
years. As another example, the population {1000, 1006, 1008, 1014}
may represent the distances traveled by four athletes, measured in
meters. It has a mean of 1007 meters, and a standard deviation of
5 meters.

Theorem 26

\sigma^2 = \mu'_2 − \mu^2.

Proof.

\sigma^2 = E[(X − \mu)^2] = E(X^2 − 2X\mu + \mu^2) = E(X^2) − 2\mu \underbrace{E(X)}_{\mu} + \mu^2

= E(X^2) − 2\mu^2 + \mu^2 = \underbrace{E(X^2)}_{\mu'_2} − \mu^2 = \mu'_2 − \mu^2.

Example 27
Use Theorem 26 to calculate the variance of X , representing the
number of points rolled with a balanced die.

Solution. First we compute

\mu = E(X) = 1 \cdot \frac{1}{6} + 2 \cdot \frac{1}{6} + 3 \cdot \frac{1}{6} + 4 \cdot \frac{1}{6} + 5 \cdot \frac{1}{6} + 6 \cdot \frac{1}{6} = \frac{7}{2}.

Now,

\mu'_2 = E(X^2) = 1^2 \cdot \frac{1}{6} + 2^2 \cdot \frac{1}{6} + 3^2 \cdot \frac{1}{6} + 4^2 \cdot \frac{1}{6} + 5^2 \cdot \frac{1}{6} + 6^2 \cdot \frac{1}{6} = \frac{91}{6}

and it follows that

\sigma^2 = \mu'_2 − \mu^2 = \frac{91}{6} − \left(\frac{7}{2}\right)^2 = \frac{35}{12}.
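With exact fractions, a short sketch reproduces both moments and
the variance:

from fractions import Fraction

faces = range(1, 7)
mu  = sum(Fraction(x, 6) for x in faces)      # 7/2
mu2 = sum(Fraction(x * x, 6) for x in faces)  # 91/6
print(mu2 - mu**2)                            # 35/12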

Example 28
With reference to Example 8, find the standard deviation of the
random variable X .
Solution. In Example 8 we showed that μ = E(X) = \frac{\ln 4}{\pi}. Now,

\mu'_2 = E(X^2) = \int_{0}^{1} \frac{4x^2}{\pi(1 + x^2)}\,dx = \frac{4}{\pi} \int_{0}^{1} \left(1 − \frac{1}{1 + x^2}\right) dx

= \frac{4}{\pi} \left[x − \arctan x\right]_0^1 = \frac{4}{\pi} \left[(1 − \arctan 1) − (0 − \arctan 0)\right] = \frac{4}{\pi}\left(1 − \frac{\pi}{4}\right) = \frac{4}{\pi} − 1

and it follows that

\sigma^2 = \frac{4}{\pi} − 1 − \left(\frac{\ln 4}{\pi}\right)^2 \approx 0.0785

and \sigma \approx \sqrt{0.0785} \approx 0.2802.

Example 29
Find μ, \mu'_2, and σ² for the random variable X that has the
probability distribution f(x) = 1/2 for x = −2 and x = 2.

Solution.

\mu = E(X) = \sum_x x f(x) = (−2) \cdot \frac{1}{2} + 2 \cdot \frac{1}{2} = 0,

\mu'_2 = E(X^2) = \sum_x x^2 f(x) = (−2)^2 \cdot \frac{1}{2} + 2^2 \cdot \frac{1}{2} = 4,

\sigma^2 = \mu_2 = \mu'_2 − \mu^2 = 4 − 0^2 = 4,

or

\sigma^2 = \mu_2 = E[(X − \mu)^2] = \sum_x (x − \mu)^2 f(x) = (−2 − 0)^2 \cdot \frac{1}{2} + (2 − 0)^2 \cdot \frac{1}{2} = 4.

Example 30
Find μ, \mu'_2, and σ² for the random variable X that has the
probability density

f(x) = \begin{cases} x/2 & \text{for } 0 < x < 2, \\ 0 & \text{elsewhere.} \end{cases}

Solution.

\mu = E(X) = \int_{-\infty}^{\infty} x f(x)\,dx = \int_{0}^{2} x \cdot \frac{x}{2}\,dx = \left[\frac{x^3}{6}\right]_0^2 = \frac{4}{3},

\mu'_2 = E(X^2) = \int_{-\infty}^{\infty} x^2 f(x)\,dx = \int_{0}^{2} x^2 \cdot \frac{x}{2}\,dx = \left[\frac{x^4}{8}\right]_0^2 = 2,

\sigma^2 = \mu_2 = \mu'_2 − \mu^2 = 2 − \left(\frac{4}{3}\right)^2 = \frac{2}{9},

or

\sigma^2 = \mu_2 = E[(X − \mu)^2] = \int_{-\infty}^{\infty} (x − \mu)^2 f(x)\,dx = \int_{0}^{2} \left(x − \frac{4}{3}\right)^2 \frac{x}{2}\,dx

= \frac{1}{2} \int_{0}^{2} \left(x^3 − \frac{8}{3}x^2 + \frac{16}{9}x\right) dx = \frac{1}{2}\left[\frac{x^4}{4} − \frac{8x^3}{9} + \frac{8x^2}{9}\right]_0^2 = \frac{2}{9}.

Theorem 31
If X has the variance σ 2 , then

var(aX + b) = a2 σ 2 .

Proof.
Since

E(aX + b) = aE(X) + b

and

E[(aX + b)^2] = a^2 E(X^2) + 2ab E(X) + b^2,

we have

var(aX + b) = E[(aX + b)^2] − [E(aX + b)]^2
= a^2 E(X^2) + 2ab E(X) + b^2 − [aE(X) + b]^2
= a^2 E(X^2) + 2ab E(X) + b^2 − a^2 [E(X)]^2 − 2ab E(X) − b^2
= a^2 (E(X^2) − [E(X)]^2)
= a^2 \sigma^2.

var(aX + b) = a2 σ 2 .

1 For a = 1, we find that the addition of a constant to the
values of a random variable, resulting in a shift of all the
values of X to the left or to the right, in no way affects the
spread of its distribution.
2 For b = 0, we find that if the values of a random variable are
multiplied by a constant, the variance is multiplied by the
square of that constant, resulting in a corresponding change in
the spread of the distribution.

Theorem 32
If X and Y are independent random variables,

var (X + Y ) = var (X ) + var (Y ), var (X − Y ) = var (X ) + var (Y ).

Proof.
The proof is left as an exercise.

Example 33
Suppose that X and Y are independent, with var(X ) = 6,
E (X 2 Y 2 ) = 150, E (Y ) = 2, E (X ) = 2. Compute var(Y ).

Solution.

var(X) = 6 = E(X^2) − (E(X))^2 = E(X^2) − 2^2 \Rightarrow E(X^2) = 10.

Since X and Y are independent, we have E(XY) = E(X)E(Y),
and thus E(X^2 Y^2) = E(X^2)E(Y^2). Therefore,

E(X^2 Y^2) = 150 = E(X^2)E(Y^2) = 10 E(Y^2) \Rightarrow E(Y^2) = 15.

Also, since var(Y) = E(Y^2) − (E(Y))^2, we obtain

var(Y) = E(Y^2) − (E(Y))^2 = 15 − 2^2 = 11.



Chebyshev’s Theorem

Theorem 34 (Chebyshev's Theorem)
Suppose that X is a random variable (discrete or continuous)
having mean μ and variance σ², which are finite. Then if ε is any
positive number,

P(|X − \mu| \ge \varepsilon) \le \frac{\sigma^2}{\varepsilon^2}

or, with ε = kσ,

P(|X − \mu| \ge k\sigma) \le \frac{1}{k^2}.

Example 35
A random variable X has density function given by

f(x) = \begin{cases} 2e^{−2x}, & x \ge 0, \\ 0, & x < 0. \end{cases}

(a) Find P(|X − μ| ≥ 1).
(b) Use Chebyshev's theorem to obtain an upper bound on
P(|X − μ| ≥ 1) and compare with the result in (a).

Solution.
(a) First, we need to find μ. Integrating by parts with u = x and
dv = 2e^{−2x} dx (so du = dx and v = −e^{−2x}),

\mu = E(X) = \int_{-\infty}^{\infty} x f(x)\,dx = \int_{0}^{\infty} 2x e^{−2x}\,dx = \lim_{c \to \infty} \left( \left[−x e^{−2x}\right]_0^c + \int_{0}^{c} e^{−2x}\,dx \right)

= \lim_{c \to \infty} \left[−e^{−2x}\left(x + \frac{1}{2}\right)\right]_0^c = −\left(0 − \frac{1}{2}\right) = \frac{1}{2} = 0.5.

Since P(|X − μ| ≥ 1) = 1 − P(|X − μ| < 1) and

|X − 0.5| < 1 \Leftrightarrow −1 < X − 0.5 < 1 \Leftrightarrow −0.5 < X < 1.5,

then


P(|X − \mu| \ge 1) = 1 − P(|X − \mu| < 1) = 1 − P(−0.5 < X < 1.5)

= 1 − \int_{-0.5}^{1.5} f(x)\,dx = 1 − \int_{0}^{1.5} 2e^{−2x}\,dx = 1 + \left[e^{−2x}\right]_0^{1.5} = 1 + (e^{−3} − e^0) = e^{−3} \approx 0.0498.

(b) If we take ε = 1 in Chebyshev's theorem, we obtain

P(|X − \mu| \ge 1) \le \frac{\sigma^2}{1^2} = \sigma^2.

One can show that σ² = 0.25. So an upper bound for the given
probability is 0.25, which is quite far from the true value.
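Putting the exact answer from (a) next to the bound from (b) in a
tiny sketch:

from math import exp

var   = 0.25      # variance of the density 2e^(-2x), as stated above
exact = exp(-3)   # P(|X - mu| >= 1) from part (a)
bound = var / 1   # Chebyshev bound with epsilon = 1
print(exact, bound)  # about 0.0498 versus 0.25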

Example 36
If a random variable X is such that E[(X − 1)^2] = 10 and
E[(X − 2)^2] = 6,
(a) find the mean of X;
(b) find the standard deviation of X;
(c) find an upper bound for P(|X − 7/2| ≥ √15).

Solution.
(a) Since

E[(X − 1)^2] = E[X^2 − 2X + 1] = E[X^2] − 2E[X] + 1 = 10 \Rightarrow E[X^2] − 2E[X] = 9,
E[(X − 2)^2] = E[X^2 − 4X + 4] = E[X^2] − 4E[X] + 4 = 6 \Rightarrow E[X^2] − 4E[X] = 2,

subtracting the second equation from the first gives 2E[X] = 7, so

E[X^2] = 16 \quad \text{and} \quad \mu = E[X] = \frac{7}{2}.

(b) \sigma = \sqrt{Var(X)} = \sqrt{E[X^2] − (E[X])^2} = \sqrt{16 − \frac{49}{4}} = \frac{\sqrt{15}}{2}.

(c) If we use Chebyshev's inequality by taking μ = 7/2, ε = √15,
and σ² = 15/4, we obtain

P(|X − \mu| \ge \varepsilon) \le \frac{\sigma^2}{\varepsilon^2} \Rightarrow P\left(\left|X − \frac{7}{2}\right| \ge \sqrt{15}\right) \le \frac{15/4}{15} = \frac{1}{4}.

Moment Generating Functions


Although the moments of most distributions can be determined directly
by evaluating the necessary integrals or sums, an alternative procedure
sometimes provides considerable simplification. This technique utilizes
moment-generating functions.

Definition 37
The moment-generating function of a random variable X, where
it exists, is given by

M_X(t) = E[e^{tX}] = \sum_x e^{tx} f(x)

when X is discrete and

M_X(t) = E[e^{tX}] = \int_{-\infty}^{\infty} e^{tx} f(x)\,dx

when X is continuous.

To explain why we refer to this function as a moment-generating
function, let us substitute for e^{tx} its Maclaurin series expansion,
that is,

e^{tx} = 1 + tx + \frac{t^2 x^2}{2!} + \frac{t^3 x^3}{3!} + \cdots + \frac{t^r x^r}{r!} + \cdots.

For the discrete case, we thus get

M_X(t) = \sum_x \left(1 + tx + \frac{t^2 x^2}{2!} + \cdots + \frac{t^r x^r}{r!} + \cdots\right) f(x)

= \underbrace{\sum_x f(x)}_{1} + t \underbrace{\sum_x x f(x)}_{E[X] = \mu'_1 = \mu} + \frac{t^2}{2!} \underbrace{\sum_x x^2 f(x)}_{\mu'_2} + \cdots + \frac{t^r}{r!} \underbrace{\sum_x x^r f(x)}_{\mu'_r} + \cdots

= 1 + \mu t + \mu'_2 \frac{t^2}{2!} + \cdots + \mu'_r \frac{t^r}{r!} + \cdots

and it can be seen that in the Maclaurin series of the
moment-generating function of X the coefficient of t^r / r! is \mu'_r,
the rth moment about the origin. In the continuous case, the
argument is the same.

Example 38
Find the moment-generating function of the random variable
whose probability density is given by

f(x) = \begin{cases} e^{−x} & \text{for } x > 0, \\ 0 & \text{elsewhere} \end{cases}

and use it to find an expression for \mu'_r.



Solution. By definition,

M_X(t) = E(e^{tX}) = \int_{0}^{\infty} e^{tx} e^{−x}\,dx = \int_{0}^{\infty} e^{−x(1−t)}\,dx = \lim_{c \to \infty} \int_{0}^{c} e^{−x(1−t)}\,dx

= −\frac{1}{1 − t} \lim_{c \to \infty} \left[e^{−x(1−t)}\right]_0^c = −\frac{1}{1 − t} \lim_{c \to \infty} (e^{−c(1−t)} − e^0) = \frac{1}{1 − t} \quad \text{for } t < 1.

As is well known, when |t| < 1 the Maclaurin series for this
moment-generating function is

M_X(t) = \frac{1}{1 − t} = 1 + t + t^2 + t^3 + \cdots + t^r + \cdots = 1 + 1!\,\frac{t}{1!} + 2!\,\frac{t^2}{2!} + 3!\,\frac{t^3}{3!} + \cdots + r!\,\frac{t^r}{r!} + \cdots

and hence \mu'_r = r! for r = 0, 1, 2, ···.
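A symbolic sketch (assuming sympy is available) of how the moments
fall out of this MGF: differentiating 1/(1 − t) r times and
evaluating at t = 0 recovers μ′_r = r!, exactly as Theorem 39 below
states.

import sympy as sp

t = sp.symbols("t")
M = 1 / (1 - t)                            # the MGF found above, for t < 1
for r in range(1, 5):
    print(r, sp.diff(M, t, r).subs(t, 0))  # 1, 2, 6, 24 -- that is, r!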

Theorem 39

\left.\frac{d^r M_X(t)}{dt^r}\right|_{t=0} = \mu'_r.

This follows from the fact that if a function is expanded as a power
series in t, the coefficient of t^r / r! is the rth derivative of the
function with respect to t at t = 0.

Example 40
Given that X has the probability distribution f(x) = \frac{1}{8}\binom{3}{x} for
x = 0, 1, 2, and 3, find the moment-generating function of this
random variable and use it to determine \mu'_1 and \mu'_2.

Solution. Since

M_X(t) = E(e^{tX}) = \sum_{x=0}^{3} e^{tx} \cdot \frac{1}{8}\binom{3}{x} = \frac{1}{8}(1 + 3e^t + 3e^{2t} + e^{3t}) = \frac{1}{8}(1 + e^t)^3,

then, by Theorem 39,

\mu'_1 = M'_X(t)\big|_{t=0} = \frac{3}{8}(1 + e^t)^2 e^t \Big|_{t=0} = \frac{3}{2}

and

\mu'_2 = M''_X(t)\big|_{t=0} = \left[\frac{3}{4}(1 + e^t) e^{2t} + \frac{3}{8}(1 + e^t)^2 e^t\right]_{t=0} = 3.

Often the work involved in using moment-generating functions can
be simplified by making use of the following theorem:

Theorem 41
If a and b are constants, then
(a) M_{X+a}(t) = E[e^{(X+a)t}] = e^{at} M_X(t);
(b) M_{bX}(t) = E(e^{bXt}) = M_X(bt);
(c) M_{\frac{X+a}{b}}(t) = E\left[e^{\left(\frac{X+a}{b}\right)t}\right] = e^{\frac{a}{b}t} M_X\!\left(\frac{t}{b}\right).

Proof.
The proof is left as an exercise.

Product Moments

Definition 42
The rth and sth product moment about the origin of the
random variables X and Y, denoted by \mu'_{r,s}, is the expected value
of X^r Y^s; symbolically,

\mu'_{r,s} = E(X^r Y^s) = \sum_x \sum_y x^r y^s f(x, y)

for r = 0, 1, 2, ··· and s = 0, 1, 2, ··· when X and Y are discrete.

The double summation extends over the entire joint range of the
two random variables. Note that \mu'_{1,0} = E(X), which we denote
here by \mu_X, and that \mu'_{0,1} = E(Y), which we denote here by \mu_Y.

Definition 43
The rth and sth product moment about the means of the
random variables X and Y, denoted by \mu_{r,s}, is the expected value
of (X − \mu_X)^r (Y − \mu_Y)^s; symbolically,

\mu_{r,s} = E[(X − \mu_X)^r (Y − \mu_Y)^s] = \sum_x \sum_y (x − \mu_X)^r (y − \mu_Y)^s f(x, y)

for r = 0, 1, 2, ··· and s = 0, 1, 2, ··· when X and Y are discrete.

In statistics, \mu_{1,1} is of special importance because it is indicative
of the relationship, if any, between the values of X and Y; thus it
is given a special symbol and a special name.

Definition 44
μ1,1 is called the covariance of X and Y , and it is denoted by
σXY , cov (X , Y ), or C (X , Y ).

Observe that if there is a high probability that large values of X
will go with large values of Y and small values of X with small
values of Y, the covariance will be positive; if there is a high
probability that large values of X will go with small values of Y,
and vice versa, the covariance will be negative. (Picture the plane
divided into four regions by the lines x = \mu_X and y = \mu_Y: the
product (x − \mu_X)(y − \mu_Y) is positive in the upper-right and
lower-left regions and negative in the other two.) It is in this sense
that the covariance measures the relationship, or association,
between the values of X and Y.

Theorem 45

\sigma_{XY} = \mu'_{1,1} − \mu_X \mu_Y.

Proof.
Using the various theorems about expected values, we can write

\sigma_{XY} = E[(X − \mu_X)(Y − \mu_Y)]
= E[XY − X\mu_Y − Y\mu_X + \mu_X \mu_Y]
= E(XY) − \mu_Y E(X) − \mu_X E(Y) + \mu_X \mu_Y
= E(XY) − \mu_Y \mu_X − \mu_X \mu_Y + \mu_X \mu_Y
= \mu'_{1,1} − \mu_X \mu_Y.

Example 46
We know that the joint and marginal probabilities of X and Y,
the numbers of aspirin and sedative caplets among two caplets
drawn at random from a bottle containing three aspirin, two
sedative, and four laxative caplets, were recorded as follows (blank
cells are zero):

y \ x    0       1       2       h(y)
0        6/36    12/36   3/36    21/36
1        8/36    6/36            14/36
2        1/36                    1/36
g(x)     15/36   18/36   3/36    1

Find the covariance of X and Y.



Solution. Referring to the joint probabilities given in the table, we
get

\mu'_{1,1} = E(XY) = \sum_{x=0}^{2} \sum_{y=0}^{2} x y f(x, y) = 0 \cdot 0 \cdot \frac{6}{36} + 1 \cdot 0 \cdot \frac{12}{36} + 2 \cdot 0 \cdot \frac{3}{36} + 0 \cdot 1 \cdot \frac{8}{36} + 1 \cdot 1 \cdot \frac{6}{36} + 0 \cdot 2 \cdot \frac{1}{36} = \frac{6}{36}

and using the marginal probabilities, we get

\mu_X = E(X) = \sum_{x=0}^{2} x\,g(x) = 0 \cdot \frac{15}{36} + 1 \cdot \frac{18}{36} + 2 \cdot \frac{3}{36} = \frac{24}{36}

and

\mu_Y = E(Y) = \sum_{y=0}^{2} y\,h(y) = 0 \cdot \frac{21}{36} + 1 \cdot \frac{14}{36} + 2 \cdot \frac{1}{36} = \frac{16}{36}.

It follows that

\sigma_{XY} = \mu'_{1,1} − \mu_X \mu_Y = \frac{6}{36} − \frac{24}{36} \cdot \frac{16}{36} = −\frac{7}{54}.

The negative result suggests that the more aspirin tablets we get,
the fewer sedative tablets we will get, and vice versa, and this, of
course, makes sense.
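A sketch computing the covariance straight from the joint table,
with exact fractions (cells not listed are zero):

from fractions import Fraction as F

f = {(0, 0): F(6, 36), (1, 0): F(12, 36), (2, 0): F(3, 36),
     (0, 1): F(8, 36), (1, 1): F(6, 36),
     (0, 2): F(1, 36)}

E = lambda g: sum(g(x, y) * p for (x, y), p in f.items())
mu11 = E(lambda x, y: x * y)  # 1/6
muX  = E(lambda x, y: x)      # 2/3
muY  = E(lambda x, y: y)      # 4/9
print(mu11 - muX * muY)       # -7/54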

Example 47
A quarter is bent so that the probabilities of heads and tails are
0.4 and 0.6. If it is tossed twice, what is the covariance of Z, the
number of heads obtained on the first toss, and W, the total
number of heads obtained in the two tosses of the coin?

Solution.

f(0, 0) = P(T, T) = 0.6 · 0.6 = 0.36,  f(1, 1) = P(H, T) = 0.4 · 0.6 = 0.24,
f(0, 1) = P(T, H) = 0.6 · 0.4 = 0.24,  f(1, 2) = P(H, H) = 0.4 · 0.4 = 0.16.

z \ w    0       1       2       g(z)
0        0.36    0.24            0.60
1                0.24    0.16    0.40
h(w)     0.36    0.48    0.16    1

\sigma_{ZW} = \mu'_{1,1} − \mu_Z \mu_W = E(ZW) − E(Z)E(W).

\mu'_{1,1} = E(ZW) = \sum_{z=0}^{1} \sum_{w=0}^{2} z w f(z, w) = 0 \cdot 0 \cdot 0.36 + 0 \cdot 1 \cdot 0.24 + 1 \cdot 1 \cdot 0.24 + 1 \cdot 2 \cdot 0.16 = 0.56,

\mu_Z = E(Z) = \sum_{z=0}^{1} z\,g(z) = 0 \cdot 0.60 + 1 \cdot 0.40 = 0.40,

\mu_W = E(W) = \sum_{w=0}^{2} w\,h(w) = 0 \cdot 0.36 + 1 \cdot 0.48 + 2 \cdot 0.16 = 0.80,

so

\sigma_{ZW} = 0.56 − 0.40 \cdot 0.80 = 0.24.

As far as the relation between X and Y is concerned, observe that
if X and Y are independent, their covariance is zero; symbolically:

Theorem 48
If X and Y are independent, then \sigma_{XY} = 0.

Proof.
If X and Y are independent, then we know that
E(XY) = E(X)E(Y). Since

\sigma_{XY} = \mu'_{1,1} − \mu_X \mu_Y = E(XY) − E(X)E(Y),

it follows that

\sigma_{XY} = E(X)E(Y) − E(X)E(Y) = 0.

It is of interest to note that the independence of two random
variables implies a zero covariance, but a zero covariance does not
necessarily imply their independence. This is illustrated by the
following example.
Example 49
If the joint distribution of X and Y is given by

y \ x    −1     0      1      h(y)
−1       1/6    2/6    1/6    4/6
0        0      0      0      0
1        1/6    0      1/6    2/6
g(x)     2/6    2/6    2/6    1

show that their covariance is zero even though the two random
variables are not independent.

Solution. Using the probabilities shown in the margins, we get

\mu_X = \sum_{x=−1}^{1} x\,g(x) = (−1) \cdot \frac{2}{6} + 0 \cdot \frac{2}{6} + 1 \cdot \frac{2}{6} = 0,

\mu_Y = \sum_{y=−1}^{1} y\,h(y) = (−1) \cdot \frac{4}{6} + 0 \cdot 0 + 1 \cdot \frac{2}{6} = −\frac{2}{6},

\mu'_{1,1} = \sum_{x=−1}^{1} \sum_{y=−1}^{1} x y f(x, y) = (−1)(−1) \cdot \frac{1}{6} + 1 \cdot (−1) \cdot \frac{1}{6} + (−1) \cdot 1 \cdot \frac{1}{6} + 1 \cdot 1 \cdot \frac{1}{6} = 0.

Thus \sigma_{XY} = 0 − 0 \cdot \left(−\frac{2}{6}\right) = 0: the covariance is zero, but the
two random variables are not independent. For instance,
f(x, y) \ne g(x)h(y) for x = −1 and y = −1.

Theorem 50
Var(X ± Y) = Var(X) + Var(Y) ± 2Cov(X, Y), or

\sigma^2_{X \pm Y} = \sigma^2_X + \sigma^2_Y \pm 2\sigma_{XY}.

Proof.

Var(X ± Y) = E[((X ± Y) − (\mu_X ± \mu_Y))^2]
= E[((X − \mu_X) ± (Y − \mu_Y))^2]
= E[(X − \mu_X)^2 + (Y − \mu_Y)^2 ± 2(X − \mu_X)(Y − \mu_Y)]
= E[(X − \mu_X)^2] + E[(Y − \mu_Y)^2] ± 2E[(X − \mu_X)(Y − \mu_Y)]
= Var(X) + Var(Y) ± 2Cov(X, Y).

Conditional Expectation

Definition 51
If X is a discrete random variable and f(x|y) is the value of the
conditional probability distribution of X given Y = y at x, the
conditional expectation of u(X) given Y = y is

E[u(X)|y] = \sum_x u(x) f(x|y).

Similar expressions based on the conditional probability
distribution of Y given X = x define the conditional expectation
of v(Y) given X = x.

Definition 52
If we let u(X) = X in Definition 51, we obtain the conditional
mean of the random variable X given Y = y, which we denote by

\mu_{X|y} = E(X|y) = \sum_x x f(x|y).

Definition 53
Correspondingly, the conditional variance of X given Y = y is

\sigma^2_{X|y} = E[(X − \mu_{X|y})^2 \,|\, y] = E(X^2|y) − \mu^2_{X|y},

where E(X^2|y) is given by Definition 51 with u(X) = X^2.



Example 54
With reference to Example 46, find the conditional mean of X
given Y = 1.

Solution. Making use of the results obtained before, that is,
f(0|1) = 4/7, f(1|1) = 3/7, and f(2|1) = 0, we get

E(X|1) = \sum_{x=0}^{2} x f(x|1) = 0 \cdot \frac{4}{7} + 1 \cdot \frac{3}{7} + 2 \cdot 0 = \frac{3}{7}.

Example 55
Suppose that we have the following joint probability distribution
(blank cells are zero):

y \ x    1       2       3       4       h(y)
1        0.01    0.07    0.09    0.03    0.20
2        0.20            0.05    0.25    0.50
3        0.09    0.03    0.06    0.12    0.30
g(x)     0.30    0.10    0.20    0.40    1

(a) Find the conditional expectation of u(X) = X^2 given Y = 2.
(b) Find the conditional mean of X given Y = 2.
(c) Find the conditional variance of X given Y = 2.

Solution. Since f(x|y) = \frac{f(x, y)}{h(y)}, we have
f(x|2) = \frac{f(x, 2)}{h(2)} = \frac{f(x, 2)}{0.5} = 2f(x, 2), and hence

f(x|2) = \begin{cases} 2f(1, 2) = 0.40, & x = 1, \\ 2f(2, 2) = 0.00, & x = 2, \\ 2f(3, 2) = 0.10, & x = 3, \\ 2f(4, 2) = 0.50, & x = 4. \end{cases}

(a) The conditional expectation of u(x) = x^2 given Y = 2 is

E[X^2 | Y = 2] = \sum_{x=1}^{4} x^2 f(x|2) = 1^2 f(1|2) + 2^2 f(2|2) + 3^2 f(3|2) + 4^2 f(4|2)
= 1(0.40) + 4(0.00) + 9(0.10) + 16(0.50) = 9.30.

(b) The conditional mean of X given Y = 2 is

E[X | Y = 2] = \mu_{X|2} = \sum_{x=1}^{4} x f(x|2) = 1 f(1|2) + 2 f(2|2) + 3 f(3|2) + 4 f(4|2)
= 1(0.40) + 2(0.00) + 3(0.10) + 4(0.50) = 2.70.

(c) The conditional variance of X given Y = 2 is

E[(X − \mu_{X|2})^2 | 2] = \sigma^2_{X|2} = \sum_{x=1}^{4} (x − 2.7)^2 f(x|2)
= (1 − 2.7)^2 f(1|2) + (2 − 2.7)^2 f(2|2) + (3 − 2.7)^2 f(3|2) + (4 − 2.7)^2 f(4|2)
= 2.89(0.40) + 0.49(0.00) + 0.09(0.10) + 1.69(0.50) = 2.01,

or simply

\sigma^2_{X|2} = E(X^2 | Y = 2) − \mu^2_{X|2} = 9.30 − (2.70)^2 = 2.01.
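A sketch reproducing parts (b) and (c): condition the joint table
on Y = 2, then compute the conditional mean and variance of X
(floating point prints roughly 2.01).

f2 = {1: 0.20, 2: 0.00, 3: 0.05, 4: 0.25}  # the y = 2 row of the table
h2 = sum(f2.values())                      # marginal h(2) = 0.50

cond = {x: p / h2 for x, p in f2.items()}  # f(x | 2) = f(x, 2) / h(2)
mean = sum(x * p for x, p in cond.items())      # 2.70
ex2  = sum(x * x * p for x, p in cond.items())  # 9.30
print(mean, ex2 - mean**2)                      # 2.7 and about 2.01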

Thank You!!!
