Outline
1 Introduction
2 The Expected Value of a Random Variable
3 Moments
4 Chebyshev’s Theorem
5 Moment Generating Functions
6 Product Moments
7 Conditional Expectation
Introduction The Expected Value of a Random Variable Moments Chebyshev’s Theorem Moment Generating Functions Product
Introduction
Definition 1
If X is a discrete random variable and f(x) is the value of its
probability distribution at x, the expected value of X is
E(X) = Σ_x x f(x).
Example 2
Imagine a game in which, on any play, a player has a 20% chance
of winning $3 and an 80% chance of losing $1. The probability
distribution function of the random variable X, the amount won or
lost on a single play, is
x      $3    −$1
f(x)   0.2   0.8
so E(X) = 3·0.2 + (−1)·0.8 = −0.2: an expected loss of 20 cents per play.
What does “in the long run” mean? If you play, are you
guaranteed to lose no more than 20 cents?
Solution. If you play once and lose, you lose a full $1. An
expected loss of 20 cents means that if you played the game over
and over again, the average of your $3 winnings and your $1 losses
would work out to a 20-cent loss per play. “In the long run” means
that you can’t draw conclusions from one or two plays, but rather
from thousands and thousands of plays.
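This long-run behavior is easy to check numerically. Below is a minimal simulation sketch (not part of the original slides; the helper name `average_winnings` is ours):

```python
import random

def average_winnings(n_plays, seed=0):
    """Average payoff per play: win $3 with probability 0.2, else lose $1."""
    rng = random.Random(seed)
    total = sum(3 if rng.random() < 0.2 else -1 for _ in range(n_plays))
    return total / n_plays

# Exact expected value from the distribution: a 20-cent loss per play.
expected = 3 * 0.2 + (-1) * 0.8

# As the number of plays grows, the simulated average approaches -0.2.
for n in (10, 1_000, 100_000):
    print(n, average_winnings(n))
```

With only 10 plays the average can land far from −0.2; with 100,000 plays it is reliably close, which is exactly what “in the long run” means.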
Example 3
A lot of 12 television sets includes 2 with white cords. If three of
the sets are chosen at random for shipment to a hotel, how many
sets with white cords can the shipper expect to send to the hotel?
x 0 1 2
f (x) 6/11 9/22 1/22
Now,
E(X) = Σ_{x=0}^{2} x f(x) = 0·(6/11) + 1·(9/22) + 2·(1/22) = 1/2.
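The probabilities 6/11, 9/22, and 1/22 come from sampling without replacement (a hypergeometric distribution). A short sketch of our own, assuming only the lot sizes stated in the example, confirms both the table and the expectation:

```python
from math import comb

def f(x, white=2, total=12, drawn=3):
    """P(x of the 3 shipped sets have white cords), sampling without replacement."""
    return comb(white, x) * comb(total - white, drawn - x) / comb(total, drawn)

probs = {x: f(x) for x in range(3)}      # 6/11, 9/22, 1/22
expected = sum(x * p for x, p in probs.items())
print(probs, expected)                   # the expectation is 1/2
```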
Example 4
A saleswoman has been offered a new job with a fixed salary of
$290. Her records from her present job show that her weekly
commissions have the following probabilities:
Example 5
Dan, Eric, and Nick decide to play a fun game. Each player puts
$2 on the table and secretly writes down either heads or tails.
Then one of them tosses a fair coin. The $6 on the table is divided
evenly among the players who correctly predicted the outcome of
the coin toss. If everyone guessed incorrectly, then everyone takes
their money back. After many repetitions of this game, Dan has
lost a lot of money, more than can be explained by bad luck.
What’s going on?
[Tree diagram redrawn as a table: each row is one of the eight equally
likely combinations of correct (Y) and incorrect (N) guesses.]

Dan right?  Eric right?  Nick right?  Dan’s payoff  probability
    Y           Y            Y            $0           1/8
    Y           Y            N            $1           1/8
    Y           N            Y            $1           1/8
    Y           N            N            $4           1/8
    N           Y            Y           −$2           1/8
    N           Y            N           −$2           1/8
    N           N            Y           −$2           1/8
    N           N            N            $0           1/8
In the payoff column, we are accounting for the fact that Dan has
to put in $2 just to play. So, for example, if he guesses correctly
and Eric and Nick are wrong, then he takes all $6 on the table, but
his net profit is only $4. Working from the tree diagram, Dan’s
expected payoff is:
E(payoff) = 0·(1/8) + 1·(1/8) + 1·(1/8) + 4·(1/8)
          + (−2)·(1/8) + (−2)·(1/8) + (−2)·(1/8) + 0·(1/8)
          = 0.
So the game is perfectly fair! Over time, he should neither win nor
lose money.
Let’s say that Nick and Eric are collaborating; in particular, they
always make opposite guesses. So our assumption that each player is
correct independently is wrong; actually the events that Nick is
correct and Eric is correct are mutually exclusive! As a result, Dan
can never win all the money on the table. When he guesses
correctly, he always has to split his winnings with someone else.
This lowers his overall expectation, as the corrected tree diagram
below shows:
[Corrected tree diagram as a table: Eric and Nick always guess
opposite, so exactly one of them is right on every toss.]

Dan right?  Eric right?  Nick right?  Dan’s payoff  probability
    Y           Y            Y            $0            0
    Y           Y            N            $1           1/4
    Y           N            Y            $1           1/4
    Y           N            N            $4            0
    N           Y            Y           −$2            0
    N           Y            N           −$2           1/4
    N           N            Y           −$2           1/4
    N           N            N            $0            0
From this revised tree diagram, we can work out Dan’s actual
expected payoff
E(payoff) = 0·0 + 1·(1/4) + 1·(1/4) + 4·0
          + (−2)·0 + (−2)·(1/4) + (−2)·(1/4) + 0·0
          = −1/2.
So he loses an average of a half-dollar per game!
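A quick simulation (our own sketch, not from the slides) reproduces both expectations: roughly $0 per game when all three guess independently, and roughly −$0.50 per game when Nick always guesses the opposite of Eric:

```python
import random

def dans_payoff(rng, collude):
    """Dan's net payoff for one round (his $2 stake is accounted for)."""
    dan, eric = rng.choice("HT"), rng.choice("HT")
    # Colluding: Nick always guesses the opposite of Eric.
    nick = ("T" if eric == "H" else "H") if collude else rng.choice("HT")
    coin = rng.choice("HT")
    winners = [g for g in (dan, eric, nick) if g == coin]
    if not winners:                 # everyone wrong: stakes are returned
        return 0.0
    share = 6.0 / len(winners)      # the $6 pot is split among correct guessers
    return (share if dan == coin else 0.0) - 2.0

def average(n, collude, seed=1):
    rng = random.Random(seed)
    return sum(dans_payoff(rng, collude) for _ in range(n)) / n

print(average(200_000, collude=False))   # close to 0
print(average(200_000, collude=True))    # close to -0.5
```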
Example 6
Consider the following game in which two players each spin a fair
wheel in turn
Player A’s wheel has the numbers 2, 4, and 12 in equal sectors
(probability 1/3 each); Player B’s wheel has the numbers 5 and 6
(probability 1/2 each).
The player that spins the higher number wins, and the loser has to
pay the winner the difference of the two numbers (in dollars). Who
would win more often? What is that player’s expected winnings?
Solution. Player B spins the higher number in four of the six equally
likely outcomes, so B wins more often. But letting X denote Player B’s
winnings,
E(X) = $3·(1/6) + $4·(1/6) + $1·(1/6) + $2·(1/6) + (−$7)·(1/6) + (−$6)·(1/6)
     = −$0.50.
So the player who wins more often actually loses half a dollar per game on average.
Example 7
Find the expected value of the random variable X whose
probability density is given by
f(x) = x      for 0 < x < 1,
       2 − x  for 1 ≤ x < 2,
       0      elsewhere.
Solution.
E(X) = ∫_{−∞}^{∞} x f(x) dx
     = ∫_{−∞}^{0} x·0 dx + ∫_{0}^{1} x·x dx + ∫_{1}^{2} x·(2 − x) dx + ∫_{2}^{∞} x·0 dx
     = [x³/3]₀¹ + [x² − x³/3]₁²
     = 1/3 + (2² − 2³/3) − (1 − 1/3) = 1.
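The integral can be sanity-checked numerically; here is a small midpoint-rule sketch of our own (stdlib only):

```python
def f(x):
    """Triangular density of Example 7."""
    if 0 < x < 1:
        return x
    if 1 <= x < 2:
        return 2 - x
    return 0.0

def expectation(n=100_000, lo=0.0, hi=2.0):
    """Midpoint-rule approximation of the integral of x*f(x) over [lo, hi]."""
    h = (hi - lo) / n
    return sum((lo + (i + 0.5) * h) * f(lo + (i + 0.5) * h) * h for i in range(n))

print(expectation())   # close to 1.0
```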
Example 8
Find the expected value of the random variable X whose probability
density is given by
f(x) = 4/(π(1 + x²)) for 0 < x < 1,
       0 elsewhere.
Solution.
E(X) = ∫_{−∞}^{∞} x f(x) dx = ∫₀¹ x · 4/(π(1 + x²)) dx = (2/π) ∫₀¹ 2x/(1 + x²) dx
     = (2/π) [ln(x² + 1)]₀¹ = (2/π)(ln 2 − ln 1) = ln 4/π ≈ 0.4412712003.
Theorem 9
If X is a discrete random variable and f(x) is the value of its
probability distribution at x, the expected value of g(X) is given by
E[g(X)] = Σ_x g(x) f(x).
Example 10
Roulette, introduced in Paris in 1765, is a very popular casino
game. Suppose that Hannah’s House of Gambling has a roulette
wheel containing 38 numbers: zero (0), double zero (00), and the
numbers 1, 2, 3, · · · , 36. Let X denote the number on which the
ball lands and u(x) denote the amount of money paid to the
gambler, such that:
u(x) = $5   if x = 0,
       $10  if x = 00,
       $1   if x is odd,
       $2   if x is even.
Find E[u(X)], the gambler’s expected payoff.
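Assuming all 38 numbers are equally likely (the usual model for a fair wheel; the slides do not state this explicitly), Theorem 9 makes the expected payout a quick computation:

```python
from fractions import Fraction

def u(x):
    """Payout for outcome x, where x is "0", "00", or an integer 1..36."""
    if x == "0":
        return 5
    if x == "00":
        return 10
    return 1 if x % 2 == 1 else 2    # odd pays $1, even pays $2

outcomes = ["0", "00"] + list(range(1, 37))
p = Fraction(1, 38)                  # assumption: all 38 outcomes equally likely
expected = sum(u(x) * p for x in outcomes)
print(expected, float(expected))     # 69/38, about 1.82
```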
Example 11
If X is the number of points rolled with a balanced die, find the
expected value of g (X ) = 2X 2 + 1.
E[g(X)] = E[2X² + 1] = Σ_{x=1}^{6} (2x² + 1)·(1/6)
        = (2·1² + 1)·(1/6) + (2·2² + 1)·(1/6) + ··· + (2·6² + 1)·(1/6)
        = 94/3.
Example 12
If X has the probability density
f(x) = e^{−x} for x > 0,
       0 elsewhere,
find the expected value of g(X) = e^{3X/4}.
Solution.
E[g(X)] = E[e^{3X/4}] = ∫_{−∞}^{∞} g(x) f(x) dx = lim_{c→∞} ∫₀^c e^{3x/4} e^{−x} dx
        = lim_{c→∞} ∫₀^c e^{−x/4} dx = −4 lim_{c→∞} [e^{−x/4}]₀^c
        = −4 lim_{c→∞} (e^{−c/4} − e⁰) = 4.
Theorem 13
If a and b are any constants, then
E (aX + b) = aE (X ) + b.
Proof.
Using Theorem 9 (and its continuous analogue) with g(X) = aX + b, we get
E(aX + b) = ∫_{−∞}^{∞} (ax + b) f(x) dx
          = a ∫_{−∞}^{∞} x f(x) dx + b ∫_{−∞}^{∞} f(x) dx
          = a·E(X) + b·1
          = aE(X) + b.
Corollary 14
If a is a constant, then
E(aX) = aE(X).
Corollary 15
If b is a constant, then
E(b) = b.
Theorem 16
If c₁, c₂, …, and cₙ are constants, then
E [c1 g1 (X ) + c2 g2 (X ) + · · · + cn gn (X )]
= c1 E [g1 (X )] + c2 E [g2 (X )] + · · · + cn E [gn (X )].
Theorem 18
If X and Y are any random variables, then
E(X + Y) = E(X) + E(Y).
Theorem 19
If X and Y are independent random variables, then
E (X · Y ) = E (X ) · E (Y ).
Example 20
Suppose the probability distribution of the discrete random variable
X is
x      0     1     2     3
f(x)   0.2   0.1   0.4   0.3
Find E(X), E(X²), E(4X²), and E(3X + 2X²).
Solution.
E(X) = Σ_{x=0}^{3} x f(x) = 0·0.2 + 1·0.1 + 2·0.4 + 3·0.3 = 1.8,
E(X²) = Σ_{x=0}^{3} x² f(x) = 0²·0.2 + 1²·0.1 + 2²·0.4 + 3²·0.3 = 4.4,
E(4X²) = 4E(X²) = 4·4.4 = 17.6,
E(3X + 2X²) = 3E(X) + 2E(X²) = 3·1.8 + 2·4.4 = 14.2.
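These four numbers are easy to verify mechanically; a small sketch of our own using the distribution table:

```python
dist = {0: 0.2, 1: 0.1, 2: 0.4, 3: 0.3}

def E(g):
    """Expected value of g(X) under the distribution above (Theorem 9)."""
    return sum(g(x) * p for x, p in dist.items())

print(E(lambda x: x))                   # E(X)         = 1.8
print(E(lambda x: x * x))               # E(X^2)       = 4.4
print(E(lambda x: 4 * x * x))           # E(4X^2)      = 17.6
print(E(lambda x: 3 * x + 2 * x * x))   # E(3X + 2X^2) = 14.2
```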
Example 21
If the probability density of X is given by
f(x) = 2(1 − x) for 0 < x < 1,
       0 elsewhere,
(a) find E(X^r), and (b) use this result to evaluate E[(2X + 1)²].
Solution.
(a)
E(X^r) = ∫_{−∞}^{∞} x^r f(x) dx
       = ∫_{−∞}^{0} x^r·0 dx + ∫₀¹ x^r·2(1 − x) dx + ∫₁^∞ x^r·0 dx
       = 2 ∫₀¹ (x^r − x^{r+1}) dx = 2 [x^{r+1}/(r + 1) − x^{r+2}/(r + 2)]₀¹
       = 2 (1/(r + 1) − 1/(r + 2)) = 2/((r + 1)(r + 2)).
(b) Since E(X^r) = 2/((r + 1)(r + 2)), we have E(X) = 2/(2·3) = 1/3 and
E(X²) = 2/(3·4) = 1/6, so
E[(2X + 1)²] = E(4X² + 4X + 1) = 4E(X²) + 4E(X) + 1
             = 4·(1/6) + 4·(1/3) + 1 = 3.
Moments
Definition 22
The r-th moment about the origin of a random variable X,
denoted by μ′_r, is the expected value of X^r; symbolically,
μ′_r = E(X^r) = Σ_x x^r f(x)
when X is discrete, and
μ′_r = E(X^r) = ∫_{−∞}^{∞} x^r f(x) dx
when X is continuous.
Definition 23
μ′₁ is called the mean of the distribution function of X, or simply
the mean of X, and it is denoted by μ.
Definition 24
μ₂ = E[(X − μ)²] is called the variance of the distribution of X, or
simply the variance of X, and it is denoted by σ², var(X), or V(X);
σ, the positive square root of the variance, is called the standard
deviation of X. It is given by
σ² = Σ_x (x − μ)² f(x)
when X is discrete, and
σ² = ∫_{−∞}^{∞} (x − μ)² f(x) dx
when X is continuous.
This figure shows how the variance reflects the spread or dispersion
of the distribution of a random variable. Here we show the
histograms of the probability distributions of four random variables
with the same mean μ = 5 but variances equaling 5.26, 3.18, 1.66,
and 0.88.
A large standard deviation indicates that the data points are far
from the mean and a small standard deviation indicates that they
are clustered closely around the mean. For example, each of the
three populations {0, 0, 14, 14}, {0, 6, 8, 14} and {6, 6, 8, 8} has a
mean of 7. Their standard deviations are 7, 5, and 1, respectively.
The third population has a much smaller standard deviation than
the other two because its values are all close to 7. The standard
deviation has the same units as the data points themselves. If, for
instance, the data
set {0, 6, 8, 14} represents the ages of a population of four siblings
in years, the standard deviation is 5 years. As another example, the
population {1000, 1006, 1008, 1014} may represent the distances
traveled by four athletes, measured in meters. It has a mean of
1007 meters, and a standard deviation of 5 meters.
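The three populations above can be checked with the standard library (population standard deviation, i.e. dividing by N):

```python
from statistics import mean, pstdev

populations = [[0, 0, 14, 14], [0, 6, 8, 14], [6, 6, 8, 8]]
for pop in populations:
    # pstdev is the population standard deviation (divide by N, not N - 1).
    print(pop, mean(pop), pstdev(pop))   # means are all 7; std devs 7, 5, 1
```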
Theorem 26
σ² = μ′₂ − μ².
Proof.
σ² = E[(X − μ)²] = E(X² − 2Xμ + μ²)
   = E(X²) − 2μE(X) + μ²
   = E(X²) − 2μ·μ + μ²
   = E(X²) − μ²
   = μ′₂ − μ².
Example 27
Use Theorem 26 to calculate the variance of X , representing the
number of points rolled with a balanced die.
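For Example 27, the computation via Theorem 26 can be done exactly with rationals (a sketch of our own):

```python
from fractions import Fraction

faces = range(1, 7)
p = Fraction(1, 6)                    # balanced die
mu = sum(x * p for x in faces)        # mean: 7/2
mu2 = sum(x * x * p for x in faces)   # second moment about the origin: 91/6
var = mu2 - mu**2                     # Theorem 26: sigma^2 = mu'_2 - mu^2
print(mu, mu2, var)                   # 7/2 91/6 35/12
```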
Example 28
With reference to Example 8, find the standard deviation of the
random variable X .
Solution. In Example 8 we showed that μ = E(X) = ln 4/π. Now,
μ′₂ = E(X²) = ∫₀¹ x² · 4/(π(1 + x²)) dx = (4/π) ∫₀¹ (1 − 1/(1 + x²)) dx
    = (4/π) [x − arctan x]₀¹ = (4/π) ((1 − arctan 1) − (0 − arctan 0))
    = (4/π) (1 − π/4) = 4/π − 1,
and it follows that
σ² = 4/π − 1 − (ln 4/π)² ≈ 0.078519272
and σ = √0.078519272 ≈ 0.280212905.
Example 29
Find μ, μ′₂, and σ² for the random variable X that has the
probability distribution f(x) = 1/2 for x = −2 and x = 2.
Solution.
μ = E(X) = Σ_x x f(x) = (−2)·(1/2) + 2·(1/2) = 0,
μ′₂ = E(X²) = Σ_x x² f(x) = (−2)²·(1/2) + 2²·(1/2) = 4,
σ² = μ₂ = μ′₂ − μ² = 4 − 0² = 4,
or
σ² = μ₂ = E[(X − μ)²] = Σ_x (x − μ)² f(x)
   = (−2 − 0)²·(1/2) + (2 − 0)²·(1/2) = 4.
Example 30
Find μ, μ′₂, and σ² for the random variable X that has the
probability density
f(x) = x/2 for 0 < x < 2,
       0 elsewhere.
Solution.
μ = E(X) = ∫_{−∞}^{∞} x f(x) dx = ∫₀² x·(x/2) dx = [x³/6]₀² = 4/3,
μ′₂ = E(X²) = ∫_{−∞}^{∞} x² f(x) dx = ∫₀² x²·(x/2) dx = [x⁴/8]₀² = 2,
σ² = μ₂ = μ′₂ − μ² = 2 − (4/3)² = 2/9,
or
σ² = μ₂ = E[(X − μ)²] = ∫_{−∞}^{∞} (x − μ)² f(x) dx = ∫₀² (x − 4/3)²·(x/2) dx
   = (1/2) ∫₀² (x³ − (8/3)x² + (16/9)x) dx = (1/2) [x⁴/4 − 8x³/9 + 8x²/9]₀²
   = 2/9.
Theorem 31
If X has the variance σ 2 , then
var(aX + b) = a2 σ 2 .
Proof.
Since
E(aX + b) = aE(X) + b
and
E[(aX + b)²] = a²E(X²) + 2abE(X) + b²,
we have
var(aX + b) = E[(aX + b)²] − [E(aX + b)]²
            = a²E(X²) + 2abE(X) + b² − [aE(X) + b]²
            = a²E(X²) + 2abE(X) + b² − a²[E(X)]² − 2abE(X) − b²
            = a²(E(X²) − [E(X)]²)
            = a²σ².
Theorem 32
If X and Y are independent random variables, then
var(X + Y) = var(X) + var(Y) and var(X − Y) = var(X) + var(Y).
Proof.
The proof is left as an exercise.
Example 33
Suppose that X and Y are independent, with var(X ) = 6,
E (X 2 Y 2 ) = 150, E (Y ) = 2, E (X ) = 2. Compute var(Y ).
Solution.
Since X and Y are independent, E(X²Y²) = E(X²)·E(Y²). Now
E(X²) = var(X) + [E(X)]² = 6 + 2² = 10, so E(Y²) = 150/10 = 15 and
var(Y) = E(Y²) − [E(Y)]² = 15 − 2² = 11.
Chebyshev’s Theorem
Theorem 34
If μ and σ are the mean and the standard deviation of a random
variable X, then for any positive constant ε,
P(|X − μ| ≥ ε) ≤ σ²/ε².
Example 35
A random variable X has density function given by
f(x) = 2e^{−2x}, x ≥ 0,
       0,        x < 0.
Use Chebyshev’s inequality to obtain an upper bound for
P(|X − μ| ≥ 1), and compare it with the exact probability.
Solution.
First, we need to find μ:
μ = E(X) = ∫_{−∞}^{∞} x f(x) dx = ∫₀^∞ 2x e^{−2x} dx = 1/2.
By Chebyshev’s inequality,
P(|X − μ| ≥ 1) ≤ σ²/1² = σ².
One can show that σ² = 0.25, so an upper bound for the given
probability is 0.25. This is quite far from the exact value: since
X cannot fall below μ − 1 = −1/2, the event is just X ≥ 3/2, and
P(X ≥ 3/2) = e^{−3} ≈ 0.0498.
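A sketch comparing the Chebyshev bound with the exact probability and a Monte Carlo estimate (our own check; `random.expovariate(2.0)` samples from this density):

```python
import math
import random

mu, var = 0.5, 0.25          # mean and variance of f(x) = 2e^(-2x), x >= 0
bound = var / 1**2           # Chebyshev: P(|X - mu| >= 1) <= sigma^2 / 1^2

# Exactly: X >= 0, so |X - mu| >= 1 forces X >= 1.5, and P(X >= 1.5) = e^(-3).
exact = math.exp(-3)

rng = random.Random(0)
n = 200_000
hits = sum(1 for _ in range(n) if abs(rng.expovariate(2.0) - mu) >= 1)
print(bound, exact, hits / n)   # 0.25 versus roughly 0.0498
```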
Example 36
If a random variable X is such that E [(X − 1)2 ] = 10,
E [(X − 2)2 ] = 6,
(a) Find the mean of X .
(b) Find the standard deviation of X .
(c) Find an upper bound for P(|X − 7/2| ≥ √15).
Solution.
(a) Since
E[(X − 1)²] = E[X² − 2X + 1] = E[X²] − 2E[X] + 1 = 10 ⇒ E[X²] − 2E[X] = 9,
E[(X − 2)²] = E[X² − 4X + 4] = E[X²] − 4E[X] + 4 = 6 ⇒ E[X²] − 4E[X] = 2,
then we have that
E[X²] = 16 and μ = E[X] = 7/2.
(b) σ = √var(X) = √(E[X²] − (E[X])²) = √(16 − 49/4) = √15/2.
(c) If we use Chebyshev’s inequality by taking μ = 7/2, ε = √15,
and σ² = 15/4, we obtain
P(|X − μ| ≥ ε) ≤ σ²/ε² ⇒ P(|X − 7/2| ≥ √15) ≤ (15/4)/15 = 1/4.
Definition 37
The moment-generating function of a random variable X, where
it exists, is given by
M_X(t) = E[e^{tX}] = Σ_x e^{tx} f(x)
when X is discrete, and
M_X(t) = E[e^{tX}] = ∫_{−∞}^{∞} e^{tx} f(x) dx
when X is continuous.
e^{tx} = 1 + tx + t²x²/2! + t³x³/3! + ··· + t^r x^r/r! + ··· .
For the discrete case, we thus get
M_X(t) = Σ_x [1 + tx + t²x²/2! + ··· + t^r x^r/r! + ···] f(x)
       = Σ_x f(x) + t Σ_x x f(x) + (t²/2!) Σ_x x² f(x) + ··· + (t^r/r!) Σ_x x^r f(x) + ···
       = 1 + μt + μ′₂·t²/2! + ··· + μ′_r·t^r/r! + ···
(using Σ_x f(x) = 1, Σ_x x f(x) = μ′₁ = μ, and Σ_x x^r f(x) = μ′_r),
and it can be seen that in the Maclaurin series of the moment
generating function of X the coefficient of t^r/r! is μ′_r, the r-th
moment about the origin. In the continuous case, the argument is the same.
Example 38
Find the moment generating function of the random variable
whose probability density is given by
f(x) = e^{−x} for x > 0,
       0 elsewhere.
Solution. By definition,
M_X(t) = E(e^{tX}) = ∫₀^∞ e^{tx} e^{−x} dx = ∫₀^∞ e^{−x(1−t)} dx
       = lim_{c→∞} ∫₀^c e^{−x(1−t)} dx = −(1/(1 − t)) lim_{c→∞} [e^{−x(1−t)}]₀^c
       = −(1/(1 − t)) lim_{c→∞} (e^{−c(1−t)} − e⁰) = 1/(1 − t) for t < 1.
As is well known, when |t| < 1 the Maclaurin series for this
moment generating function is
M_X(t) = 1/(1 − t) = 1 + t + t² + t³ + ··· + t^r + ···
       = 1 + 1!·(t/1!) + 2!·(t²/2!) + 3!·(t³/3!) + ··· + r!·(t^r/r!) + ··· ,
and hence μ′_r = r! for r = 0, 1, 2, ··· .
Theorem 39
d^r M_X(t)/dt^r |_{t=0} = μ′_r.
Example 40
Given that X has the probability distribution f(x) = (1/8)·C(3, x)
for x = 0, 1, 2, and 3, find the moment-generating function of this
random variable and use it to determine μ′₁ and μ′₂.
Solution. Since
M_X(t) = E(e^{tX}) = Σ_{x=0}^{3} e^{tx}·(1/8)·C(3, x)
       = (1/8)(1 + 3e^t + 3e^{2t} + e^{3t}) = (1/8)(1 + e^t)³.
Then, by Theorem 39,
μ′₁ = M′_X(t)|_{t=0} = (3/8)(1 + e^t)² e^t |_{t=0} = 3/2,
and
μ′₂ = M″_X(t)|_{t=0} = [(3/4)(1 + e^t) e^{2t} + (3/8)(1 + e^t)² e^t]|_{t=0} = 3.
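Theorem 39 can be verified numerically by differentiating the moment-generating function with central differences (our own stdlib sketch):

```python
import math

def M(t):
    """MGF from Example 40: (1 + e^t)^3 / 8."""
    return (1 + math.exp(t)) ** 3 / 8

def derivative(f, t=0.0, h=1e-4, order=1):
    """Central-difference estimate of the 1st or 2nd derivative of f at t."""
    if order == 1:
        return (f(t + h) - f(t - h)) / (2 * h)
    return (f(t + h) - 2 * f(t) + f(t - h)) / h ** 2

print(derivative(M, order=1))   # close to 3/2 (the first moment)
print(derivative(M, order=2))   # close to 3   (the second moment)
```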
Proof.
The proof is left as an exercise.
Product Moments
Definition 42
The r-th and s-th product moment about the origin of the
random variables X and Y, denoted by μ′_{r,s}, is the expected value
of X^r Y^s; symbolically,
μ′_{r,s} = E(X^r Y^s) = Σ_x Σ_y x^r y^s f(x, y).
The double summation extends over the entire joint range of the
two random variables. Note that μ′_{1,0} = E(X), which we denote
here by μ_X, and that μ′_{0,1} = E(Y), which we denote here by μ_Y.
Definition 43
The r-th and s-th product moment about the means of the
random variables X and Y, denoted by μ_{r,s}, is the expected value
of (X − μ_X)^r (Y − μ_Y)^s; symbolically,
μ_{r,s} = E[(X − μ_X)^r (Y − μ_Y)^s] = Σ_x Σ_y (x − μ_X)^r (y − μ_Y)^s f(x, y).
Definition 44
μ_{1,1} is called the covariance of X and Y, and it is denoted by
σ_{XY}, cov(X, Y), or C(X, Y).
It is in this sense that the covariance measures the relationship, or
association, between the values of X and Y.
Theorem 45
σ_{XY} = μ′_{1,1} − μ_X μ_Y.
Proof.
Using the various theorems about expected values, we can write
σ_{XY} = E[(X − μ_X)(Y − μ_Y)] = E(XY − Xμ_Y − Yμ_X + μ_X μ_Y)
       = E(XY) − μ_Y E(X) − μ_X E(Y) + μ_X μ_Y
       = E(XY) − μ_X μ_Y = μ′_{1,1} − μ_X μ_Y.
Example 46
Find the covariance of the random variables X and Y whose joint
probability distribution, with marginals g(x) and h(y), is given by

          x = 0    x = 1    x = 2    h(y)
y = 0      6/36    12/36     3/36    21/36
y = 1      8/36     6/36      0      14/36
y = 2      1/36      0        0       1/36
g(x)      15/36    18/36     3/36      1

Solution.
μ′_{1,1} = E(XY) = Σ_{x=0}^{2} Σ_{y=0}^{2} xy f(x, y)
         = 0·0·(6/36) + 1·0·(12/36) + 2·0·(3/36)
         + 0·1·(8/36) + 1·1·(6/36) + 0·2·(1/36) = 6/36.
Similarly,
μ_X = E(X) = Σ_{x=0}^{2} x g(x) = 0·(15/36) + 1·(18/36) + 2·(3/36) = 24/36
and
μ_Y = E(Y) = Σ_{y=0}^{2} y h(y) = 0·(21/36) + 1·(14/36) + 2·(1/36) = 16/36.
It follows that
σ_{XY} = μ′_{1,1} − μ_X μ_Y = 6/36 − (24/36)·(16/36) = −7/54.
The negative result suggests that the more aspirin tablets we get,
the fewer sedative tablets we will get, and vice versa, and this, of
course, makes sense.
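The whole covariance computation for Example 46 fits in a few lines of exact rational arithmetic:

```python
from fractions import Fraction as F

# Joint distribution of Example 46: f[(x, y)]; omitted cells are zero.
f = {(0, 0): F(6, 36), (1, 0): F(12, 36), (2, 0): F(3, 36),
     (0, 1): F(8, 36), (1, 1): F(6, 36),
     (0, 2): F(1, 36)}

E_XY = sum(x * y * p for (x, y), p in f.items())    # mu'_{1,1} = 1/6
E_X = sum(x * p for (x, y), p in f.items())         # mu_X = 24/36 = 2/3
E_Y = sum(y * p for (x, y), p in f.items())         # mu_Y = 16/36 = 4/9
cov = E_XY - E_X * E_Y                              # Theorem 45
print(E_XY, E_X, E_Y, cov)                          # 1/6 2/3 4/9 -7/54
```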
Example 47
A quarter is bent so that the probabilities of heads and tails are 0.4
and 0.6, respectively. If it is tossed twice, what is the covariance of
Z, the number of heads obtained on the first toss, and W, the total
number of heads obtained in the two tosses of the coin?
Solution.
Let Z′ be the number of heads on the second toss, so W = Z + Z′.
Then E(Z) = 0.4, E(W) = 0.4 + 0.4 = 0.8, and, since the tosses are
independent,
E(ZW) = E(Z² + ZZ′) = E(Z²) + E(Z)E(Z′) = 0.4 + (0.4)(0.4) = 0.56.
By Theorem 45, σ_{ZW} = E(ZW) − E(Z)E(W) = 0.56 − (0.4)(0.8) = 0.24.
Theorem 48
If X and Y are independent random variables, then
E(XY) = E(X)E(Y) and σ_{XY} = 0.
Proof.
If X and Y are independent, then we know that
E(XY) = E(X)E(Y).
Since
σ_{XY} = μ′_{1,1} − μ_X μ_Y = E(XY) − E(X)E(Y),
it follows that
σ_{XY} = E(X)E(Y) − E(X)E(Y) = 0.
Example 49
If the joint probability distribution of X and Y, with marginals g(x)
and h(y), is given by

          x = −1   x = 0   x = 1   h(y)
y = −1     1/6      2/6     1/6     4/6
y = 0       0        0       0       0
y = 1      1/6       0      1/6     2/6
g(x)       2/6      2/6     2/6      1

show that their covariance is zero even though the two random
variables are not independent.
Solution.
μ_X = Σ_x x g(x) = (−1)·(2/6) + 0·(2/6) + 1·(2/6) = 0,
μ_Y = Σ_y y h(y) = (−1)·(4/6) + 0·0 + 1·(2/6) = −2/6,
μ′_{1,1} = Σ_x Σ_y xy f(x, y)
         = (−1)(−1)·(1/6) + 1·(−1)·(1/6) + (−1)·1·(1/6) + 1·1·(1/6) = 0.
Thus, σ_{XY} = 0 − 0·(−2/6) = 0; the covariance is zero, but the two
random variables are not independent. For instance,
f(x, y) ≠ g(x)h(y) for x = −1 and y = −1.
Theorem 50
Var(X ± Y) = Var(X) + Var(Y) ± 2Cov(X, Y),
or
σ²_{X±Y} = σ²_X + σ²_Y ± 2σ_{XY}.
Proof.
Var(X ± Y)
= E[((X ± Y) − (μ_X ± μ_Y))²]
= E[(X ± Y)² − 2(X ± Y)(μ_X ± μ_Y) + (μ_X ± μ_Y)²]
= E[X² ± 2XY + Y² − 2Xμ_X ∓ 2Xμ_Y ∓ 2Yμ_X − 2Yμ_Y + μ²_X ± 2μ_X μ_Y + μ²_Y]
= E[(X − μ_X)² + (Y − μ_Y)² ± 2XY ∓ 2Xμ_Y ∓ 2Yμ_X ± 2μ_X μ_Y]
= E[(X − μ_X)²] + E[(Y − μ_Y)²] ± 2E(XY) ∓ 2μ_Y E(X) ∓ 2μ_X E(Y) ± 2μ_X μ_Y
= Var(X) + Var(Y) ± 2Cov(X, Y).
Conditional Expectation
Definition 51
If X is a discrete random variable and f(x|y) is the value of the
conditional probability distribution of X given Y = y at x, the
conditional expectation of u(X) given Y = y is
E[u(X)|y] = Σ_x u(x) f(x|y).
Definition 52
If we let u(X) = X in Definition 51, we obtain the conditional
mean of the random variable X given Y = y, which we denote by
μ_{X|y} = E(X|y) = Σ_x x f(x|y).
Definition 53
Correspondingly, the conditional variance of X given Y = y is
σ²_{X|y} = E[(X − μ_{X|y})²|y] = E(X²|y) − μ²_{X|y}.
Example 54
With reference to Example 46, find the conditional mean of X
given Y = 1.
E(X|1) = Σ_{x=0}^{2} x f(x|1) = 0·(4/7) + 1·(3/7) + 2·0 = 3/7.
Example 55
Suppose that we have the following joint probability distribution,
with marginals g(x) and h(y):

          x = 1   x = 2   x = 3   x = 4   h(y)
y = 1      0.01    0.07    0.09    0.03   0.20
y = 2      0.20    0.00    0.05    0.25   0.50
y = 3      0.09    0.03    0.06    0.12   0.30
g(x)       0.30    0.10    0.20    0.40    1

Find the conditional mean and the conditional variance of X given
Y = 2.
Solution. Since
f(x|y) = f(x, y)/h(y) ⇒ f(x|2) = f(x, 2)/h(2) = f(x, 2)/0.5 = 2f(x, 2),
then we have
f(x|2) = 2f(1, 2) = 0.40, x = 1,
         2f(2, 2) = 0.00, x = 2,
         2f(3, 2) = 0.10, x = 3,
         2f(4, 2) = 0.50, x = 4.
Now,
E[X²|Y = 2] = Σ_{x=1}^{4} x² f(x|2)
            = 1²f(1|2) + 2²f(2|2) + 3²f(3|2) + 4²f(4|2)
            = 1(0.40) + 4(0.00) + 9(0.10) + 16(0.50)
            = 9.30,
and
E[X|Y = 2] = μ_{X|2} = Σ_{x=1}^{4} x f(x|2)
           = 1f(1|2) + 2f(2|2) + 3f(3|2) + 4f(4|2)
           = 1(0.40) + 2(0.00) + 3(0.10) + 4(0.50)
           = 2.70.
Then
E[(X − μ_{X|2})²|2] = σ²_{X|2} = Σ_{x=1}^{4} (x − 2.7)² f(x|2)
 = (1 − 2.7)² f(1|2) + (2 − 2.7)² f(2|2) + (3 − 2.7)² f(3|2) + (4 − 2.7)² f(4|2)
 = 2.89(0.40) + 0.49(0.00) + 0.09(0.10) + 1.69(0.50)
 = 2.01,
or simply
σ²_{X|2} = E(X²|2) − μ²_{X|2} = 9.30 − (2.70)² = 2.01.
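The same conditional mean and variance can be computed directly from the f(x, 2) column of the table (our own sketch; f(2, 2) = 0.00 as the marginals imply):

```python
f_x2 = {1: 0.20, 2: 0.00, 3: 0.05, 4: 0.25}   # joint column f(x, 2)
h2 = sum(f_x2.values())                        # marginal h(2) = 0.50

cond = {x: p / h2 for x, p in f_x2.items()}    # f(x|2) = f(x, 2) / h(2)
cmean = sum(x * p for x, p in cond.items())
cvar = sum((x - cmean) ** 2 * p for x, p in cond.items())
print(cond)           # conditional distribution of X given Y = 2
print(cmean, cvar)    # close to 2.7 and 2.01
```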
Thank You!!!