
IE 231 - Chapter 4 - Mathematical Expectation

İstanbul Bilgi University, Faculty of Engineering and Natural Sciences

Table of Contents

Introduction

The Expected Value of a Random Variable

The Variance of a Random Variable

Covariance

Expected Values of Linear Combinations of Random Variables


Introduction

Mathematical Expectation
Example
A roulette wheel has 38 equally likely outcomes. A winning bet placed on a single number pays 35 to 1 (35 times your bet, and your bet is returned too). Then the expected value of the profit resulting from a dollar bet is

E(profit) = 35(1/38) + (−1)(37/38) = −2/38 = −1/19 ≈ −$0.053.
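As a quick sanity check (not part of the original slides), here is a short Monte Carlo sketch in Python; the per-spin profit values are taken from the example above:

```python
# Sketch: simulate $1 single-number roulette bets and average the profit.
import random

random.seed(1)
n = 1_000_000
# Win $35 with probability 1/38 (our number comes up), lose $1 otherwise.
total = sum(35 if random.randrange(38) == 0 else -1 for _ in range(n))
print(total / n)  # should be close to -1/19 ≈ -0.0526
```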

Example
The expected value of the roll of a fair die is

E(X) = (1 + 2 + 3 + 4 + 5 + 6)/6 = 3.5.

Note: The expected value itself may be an unlikely or even impossible outcome; a die never shows 3.5.


The Expected Value of a Random Variable


Definition
Let X be a discrete r.v. and f(x) be its p.m.f. The expected value of X is

E(X) = \sum_x x f(x).

Let Y be a continuous r.v. and g(y) be its p.d.f. The expected value of Y is

E(Y) = \int_{-\infty}^{\infty} y g(y) \, dy.
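A minimal numerical sketch of both cases of the definition, assuming Python with scipy available (the die p.m.f. and the p.d.f. 2(1 − y) are illustrative choices; the latter reappears in a later example):

```python
# Sketch: E(X) as a sum for a discrete r.v., as an integral for a continuous one.
from scipy.integrate import quad

# Discrete: fair die, f(x) = 1/6 for x = 1, ..., 6
E_die = sum(x * (1 / 6) for x in range(1, 7))
print(E_die)  # 3.5

# Continuous: g(y) = 2(1 - y) on (0, 1)
E_Y, _ = quad(lambda y: y * 2 * (1 - y), 0, 1)
print(E_Y)  # 1/3
```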

The expected value of a random variable is typically denoted by the Greek letter µ:

µ = E(X)

Sometimes we use µ_X, µ_Y, etc. in order to specify the random variable:

µ_X = E(X),  µ_Y = E(Y)

The expected value of a random variable is also referred to as the mean of the random variable.

Note: Sometimes we are interested not only in the expected value of a r.v. X, but also in the expected values of r.v.'s related to X.

Theorem
Let X be a discrete r.v. with p.m.f. f(x), and let g(X) be a function of X. Then

E[g(X)] = \sum_x g(x) f(x).

Let Y be a continuous r.v. with p.d.f. g(y), and let h(Y) be a function of Y. Then

E[h(Y)] = \int_{-\infty}^{\infty} h(y) g(y) \, dy.

Example
Let X have p.m.f.

f(x) = \begin{cases} x/6 & x = 1, 2, 3 \\ 0 & \text{otherwise} \end{cases}

E(X) = 1(1/6) + 2(2/6) + 3(3/6) = 14/6 = 7/3

E(X^3) = 1(1/6) + 8(2/6) + 27(3/6) = 98/6 = 49/3
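A one-line check of these two sums (a sketch, assuming Python):

```python
# Sketch: verify E(X) = 7/3 and E(X^3) = 49/3 for f(x) = x/6, x = 1, 2, 3.
f = {1: 1/6, 2: 2/6, 3: 3/6}
print(sum(x * p for x, p in f.items()))      # 7/3 ≈ 2.3333
print(sum(x**3 * p for x, p in f.items()))   # 49/3 ≈ 16.3333
```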

Example
Let X have p.d.f.

f(x) = \begin{cases} 2(1 - x) & 0 < x < 1 \\ 0 & \text{otherwise} \end{cases}

E(X) = \int_0^1 x \cdot 2(1 - x) \, dx = 1/3

E(X^2) = \int_0^1 x^2 \cdot 2(1 - x) \, dx = 1/6

E(6X + 3X^2) = 6(1/3) + 3(1/6) = 5/2
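The same integrals evaluated numerically (a sketch, assuming scipy is available):

```python
# Sketch: check E(X), E(X^2), and E(6X + 3X^2) for f(x) = 2(1 - x) on (0, 1).
from scipy.integrate import quad

pdf = lambda x: 2 * (1 - x)
E_X, _   = quad(lambda x: x * pdf(x), 0, 1)
E_X2, _  = quad(lambda x: x**2 * pdf(x), 0, 1)
E_lin, _ = quad(lambda x: (6 * x + 3 * x**2) * pdf(x), 0, 1)
print(E_X, E_X2, E_lin)  # 1/3, 1/6, 5/2
```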

Theorem (Linearity)
If a and b are constants, then

E(aX + b) = aE(X) + b.

Proof.
In the discrete case, E(aX + b) = \sum_x (ax + b) f(x) = a \sum_x x f(x) + b \sum_x f(x) = aE(X) + b, since \sum_x f(x) = 1. The continuous case is analogous.

Corollary
E(aX) = aE(X) and E(b) = b, where a and b are constants.



Theorem
If c_1, c_2, ..., c_n are constants, then

E\left[\sum_{i=1}^{n} c_i g_i(X)\right] = \sum_{i=1}^{n} c_i E[g_i(X)].

Proof.
Apply E[g(X)] = \sum_x g(x) f(x) with g = \sum_i c_i g_i and exchange the two (finite) sums.

Example
A bowl contains 5 chips: 3 are marked $1 and the remaining 2 are marked $4.

A player is blindfolded and draws two chips at random, without replacement.

The player is paid the sum of the amounts on the two chips drawn.

If it costs $4.75 to play this game, would you play?

Solution
Let W be the payout in dollars. Drawing 2 of the 5 chips gives C(5,2) = 10 equally likely pairs:

P(W = 2) = C(3,2)/10 = 3/10,  P(W = 5) = (3 · 2)/10 = 6/10,  P(W = 8) = C(2,2)/10 = 1/10.

E(W) = 2(3/10) + 5(6/10) + 8(1/10) = 4.4

Since the expected payout of $4.40 is less than the $4.75 cost, you should not play.
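A simulation sketch of the game (assuming Python; random.sample draws without replacement, matching the blindfolded draws):

```python
# Sketch: estimate the expected payout of the chip game by simulation.
import random

random.seed(1)
chips = [1, 1, 1, 4, 4]
n = 200_000
total = sum(sum(random.sample(chips, 2)) for _ in range(n))
print(total / n)  # ≈ 4.4, below the $4.75 entry cost
```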

Theorem
Let X and Y be two discrete r.v.'s and let f(x, y) be their joint p.m.f. The expected value of g(X, Y) is

E[g(X, Y)] = \sum_x \sum_y g(x, y) f(x, y).

For continuous X and Y, we have

E[g(X, Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y) f(x, y) \, dx \, dy.

Note: Generalization to higher dimensions is straightforward.



Example
Let X and Y have joint p.d.f.

f(x, y) = \begin{cases} x + y & 0 < x < 1, \; 0 < y < 1 \\ 0 & \text{otherwise} \end{cases}

Find E(XY^2).

Solution
E(XY^2) = \int_0^1 \int_0^1 x y^2 (x + y) \, dx \, dy = \int_0^1 \left(\frac{y^2}{3} + \frac{y^3}{2}\right) dy = \frac{1}{9} + \frac{1}{8} = \frac{17}{72}
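The double integral checked numerically (a sketch; scipy's dblquad integrates a function of (y, x) over the given x and y limits):

```python
# Sketch: E(XY^2) for f(x, y) = x + y on the unit square.
from scipy.integrate import dblquad

val, _ = dblquad(lambda y, x: x * y**2 * (x + y),
                 0, 1, lambda x: 0, lambda x: 1)
print(val)  # 17/72 ≈ 0.2361
```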
The Variance of a Random Variable


Definition
The variance of a random variable X, denoted by σ² or Var(X), is defined as

σ² = Var(X) = E[(X − µ)²],

where µ = E(X).

The positive square root of the variance, σ, is called the standard deviation.

If X is a discrete random variable with mean µ,

σ² = Var(X) = E[(X − µ)²] = \sum_x (x − µ)² f(x).

If X is a continuous random variable with mean µ,

σ² = Var(X) = E[(X − µ)²] = \int_{-\infty}^{\infty} (x − µ)² f(x) \, dx.
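A small sketch applying the discrete formula to a fair die (assuming Python):

```python
# Sketch: Var(X) for a fair die, straight from the definition.
mu = sum(x * (1 / 6) for x in range(1, 7))             # E(X) = 3.5
var = sum((x - mu)**2 * (1 / 6) for x in range(1, 7))  # E[(X - mu)^2]
print(mu, var)  # 3.5, 35/12 ≈ 2.9167
```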

[Figure: p.d.f.s with the same mean but different variances]

It is sometimes easier to use an alternative formula for the variance.

Theorem

σ² = E(X²) − µ².

Proof.
σ² = E[(X − µ)²] = E[X² − 2µX + µ²] = E(X²) − 2µE(X) + µ² = E(X²) − µ².

Example
Let X have p.d.f.

f(x) = \begin{cases} \frac{1}{2}(x + 1) & −1 < x < 1 \\ 0 & \text{otherwise} \end{cases}

E(X) = \int_{-1}^{1} x \cdot \frac{1}{2}(x + 1) \, dx = 1/3

Var(X) = E(X²) − [E(X)]² = 1/3 − 1/9 = 2/9
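A numerical check of this example (a sketch, assuming scipy):

```python
# Sketch: E(X) and Var(X) for f(x) = (1/2)(x + 1) on (-1, 1).
from scipy.integrate import quad

pdf = lambda x: 0.5 * (x + 1)
mu, _   = quad(lambda x: x * pdf(x), -1, 1)
E_X2, _ = quad(lambda x: x**2 * pdf(x), -1, 1)
print(mu, E_X2 - mu**2)  # 1/3, 2/9
```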

Theorem
If X has variance σ², then

Var(aX + b) = a²σ².

Proof.
Since E(aX + b) = aµ + b, we have Var(aX + b) = E[(aX + b − aµ − b)²] = a²E[(X − µ)²] = a²σ². Note that shifting by b does not change the variance.
Covariance
Definition
The covariance of two random variables X and Y, denoted by σ_XY or Cov(X, Y), is defined as

σ_XY = Cov(X, Y) = E[(X − µ_X)(Y − µ_Y)].



If X and Y are discrete random variables with means µ_X and µ_Y, we have

σ_XY = \sum_x \sum_y (x − µ_X)(y − µ_Y) f(x, y).

If X and Y are continuous random variables with means µ_X and µ_Y, we have

σ_XY = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} (x − µ_X)(y − µ_Y) f(x, y) \, dx \, dy.

Covariance is indicative of the relationship between X and Y.

- If large values of X tend to be observed with large values of Y, and small values of X with small values of Y, then Cov(X, Y) will be positive.
- In other words, if X > µ_X, then Y > µ_Y is likely, so (X − µ_X)(Y − µ_Y) is positive; if X < µ_X, then Y < µ_Y is likely, so again (X − µ_X)(Y − µ_Y) is positive.

- If large values of X tend to be observed with small values of Y, and small values of X with large values of Y, then Cov(X, Y) will be negative.
- In other words, if X > µ_X, then Y < µ_Y is likely, so (X − µ_X)(Y − µ_Y) is negative; if X < µ_X, then Y > µ_Y is likely, so again (X − µ_X)(Y − µ_Y) is negative.

Thus, the sign of Cov(X, Y) gives information about the direction of the relationship between X and Y.

Note: Covariance does not by itself give information about the strength of the relationship.

[Figure: scatter plot for Cov(X, Y) > 0, with the plane split into quadrants at (µ_X, µ_Y); the product (x − µ_X)(y − µ_Y) is negative in the upper-left and lower-right quadrants and positive in the other two]

Note: When we compute covariance, we typically use the following:

Theorem
For r.v.'s X and Y, we have

Cov(X, Y) = E(XY) − E(X)E(Y).

Proof.
Cov(X, Y) = E[XY − µ_X Y − µ_Y X + µ_X µ_Y] = E(XY) − µ_X E(Y) − µ_Y E(X) + µ_X µ_Y = E(XY) − µ_X µ_Y.
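The shortcut formula illustrated on simulated data (a sketch, assuming numpy; the linear model Y = 2X + noise is an illustrative choice with true Cov(X, Y) = 2):

```python
# Sketch: Cov(X, Y) via E(XY) - E(X)E(Y) on simulated data.
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=200_000)
y = 2 * x + rng.normal(size=200_000)   # true Cov(X, Y) = 2 Var(X) = 2
print(np.mean(x * y) - np.mean(x) * np.mean(y))  # ≈ 2
print(np.cov(x, y)[0, 1])                        # library cross-check
```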

Definition
The correlation coefficient of X and Y is defined by

ρ_XY = Cov(X, Y) / (σ_X σ_Y).

The correlation coefficient satisfies −1 ≤ ρ_XY ≤ 1. The values −1 and 1 indicate a perfect linear relationship between X and Y.

Example
Let r.v.'s X and Y have joint p.d.f.

f(x, y) = \begin{cases} x + y & 0 < x < 1, \; 0 < y < 1 \\ 0 & \text{otherwise} \end{cases}

Compute ρ_XY.
Solution
E(X) = \int_0^1 \int_0^1 x(x + y) \, dy \, dx = 7/12, and by symmetry E(Y) = 7/12.

E(XY) = \int_0^1 \int_0^1 xy(x + y) \, dy \, dx = 1/3, so Cov(X, Y) = 1/3 − (7/12)² = −1/144.

E(X²) = \int_0^1 \int_0^1 x²(x + y) \, dy \, dx = 5/12, so Var(X) = 5/12 − (7/12)² = 11/144 = Var(Y).

Hence ρ_XY = (−1/144)/(11/144) = −1/11.
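The same computation done numerically (a sketch, assuming scipy):

```python
# Sketch: rho_XY for f(x, y) = x + y on the unit square; expect -1/11.
from math import sqrt
from scipy.integrate import dblquad

I = lambda g: dblquad(g, 0, 1, lambda x: 0, lambda x: 1)[0]  # g takes (y, x)
E_X  = I(lambda y, x: x * (x + y))        # 7/12; E(Y) = 7/12 by symmetry
E_XY = I(lambda y, x: x * y * (x + y))    # 1/3
E_X2 = I(lambda y, x: x**2 * (x + y))     # 5/12
cov = E_XY - E_X * E_X
var = E_X2 - E_X**2                        # = Var(Y) by symmetry
print(cov / sqrt(var * var))               # -1/11 ≈ -0.0909
```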

Theorem
If X and Y are independent, then E(XY) = E(X)E(Y), and consequently Cov(X, Y) = ρ_XY = 0.

Proof.
Independence gives f(x, y) = f_X(x) f_Y(y), so E(XY) = \sum_x \sum_y xy f_X(x) f_Y(y) = E(X)E(Y) (the continuous case is analogous). Then Cov(X, Y) = E(XY) − E(X)E(Y) = 0.

Note:
If two r.v.'s are independent, then they are uncorrelated. The converse is not true in general.

Example
Let X ~ U[−1, 1] and Y = X². Then E(XY) = E(X³) = 0 and E(X) = 0, so Cov(X, Y) = 0, even though Y is completely determined by X.
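A simulation sketch of this example (assuming numpy): the sample covariance is near zero even though Y is a deterministic function of X.

```python
# Sketch: uncorrelated but dependent -- X ~ U[-1, 1], Y = X^2.
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, size=500_000)
y = x**2                                  # fully determined by X
print(np.cov(x, y)[0, 1])                 # ≈ 0
```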

Theorem
If X_1, X_2, ..., X_n are independent, then

E(X_1 X_2 · · · X_n) = E(X_1) E(X_2) · · · E(X_n).
Expected Values of Linear Combinations of Random Variables
Theorem
If X and Y are any two r.v.'s, and a and b are any two constants, then

1) E(aX + bY) = aE(X) + bE(Y),
2) Var(aX + bY) = a²Var(X) + b²Var(Y) + 2ab Cov(X, Y).

If X and Y are independent, then

3) Var(aX + bY) = a²Var(X) + b²Var(Y).

Proof.
1) follows from linearity of expectation. For 2), let µ = aµ_X + bµ_Y and expand: Var(aX + bY) = E[(a(X − µ_X) + b(Y − µ_Y))²] = a²Var(X) + b²Var(Y) + 2ab Cov(X, Y). 3) follows from 2) since independence implies Cov(X, Y) = 0.
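A Monte Carlo sketch of part 2) with correlated X and Y (assuming numpy; the choice Y = X + noise gives Var(Y) = 2 and Cov(X, Y) = 1):

```python
# Sketch: check Var(aX + bY) = a^2 Var(X) + b^2 Var(Y) + 2ab Cov(X, Y).
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=500_000)
y = x + rng.normal(size=500_000)          # Var(Y) = 2, Cov(X, Y) = 1
a, b = 2.0, 3.0
lhs = np.var(a * x + b * y)
rhs = a**2 * np.var(x) + b**2 * np.var(y) + 2 * a * b * np.cov(x, y)[0, 1]
print(lhs, rhs)  # both ≈ 4 + 18 + 12 = 34
```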
