Introduction to Probability, Statistics and Random Processes

Chapter 5: Joint Distributions: Two Random Variables

Hossein Pishro-Nik
University of Massachusetts, Amherst
Email: pishro@ecs.umass.edu
www.probabilitycourse.com

Chapter outline:
- Two Discrete Random Variables
  - Joint Probability Mass Function (PMF)
  - Joint Cumulative Distribution Function (CDF)
  - Conditioning and Independence
  - Functions of Two Random Variables
  - Conditional Expectation and Conditional Variance
- Two Continuous Random Variables
  - Joint Probability Density Function (PDF)
  - Joint Cumulative Distribution Function (CDF)
  - Conditioning and Independence
  - Functions of Two Continuous Random Variables
- More Topics of Two Random Variables
  - Covariance and Correlation
  - Bivariate Normal Distribution

Joint Probability Mass Function (PMF)

- We can define the joint probability mass function for two discrete random variables X and Y.

  The joint probability mass function of two discrete random variables X and Y is defined as

      P_XY(x, y) = P(X = x, Y = y).

- We have

      P_XY(x, y) = P(X = x, Y = y) = P((X = x) and (Y = y)).

- We can define the joint range for X and Y as

      R_XY = {(x, y) | P_XY(x, y) > 0}.

- If R_X = {x_1, x_2, ...} and R_Y = {y_1, y_2, ...}, then we can write

      R_XY ⊂ R_X × R_Y = {(x_i, y_j) | x_i ∈ R_X, y_j ∈ R_Y}.

- We usually define R_XY = R_X × R_Y to simplify the analysis. In this case, for some pairs (x_i, y_j) in R_X × R_Y, P_XY(x_i, y_j) might be zero.

- For two discrete random variables X and Y, we have

      Σ_{(x_i, y_j) ∈ R_XY} P_XY(x_i, y_j) = 1.

- To find P((X, Y) ∈ A) for any set A ⊂ R², we have

      P((X, Y) ∈ A) = Σ_{(x_i, y_j) ∈ A ∩ R_XY} P_XY(x_i, y_j).

Joint Probability Mass Function (PMF)

- The joint PMF contains all information about the distributions of X and Y.

- Using the law of total probability, we can obtain the PMF of X from its joint PMF with Y:

      P_X(x) = P(X = x)
             = Σ_{y_j ∈ R_Y} P(X = x, Y = y_j)
             = Σ_{y_j ∈ R_Y} P_XY(x, y_j).

- P_X(x) is called the marginal PMF of X. Similarly, we can find the marginal PMF of Y as

      P_Y(y) = Σ_{x_i ∈ R_X} P_XY(x_i, y).

- We thus have the marginal PMFs of X and Y from the joint distribution.

  Marginal PMFs of X and Y:

      P_X(x) = Σ_{y_j ∈ R_Y} P_XY(x, y_j), for any x ∈ R_X,
      P_Y(y) = Σ_{x_i ∈ R_X} P_XY(x_i, y), for any y ∈ R_Y.

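As a quick illustration (not part of the original slides), the following Python sketch stores a small, hypothetical joint PMF as a dictionary and computes both marginal PMFs by summing out the other variable, exactly as in the formulas above.

```python
# Minimal sketch (hypothetical numbers): joint PMF as a dictionary, marginals by summing.
from collections import defaultdict

# Hypothetical joint PMF P_XY(x, y); keys are (x, y) pairs in R_XY.
p_xy = {
    (0, 0): 1/6, (0, 1): 1/4,
    (1, 0): 1/8, (1, 1): 1/6,
    (2, 0): 1/4, (2, 1): 1/24,
}

assert abs(sum(p_xy.values()) - 1.0) < 1e-12  # a joint PMF must sum to 1

# Marginal PMFs: P_X(x) = sum_y P_XY(x, y), P_Y(y) = sum_x P_XY(x, y).
p_x, p_y = defaultdict(float), defaultdict(float)
for (x, y), p in p_xy.items():
    p_x[x] += p
    p_y[y] += p

print(dict(p_x))  # marginal PMF of X
print(dict(p_y))  # marginal PMF of Y
```
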
Joint Cumulative Distribution Function (CDF)

- For two random variables X and Y, we can define the joint cumulative distribution function as follows.

  The joint cumulative distribution function of two random variables X and Y is defined as

      F_XY(x, y) = P(X ≤ x, Y ≤ y).

- We can write

      F_XY(x, y) = P(X ≤ x, Y ≤ y)
                 = P((X ≤ x) and (Y ≤ y))
                 = P((X ≤ x) ∩ (Y ≤ y)).

- The definition of the joint CDF is general and applicable to all types of random variables.

- The joint CDF is the probability of an event, so 0 ≤ F_XY(x, y) ≤ 1.

- We also have
  1. F_XY(∞, ∞) = 1.
  2. F_XY(−∞, y) = 0, for any y.
  3. F_XY(x, −∞) = 0, for any x.

Joint Cumulative Distribution Function (CDF)

[Figure: F_XY(x, y) is the probability that (X, Y) belongs to the shaded region, i.e., the set of points with first coordinate at most x and second coordinate at most y. The dots are the pairs (x_i, y_j) in R_XY.]

Joint Cumulative Distribution Function (CDF)

- We can find the marginal CDFs F_X(x) and F_Y(y) from the joint CDF.

  Marginal CDFs of X and Y:

      F_X(x) = F_XY(x, ∞) = lim_{y→∞} F_XY(x, y), for any x,
      F_Y(y) = F_XY(∞, y) = lim_{x→∞} F_XY(x, y), for any y.

- We have the following lemma.

  Lemma: For two random variables X and Y, and real numbers x_1 ≤ x_2, y_1 ≤ y_2, we have

      P(x_1 < X ≤ x_2, y_1 < Y ≤ y_2)
          = F_XY(x_2, y_2) − F_XY(x_1, y_2) − F_XY(x_2, y_1) + F_XY(x_1, y_1).

- Example: Let X ∼ Bernoulli(p) and Y ∼ Bernoulli(q) be independent, where 0 < p, q < 1. Find the joint PMF and joint CDF for X and Y.

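As a small companion to this example (not from the slides), here is a hedged Python sketch that builds the joint PMF of two independent Bernoulli variables from the product rule and evaluates the joint CDF by summation; the parameter values p and q are arbitrary illustrative choices.

```python
# Sketch (illustration only): joint PMF and joint CDF of independent
# X ~ Bernoulli(p) and Y ~ Bernoulli(q).
p, q = 0.3, 0.6  # arbitrary example parameters

# By independence, P_XY(x, y) = P_X(x) * P_Y(y).
p_x = {0: 1 - p, 1: p}
p_y = {0: 1 - q, 1: q}
p_xy = {(x, y): p_x[x] * p_y[y] for x in (0, 1) for y in (0, 1)}

def joint_cdf(x, y):
    """F_XY(x, y) = P(X <= x, Y <= y), obtained by summing the joint PMF."""
    return sum(p for (xi, yj), p in p_xy.items() if xi <= x and yj <= y)

print(p_xy)
print(joint_cdf(0, 0))    # (1-p)(1-q)
print(joint_cdf(1, 0.5))  # P(X <= 1, Y <= 0.5) = 1 - q
```
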
Conditioning and Independence

- We know that P(A|B) = P(A ∩ B) / P(B), when P(B) > 0.

- For two random variables X and Y, we can write

      P(X ∈ C | Y ∈ D) = P(X ∈ C, Y ∈ D) / P(Y ∈ D), where C, D ⊂ R.

Conditional PMF and CDF

- For a discrete random variable X and an event A, the conditional PMF of X given A is defined as

      P_{X|A}(x_i) = P(X = x_i | A) = P(X = x_i and A) / P(A), for any x_i ∈ R_X.

- We define the conditional CDF of X given A as

      F_{X|A}(x) = P(X ≤ x | A).

Conditional PMF of X Given Y

- We define the conditional PMF of X given Y.

  For discrete random variables X and Y, the conditional PMFs of X given Y and vice versa are defined as

      P_{X|Y}(x_i | y_j) = P_XY(x_i, y_j) / P_Y(y_j),
      P_{Y|X}(y_j | x_i) = P_XY(x_i, y_j) / P_X(x_i),

  for any x_i ∈ R_X and y_j ∈ R_Y.

Independent Random Variables

- We can define independent random variables using joint PMFs and CDFs.

  Two discrete random variables X and Y are independent if

      P_XY(x, y) = P_X(x) P_Y(y), for all x, y.

  Equivalently, X and Y are independent if

      F_XY(x, y) = F_X(x) F_Y(y), for all x, y.

- For independent random variables, the conditional PMF is equal to the marginal PMF:

      P_{X|Y}(x_i | y_j) = P(X = x_i | Y = y_j)
                         = P_XY(x_i, y_j) / P_Y(y_j)
                         = P_X(x_i) P_Y(y_j) / P_Y(y_j)
                         = P_X(x_i).

- In other words, knowing the value of Y does not provide any information about X.

Independent Random Variables

- Example: Consider the set of points in the grid shown in the figure. These are the points in the set G defined as

      G = {(x, y) | x, y ∈ Z, |x| + |y| ≤ 2}.

  [Figure: the 13 points of G: (0,2); (-1,1), (0,1), (1,1); (-2,0), (-1,0), (0,0), (1,0), (2,0); (-1,-1), (0,-1), (1,-1); (0,-2).]

- Suppose that we pick a point (X, Y) from this grid completely at random, so each point has probability 1/13 of being chosen.
  1. Find the joint and marginal PMFs of X and Y.
  2. Find the conditional PMF of X given Y = 1.
  3. Are X and Y independent?

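As a cross-check (not from the slides), the following Python sketch enumerates the grid G, builds the uniform joint PMF, and computes the marginals, the conditional PMF of X given Y = 1, and the independence test by brute force.

```python
# Sketch (illustration only): brute-force check of the grid example.
from collections import defaultdict
from fractions import Fraction

# The 13 points of G = {(x, y) : x, y integers, |x| + |y| <= 2}.
G = [(x, y) for x in range(-2, 3) for y in range(-2, 3) if abs(x) + abs(y) <= 2]
p_xy = {pt: Fraction(1, len(G)) for pt in G}  # uniform joint PMF, 1/13 each

p_x, p_y = defaultdict(Fraction), defaultdict(Fraction)
for (x, y), p in p_xy.items():
    p_x[x] += p
    p_y[y] += p

# Conditional PMF of X given Y = 1: P_{X|Y}(x|1) = P_XY(x, 1) / P_Y(1).
cond_x_given_y1 = {x: p_xy.get((x, 1), Fraction(0)) / p_y[1] for x in p_x}

# Independence would require P_XY(x, y) = P_X(x) P_Y(y) for every pair.
independent = all(p_xy.get((x, y), Fraction(0)) == p_x[x] * p_y[y]
                  for x in p_x for y in p_y)

print(dict(p_x))        # marginal PMF of X
print(cond_x_given_y1)  # P_{X|Y}(x|1) = 1/3 for x in {-1, 0, 1}
print(independent)      # False: X and Y are not independent
```
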
Conditional Expectation

- Conditional expectation is similar to ordinary expectation.

- We can compute the conditional expectation given an event A or an observed value of a random variable Y.

  Conditional expectation of X:

      E[X | A] = Σ_{x_i ∈ R_X} x_i P_{X|A}(x_i),
      E[X | Y = y_j] = Σ_{x_i ∈ R_X} x_i P_{X|Y}(x_i | y_j).

- Example: Let X and Y be the same as in the previous (grid) example.
  1. Find E[X | Y = 1].
  2. Find E[X | −1 < Y < 2].
  3. Find E[|X| | −1 < Y < 2].

Law of Total Probability

  Law of Total Probability:

      P(X ∈ A) = Σ_{y_j ∈ R_Y} P(X ∈ A | Y = y_j) P_Y(y_j), for any set A.

  Law of Total Expectation:

  1. If B_1, B_2, B_3, ... is a partition of the sample space S,

      EX = Σ_i E[X | B_i] P(B_i).

  2. For a random variable X and a discrete random variable Y,

      EX = Σ_{y_j ∈ R_Y} E[X | Y = y_j] P_Y(y_j).

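As a numerical sanity check (not from the slides), this sketch verifies the law of total expectation EX = Σ_j E[X | Y = y_j] P_Y(y_j) on the grid example above; by symmetry both sides are 0 here.

```python
# Sketch (illustration only): checking the law of total expectation on the grid example.
from collections import defaultdict
from fractions import Fraction

G = [(x, y) for x in range(-2, 3) for y in range(-2, 3) if abs(x) + abs(y) <= 2]
p_xy = {pt: Fraction(1, len(G)) for pt in G}

p_y = defaultdict(Fraction)
for (x, y), p in p_xy.items():
    p_y[y] += p

def cond_exp_x_given(y):
    # E[X | Y = y] = sum_x x * P_XY(x, y) / P_Y(y)
    return sum(x * p for (x, yy), p in p_xy.items() if yy == y) / p_y[y]

ex_direct = sum(x * p for (x, y), p in p_xy.items())
ex_total = sum(cond_exp_x_given(y) * p_y[y] for y in p_y)
print(ex_direct, ex_total)  # both 0, as expected by symmetry
```
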
Functions of Two Random Variables

- Say we have Z = g(X, Y), where X, Y are discrete random variables and g: R² → R.

- The PMF of Z is given by

      P_Z(z) = P(g(X, Y) = z) = Σ_{(x_i, y_j) ∈ A_z} P_XY(x_i, y_j),

  where A_z = {(x_i, y_j) ∈ R_XY : g(x_i, y_j) = z}.

- We can directly use LOTUS to find E[g(X, Y)].

  Law of the unconscious statistician (LOTUS) for two discrete random variables:

      E[g(X, Y)] = Σ_{(x_i, y_j) ∈ R_XY} g(x_i, y_j) P_XY(x_i, y_j).

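To make the two formulas concrete (illustration only, with a hypothetical joint PMF and g(x, y) = x + y), the sketch below computes the PMF of Z = g(X, Y) by grouping probability over A_z, and E[g(X, Y)] directly via LOTUS.

```python
# Sketch (illustration only): PMF of Z = g(X, Y) and E[g(X, Y)] via LOTUS.
from collections import defaultdict
from fractions import Fraction

p_xy = {(0, 0): Fraction(1, 4), (0, 1): Fraction(1, 4),
        (1, 0): Fraction(1, 4), (1, 1): Fraction(1, 4)}

g = lambda x, y: x + y  # any function g: R^2 -> R

# P_Z(z) = sum of P_XY over the pairs with g(x_i, y_j) = z.
p_z = defaultdict(Fraction)
for (x, y), p in p_xy.items():
    p_z[g(x, y)] += p

# LOTUS: E[g(X, Y)] = sum over R_XY of g(x_i, y_j) P_XY(x_i, y_j).
e_g = sum(g(x, y) * p for (x, y), p in p_xy.items())

print(dict(p_z))  # {0: 1/4, 1: 1/2, 2: 1/4}
print(e_g)        # 1
```
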
Functions of Two Random Variables

- Example: Let X ∼ Geometric(p). Find EX by conditioning on the result of the first "coin toss".

- Example: Let X and Y be two independent Geometric(p) random variables, and let Z = X − Y. Find the PMF of Z.

Conditional Expectation as a Function of a Random Variable

- The conditional expectation of X given Y = y is

      E[X | Y = y] = Σ_{x_i ∈ R_X} x_i P_{X|Y}(x_i | y).

- We can think of g(y) = E[X | Y = y] as a function of the value of the random variable Y. This gives us

      g(Y) = E[X | Y].

- Thus, if Y is a random variable with range R_Y = {y_1, y_2, ...}, then E[X | Y] is also a random variable:

      E[X | Y] = E[X | Y = y_1]  with probability P(Y = y_1),
                 E[X | Y = y_2]  with probability P(Y = y_2),
                 ...

- Example: Let X = aY + b. Then we have E[X | Y] = aY + b.

- Let X and Y be two random variables and g and h be two functions. Then

      E[g(X) h(Y) | X] = g(X) E[h(Y) | X].

Conditional Expectation as a Function of a Random Variable

- Example: Consider two random variables X and Y with the joint PMF given in the table.

  Table: Joint PMF of X and Y

               Y = 0    Y = 1
      X = 0     1/5      2/5
      X = 1     2/5       0

- Let Z = E[X | Y].
  1. Find the marginal PMFs of X and Y.
  2. Find the conditional PMFs of X given Y = 0 and Y = 1, i.e., find P_{X|Y}(x|0) and P_{X|Y}(x|1).
  3. Find the PMF of Z.
  4. Find EZ, and check that EZ = EX.
  5. Find Var(Z).

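As a companion check (not from the slides), the sketch below works through this table numerically: it computes g(y) = E[X | Y = y], treats Z = g(Y) as a random variable, and confirms EZ = EX.

```python
# Sketch (illustration only): Z = E[X | Y] for the joint PMF table above.
from collections import defaultdict
from fractions import Fraction

p_xy = {(0, 0): Fraction(1, 5), (0, 1): Fraction(2, 5),
        (1, 0): Fraction(2, 5), (1, 1): Fraction(0)}

p_x, p_y = defaultdict(Fraction), defaultdict(Fraction)
for (x, y), p in p_xy.items():
    p_x[x] += p
    p_y[y] += p

# g(y) = E[X | Y = y]
g = {y: sum(x * p for (x, yy), p in p_xy.items() if yy == y) / p_y[y] for y in p_y}

# Z = g(Y) is a random variable; its PMF follows from the PMF of Y.
p_z = defaultdict(Fraction)
for y, py in p_y.items():
    p_z[g[y]] += py

ex = sum(x * p for x, p in p_x.items())
ez = sum(z * p for z, p in p_z.items())
var_z = sum(z * z * p for z, p in p_z.items()) - ez ** 2

print(dict(p_z))  # {2/3: 3/5, 0: 2/5}
print(ex, ez)     # both 2/5
print(var_z)      # 8/75
```
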
Law of Iterated Expectations

- Writing g(Y) = E[X | Y], from the law of total probability we have

      E[X] = Σ_{y_j ∈ R_Y} E[X | Y = y_j] P_Y(y_j)
           = Σ_{y_j ∈ R_Y} g(y_j) P_Y(y_j)
           = E[g(Y)]            (by LOTUS)
           = E[E[X | Y]].

  Law of Iterated Expectations: E[X] = E[E[X | Y]].

Expectation for Independent Random Variables

  If X and Y are independent random variables, then
  1. E[X | Y] = EX;
  2. E[g(X) | Y] = E[g(X)];
  3. E[XY] = EX · EY (the converse is not true);
  4. E[g(X) h(Y)] = E[g(X)] E[h(Y)].

Conditional Variance

- The conditional variance of X, Var(X | Y = y), is the variance of X in the conditional space Y = y.

- Let µ_{X|Y}(y) = E[X | Y = y]. Then

      Var(X | Y = y) = E[(X − µ_{X|Y}(y))² | Y = y]
                     = Σ_{x_i ∈ R_X} (x_i − µ_{X|Y}(y))² P_{X|Y}(x_i | y)
                     = E[X² | Y = y] − µ_{X|Y}(y)².

  Law of Total Variance:

      Var(X) = E[Var(X | Y)] + Var(E[X | Y]).

- Example: Let X_1, X_2, ... be independent and identically distributed random variables that are also independent of N, and let

      Y = Σ_{i=1}^{N} X_i.

  We have

      E[Y] = E[X] E[N],
      Var(Y) = E[N] Var(X) + (E[X])² Var(N).

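As a hedged Monte Carlo check of the two random-sum identities (not from the slides), the sketch below uses the arbitrary choices N ∼ Poisson(4) and X_i ∼ Exponential with rate 2 and compares simulation against the formulas.

```python
# Sketch (illustration only): Monte Carlo check of E[Y] = E[X]E[N] and
# Var(Y) = E[N]Var(X) + (E[X])^2 Var(N) for Y = sum_{i=1}^N X_i.
import numpy as np

rng = np.random.default_rng(0)
lam, rate, trials = 4.0, 2.0, 200_000   # hypothetical parameter choices

n = rng.poisson(lam, size=trials)
y = np.array([rng.exponential(1 / rate, size=k).sum() for k in n])

ex, var_x = 1 / rate, 1 / rate**2       # mean and variance of Exponential(rate 2)
en, var_n = lam, lam                    # mean and variance of Poisson(4)

print(y.mean(), ex * en)                        # both close to 2.0
print(y.var(), en * var_x + ex**2 * var_n)      # both close to 2.0
```
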
Conditional Variance

- Example: Let X, Y, and Z = E[X | Y] be as in the previous example. Let also V = Var(X | Y).
  1. Find the PMF of V.
  2. Find EV.
  3. Check that Var(X) = E(V) + Var(Z).

Joint Probability Density Function (PDF)

  Definition: Two random variables X and Y are jointly continuous if there exists a nonnegative function f_XY: R² → R, such that, for any set A ⊂ R², we have

      P((X, Y) ∈ A) = ∬_A f_XY(x, y) dx dy.

  The function f_XY(x, y) is called the joint probability density function (PDF) of X and Y.

- The domain of f_XY(x, y) is the entire R². We may define the range of (X, Y) as

      R_XY = {(x, y) | f_XY(x, y) > 0}.

- We have

      ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_XY(x, y) dx dy = 1.

- For small positive δ_x and δ_y, we can write

      P(x < X ≤ x + δ_x, y < Y ≤ y + δ_y) ≈ f_XY(x, y) δ_x δ_y.

- Analogous to the discrete case, we can obtain the marginal PDFs of X and Y from their joint PDF.

  Marginal PDFs:

      f_X(x) = ∫_{−∞}^{∞} f_XY(x, y) dy, for all x,
      f_Y(y) = ∫_{−∞}^{∞} f_XY(x, y) dx, for all y.

Joint Probability Density Function (PDF)

- Example: Let X and Y be two jointly continuous random variables with joint PDF

      f_XY(x, y) = x + c y²    for 0 ≤ x, y ≤ 1,
                   0           otherwise.

  1. Find the constant c.
  2. Find P(0 ≤ X ≤ 1/2, 0 ≤ Y ≤ 1/2).
  3. Find the marginal PDFs f_X(x) and f_Y(y).

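As a numerical companion to this example (not from the slides), the sketch below solves the normalization condition for c with SciPy and then evaluates the requested rectangle probability by double integration; the helper names are mine.

```python
# Sketch (illustration only): solving the example numerically with SciPy.
# Normalization gives 1/2 + c/3 = 1, so c = 3/2; the code recovers this.
from scipy.integrate import dblquad
from scipy.optimize import brentq

f = lambda x, y, c: x + c * y**2  # joint PDF on the unit square

def total_mass(c):
    # dblquad integrates func(y, x); x is the outer variable here.
    val, _ = dblquad(lambda y, x: f(x, y, c), 0, 1, 0, 1)
    return val

c = brentq(lambda c: total_mass(c) - 1.0, 0, 10)   # root of (total mass - 1)
prob, _ = dblquad(lambda y, x: f(x, y, c), 0, 0.5, 0, 0.5)

print(c)     # 1.5
print(prob)  # 3/32 = 0.09375
```
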
Joint Cumulative Distribution Function (CDF)

  The joint cumulative distribution function of two random variables X and Y is defined as

      F_XY(x, y) = P(X ≤ x, Y ≤ y).

  The joint CDF satisfies the following properties:
  1. F_X(x) = F_XY(x, ∞), for any x (marginal CDF of X);
  2. F_Y(y) = F_XY(∞, y), for any y (marginal CDF of Y);
  3. F_XY(∞, ∞) = 1;
  4. F_XY(−∞, y) = F_XY(x, −∞) = 0;
  5. P(x_1 < X ≤ x_2, y_1 < Y ≤ y_2) = F_XY(x_2, y_2) − F_XY(x_1, y_2) − F_XY(x_2, y_1) + F_XY(x_1, y_1);
  6. if X and Y are independent, then F_XY(x, y) = F_X(x) F_Y(y).

  We also have

      F_XY(x, y) = ∫_{−∞}^{y} ∫_{−∞}^{x} f_XY(u, v) du dv,

      f_XY(x, y) = ∂²F_XY(x, y) / ∂x ∂y.

- Example: Let X and Y be two independent Uniform(0, 1) random variables. Find F_XY(x, y).

- Example: Find the joint CDF for X and Y given

      f_XY(x, y) = x + c y²    for 0 ≤ x, y ≤ 1,
                   0           otherwise.

Conditioning and Independence

- For two random variables X and Y, we can write

      P(X ∈ C | Y ∈ D) = P(X ∈ C, Y ∈ D) / P(Y ∈ D), where C, D ⊂ R.

  If X is a continuous random variable, and A is the event that a ≤ X ≤ b (where possibly b = ∞ or a = −∞), then

      F_{X|A}(x) = 1                                         for x > b,
                   (F_X(x) − F_X(a)) / (F_X(b) − F_X(a))     for a ≤ x < b,
                   0                                         for x < a,

      f_{X|A}(x) = f_X(x) / P(A)    for a ≤ x < b,
                   0                otherwise.

- For a random variable X and an event A, we have the following:

      E[X | A] = ∫_{−∞}^{∞} x f_{X|A}(x) dx,
      E[g(X) | A] = ∫_{−∞}^{∞} g(x) f_{X|A}(x) dx,
      Var(X | A) = E[X² | A] − (E[X | A])².

- Example: Let X ∼ Exponential(1).
  1. Find the conditional PDF and CDF of X given X > 1.
  2. Find E[X | X > 1].
  3. Find Var(X | X > 1).

- Example: Let X and Y be two jointly continuous random variables with

      f_XY(x, y) = x²/4 + y²/4 + xy/6    for 0 ≤ x ≤ 1, 0 ≤ y ≤ 2,
                   0                      otherwise.

  For 0 ≤ y ≤ 2, find
  (a) the conditional PDF of X given Y = y;
  (b) P(X < 1/2 | Y = y).

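As a numerical check of the first example (not from the slides), the sketch below builds the conditional PDF of X ∼ Exponential(1) given A = {X > 1} from the formulas above and integrates it to obtain E[X | X > 1] and Var(X | X > 1).

```python
# Sketch (illustration only): conditioning Exponential(1) on A = {X > 1}.
import numpy as np
from scipy.integrate import quad

f_x = lambda x: np.exp(-x)                 # PDF of Exponential(1)
p_a, _ = quad(f_x, 1, np.inf)              # P(X > 1) = e^{-1}

f_x_given_a = lambda x: f_x(x) / p_a if x >= 1 else 0.0

e_x, _ = quad(lambda x: x * f_x_given_a(x), 1, np.inf)
e_x2, _ = quad(lambda x: x**2 * f_x_given_a(x), 1, np.inf)
var_x = e_x2 - e_x**2

print(p_a)    # about 0.3679 (= e^{-1})
print(e_x)    # 2.0 (by memorylessness, X | X > 1 is 1 plus an Exponential(1))
print(var_x)  # 1.0
```
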
Conditioning by another variable

  For two jointly continuous random variables X and Y, we can define the following conditional concepts:

  1. The conditional PDF of X given Y = y:

      f_{X|Y}(x|y) = f_XY(x, y) / f_Y(y).

  2. The conditional probability that X ∈ A given Y = y:

      P(X ∈ A | Y = y) = ∫_A f_{X|Y}(x|y) dx.

  3. The conditional CDF of X given Y = y:

      F_{X|Y}(x|y) = P(X ≤ x | Y = y) = ∫_{−∞}^{x} f_{X|Y}(u|y) du.

  For two jointly continuous random variables X and Y, we also have:

  1. Expected value of X given Y = y:

      E[X | Y = y] = ∫_{−∞}^{∞} x f_{X|Y}(x|y) dx.

  2. Conditional LOTUS:

      E[g(X) | Y = y] = ∫_{−∞}^{∞} g(x) f_{X|Y}(x|y) dx.

  3. Conditional variance of X given Y = y:

      Var(X | Y = y) = E[X² | Y = y] − (E[X | Y = y])².

Independent Random Variables

- Two continuous random variables X and Y are independent if

      f_XY(x, y) = f_X(x) f_Y(y), for all x, y.

  Equivalently, X and Y are independent if

      F_XY(x, y) = F_X(x) F_Y(y), for all x, y.

- If X and Y are independent, we have

      E[XY] = EX · EY,
      E[g(X) h(Y)] = E[g(X)] E[h(Y)].

Conditioning and Independence

- Example: Consider the unit disc

      D = {(x, y) | x² + y² ≤ 1}.

  Suppose that we choose a point (X, Y) uniformly at random in D. That is, the joint PDF of X and Y is given by

      f_XY(x, y) = c    for (x, y) ∈ D,
                   0    otherwise.

  (a) Find the constant c.
  (b) Find the marginal PDFs f_X(x) and f_Y(y).
  (c) Find the conditional PDF of X given Y = y, where −1 ≤ y ≤ 1.
  (d) Are X and Y independent?

Law of Total Probability

  Law of Total Probability:

      P(A) = ∫_{−∞}^{∞} P(A | X = x) f_X(x) dx.

  Law of Total Expectation (Iterated Expectation):

      E[Y] = ∫_{−∞}^{∞} E[Y | X = x] f_X(x) dx = E[E[Y | X]].

  Law of Total Variance:

      Var(Y) = E[Var(Y | X)] + Var(E[Y | X]).

- Example: Let X and Y be two independent Uniform(0, 1) random variables. Find P(X³ + Y > 1).

- Example: Suppose X ∼ Uniform(1, 2) and, given X = x, Y is an exponential random variable with parameter λ = x, so we can write

      Y | X = x ∼ Exponential(x).

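As an illustration of the continuous law of total probability (not from the slides), the sketch below computes P(X³ + Y > 1) for independent Uniform(0, 1) variables two ways: by integrating P(A | X = x) f_X(x) and by a Monte Carlo cross-check.

```python
# Sketch (illustration only): P(X^3 + Y > 1) for independent Uniform(0,1) X, Y.
import numpy as np
from scipy.integrate import quad

# P(A | X = x) = P(Y > 1 - x^3) = x^3 for 0 <= x <= 1, and f_X(x) = 1 there.
prob, _ = quad(lambda x: x**3, 0, 1)

rng = np.random.default_rng(1)
x, y = rng.random(1_000_000), rng.random(1_000_000)
mc = np.mean(x**3 + y > 1)

print(prob, mc)  # both close to 0.25
```
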
Functions of Two Continuous Random Variables

  LOTUS for two continuous random variables:

      E[g(X, Y)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y) f_XY(x, y) dx dy.

- If Z = g(X, Y), we can write

      F_Z(z) = P(Z ≤ z) = P(g(X, Y) ≤ z) = ∬_D f_XY(x, y) dx dy,

  where D = {(x, y) | g(x, y) ≤ z}.

- To find the PDF of Z, we differentiate F_Z(z).

- Example: Let X and Y be two independent Uniform(0, 1) random variables, and Z = XY. Find the CDF and PDF of Z.

- Example: Let X and Y be two independent standard normal random variables. Let also Z = 2X − Y and W = −X + Y. Find f_ZW(z, w).

- Example: Let X and Y be two random variables with joint PDF f_XY(x, y). Let Z = X + Y. Find f_Z(z).

The Method of Transformations

  Theorem: Let X and Y be two jointly continuous random variables. Let (Z, W) = g(X, Y) = (g_1(X, Y), g_2(X, Y)), where g: R² → R² is a continuous one-to-one (invertible) function with continuous partial derivatives. Let h = g⁻¹, i.e., (X, Y) = h(Z, W) = (h_1(Z, W), h_2(Z, W)). Then Z and W are jointly continuous and their joint PDF, f_ZW(z, w), for (z, w) ∈ R_ZW is given by

      f_ZW(z, w) = f_XY(h_1(z, w), h_2(z, w)) |J|,

  where J is the Jacobian of h, defined by

      J = det [ ∂h_1/∂z   ∂h_1/∂w ]
              [ ∂h_2/∂z   ∂h_2/∂w ]
        = ∂h_1/∂z · ∂h_2/∂w − ∂h_2/∂z · ∂h_1/∂w.

- From the method of transformations we get:

  If X and Y are two jointly continuous random variables and Z = X + Y, then

      f_Z(z) = ∫_{−∞}^{∞} f_XY(w, z − w) dw = ∫_{−∞}^{∞} f_XY(z − w, w) dw.

  If X and Y are also independent, then

      f_Z(z) = f_X(z) ∗ f_Y(z)
             = ∫_{−∞}^{∞} f_X(w) f_Y(z − w) dw = ∫_{−∞}^{∞} f_Y(w) f_X(z − w) dw.

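As a small numerical illustration of the convolution formula (not from the slides), the sketch below approximates f_Z = f_X ∗ f_Y on a grid for two independent Uniform(0, 1) densities; the result should approximate the triangular density on (0, 2).

```python
# Sketch (illustration only): discrete approximation of f_Z = f_X * f_Y
# for Z = X + Y with X, Y independent Uniform(0, 1).
import numpy as np

dx = 0.001
grid = np.arange(0, 1, dx)
f_x = np.ones_like(grid)   # Uniform(0,1) density on its support
f_y = np.ones_like(grid)

# Discrete convolution; multiply by dx so the result approximates a density.
f_z = np.convolve(f_x, f_y) * dx
z = np.arange(len(f_z)) * dx

print(f_z.sum() * dx)                     # about 1: f_Z integrates to 1
print(f_z[np.argmin(np.abs(z - 1.0))])    # about 1: peak of the triangle at z = 1
```
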
The Method of Transformations

  Theorem: If X ∼ N(µ_X, σ_X²) and Y ∼ N(µ_Y, σ_Y²) are independent, then

      X + Y ∼ N(µ_X + µ_Y, σ_X² + σ_Y²).

Covariance and Correlation

- The covariance between X and Y gives information about how X and Y are statistically related.

  The covariance between X and Y is defined as

      Cov(X, Y) = E[(X − EX)(Y − EY)] = E[XY] − (EX)(EY).

- If the covariance is positive, X − EX and Y − EY tend to have the same sign: if X is less than EX, it is more likely that Y is also less than EY (and vice versa).

- If the covariance is negative, X − EX and Y − EY tend to have opposite signs: if X is less than EX, it is more likely that Y is greater than EY (and vice versa).

  The covariance has the following properties:
  1. Cov(X, X) = Var(X);
  2. if X and Y are independent, then Cov(X, Y) = 0;
  3. Cov(X, Y) = Cov(Y, X);
  4. Cov(aX, Y) = a Cov(X, Y);
  5. Cov(X + c, Y) = Cov(X, Y);
  6. Cov(X + Y, Z) = Cov(X, Z) + Cov(Y, Z);
  7. more generally,

      Cov(Σ_{i=1}^{m} a_i X_i, Σ_{j=1}^{n} b_j Y_j) = Σ_{i=1}^{m} Σ_{j=1}^{n} a_i b_j Cov(X_i, Y_j).

Covariance and Correlation

- Example: Suppose X ∼ Uniform(1, 2) and, given X = x, Y is exponential with parameter λ = x. Find Cov(X, Y).

- Example: Let X and Y be two independent N(0, 1) random variables and

      Z = 1 + X + XY²,
      W = 1 + X.

  Find Cov(Z, W).

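As a quick Monte Carlo cross-check of the second example (not from the slides), the sketch below estimates Cov(Z, W) by simulation.

```python
# Sketch (illustration only): Monte Carlo estimate of Cov(Z, W) for
# Z = 1 + X + X*Y^2 and W = 1 + X with independent standard normal X, Y.
import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_normal(2_000_000)
y = rng.standard_normal(2_000_000)

z = 1 + x + x * y**2
w = 1 + x

cov_zw = np.mean(z * w) - z.mean() * w.mean()
# Close to 2: Cov(Z, W) = Var(X) + Cov(XY^2, X) = 1 + E[X^2]E[Y^2] = 2 (since EX = 0).
print(cov_zw)
```
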
Variance of a Sum

- If Z = X + Y,

      Var(Z) = Cov(Z, Z) = Cov(X + Y, X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y).

- More generally, for a, b ∈ R, we conclude:

      Var(aX + bY) = a² Var(X) + b² Var(Y) + 2ab Cov(X, Y).

Correlation Coefficient

- The correlation coefficient, denoted by ρ_XY, is obtained by normalizing the covariance:

      ρ_XY = ρ(X, Y) = Cov(X, Y) / √(Var(X) Var(Y)) = Cov(X, Y) / (σ_X σ_Y).

Correlation Coefficient

  Properties of the correlation coefficient:
  1. −1 ≤ ρ(X, Y) ≤ 1;
  2. if ρ(X, Y) = 1, then Y = aX + b, where a > 0;
  3. if ρ(X, Y) = −1, then Y = aX + b, where a < 0;
  4. ρ(aX + b, cY + d) = ρ(X, Y) for a, c > 0.

- Consider two random variables X and Y:
  - If ρ(X, Y) = 0, X and Y are uncorrelated.
  - If ρ(X, Y) > 0, X and Y are positively correlated.
  - If ρ(X, Y) < 0, X and Y are negatively correlated.

  If X and Y are uncorrelated, then

      Var(X + Y) = Var(X) + Var(Y).

  More generally, if X_1, X_2, ..., X_n are pairwise uncorrelated, i.e., ρ(X_i, X_j) = 0 when i ≠ j, then

      Var(X_1 + X_2 + ... + X_n) = Var(X_1) + Var(X_2) + ... + Var(X_n).

- Example: Suppose we choose a point (X, Y) uniformly at random in the unit disc

      D = {(x, y) | x² + y² ≤ 1}.

  Are X and Y uncorrelated?

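As a simulation-based illustration of this question (not from the slides), the sketch below samples uniformly from the unit disc by rejection, estimates Cov(X, Y), and shows one symptom of the dependence: the conditional spread of X shrinks when |Y| is close to 1.

```python
# Sketch (illustration only): (X, Y) uniform on the unit disc is uncorrelated
# but not independent.
import numpy as np

rng = np.random.default_rng(3)
pts = rng.uniform(-1, 1, size=(4_000_000, 2))
pts = pts[(pts**2).sum(axis=1) <= 1]          # rejection sampling: keep points in D
x, y = pts[:, 0], pts[:, 1]

print(np.mean(x * y) - x.mean() * y.mean())   # close to 0: uncorrelated
# Dependence shows up in the conditional spread: given |Y| near 1, X is confined
# near 0, so its conditional variance is much smaller than its overall variance.
print(x.var(), x[np.abs(y) > 0.9].var())      # about 0.25 vs a much smaller value
```
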
Bivariate Normal Distribution

- The normal distribution is very important in probability theory and shows up in many different applications.

- We will discuss two or more normal random variables here.

  Definition: Two random variables X and Y are said to be bivariate normal, or jointly normal, if aX + bY has a normal distribution for all a, b ∈ R.

- From this definition, we can conclude the following facts:
  - If X and Y are bivariate normal, then by letting a = 1, b = 0, we conclude X must be normal.
  - If X and Y are bivariate normal, then by letting a = 0, b = 1, we conclude Y must be normal.
  - If X ∼ N(µ_X, σ_X²) and Y ∼ N(µ_Y, σ_Y²) are independent, then they are jointly normal.
  - If X ∼ N(µ_X, σ_X²) and Y ∼ N(µ_Y, σ_Y²) are jointly normal, then

        X + Y ∼ N(µ_X + µ_Y, σ_X² + σ_Y² + 2ρ(X, Y) σ_X σ_Y).

- We will introduce the standard bivariate normal distribution and then obtain the general joint normal PDF from it.

  Two random variables X and Y are said to have the standard bivariate normal distribution with correlation coefficient ρ if their joint PDF is given by

      f_XY(x, y) = 1 / (2π √(1 − ρ²)) · exp{ −(x² − 2ρxy + y²) / (2(1 − ρ²)) },

  where ρ ∈ (−1, 1). If ρ = 0, then we just say X and Y have the standard bivariate normal distribution.

Bivariate Normal Distribution

  Definition: Two random variables X and Y are said to have a bivariate normal distribution with parameters µ_X, σ_X², µ_Y, σ_Y², and ρ, if their joint PDF is given by

      f_XY(x, y) = 1 / (2π σ_X σ_Y √(1 − ρ²)) ·
                   exp{ −1 / (2(1 − ρ²)) [ ((x − µ_X)/σ_X)² + ((y − µ_Y)/σ_Y)²
                        − 2ρ (x − µ_X)(y − µ_Y) / (σ_X σ_Y) ] },

  where µ_X, µ_Y ∈ R, σ_X, σ_Y > 0, and ρ ∈ (−1, 1) are all constants.

  Theorem: Let X and Y be two bivariate normal random variables. Then there exist independent standard normal random variables Z_1 and Z_2 such that

      X = σ_X Z_1 + µ_X,
      Y = σ_Y (ρ Z_1 + √(1 − ρ²) Z_2) + µ_Y.

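As an illustration of the construction in the theorem (not from the slides), the sketch below generates bivariate normal samples from two independent standard normals and checks the sample means, standard deviations, and correlation; the parameter values are arbitrary choices.

```python
# Sketch (illustration only): generating bivariate normal samples from the
# construction X = sigma_x*Z1 + mu_x, Y = sigma_y*(rho*Z1 + sqrt(1-rho^2)*Z2) + mu_y.
import numpy as np

rng = np.random.default_rng(4)
mu_x, sigma_x, mu_y, sigma_y, rho = 1.0, 1.0, 0.0, 2.0, 0.5  # example parameters

z1 = rng.standard_normal(1_000_000)
z2 = rng.standard_normal(1_000_000)

x = sigma_x * z1 + mu_x
y = sigma_y * (rho * z1 + np.sqrt(1 - rho**2) * z2) + mu_y

print(x.mean(), y.mean())        # close to mu_x = 1 and mu_y = 0
print(x.std(), y.std())          # close to sigma_x = 1 and sigma_y = 2
print(np.corrcoef(x, y)[0, 1])   # close to rho = 0.5
```
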
Bivariate Normal Distribution

  Theorem: Suppose X and Y are jointly normal random variables with parameters µ_X, σ_X², µ_Y, σ_Y², and ρ. Then, given X = x, Y is normally distributed with

      E[Y | X = x] = µ_Y + ρ σ_Y (x − µ_X) / σ_X,
      Var(Y | X = x) = (1 − ρ²) σ_Y².

  Theorem: If X and Y are bivariate normal and uncorrelated, then they are independent.

- Example: Let X ∼ N(0, 1) and W ∼ Bernoulli(1/2) be independent random variables. Define the random variable Y as a function of X and W:

      Y = h(X, W) = X     if W = 0,
                    −X    if W = 1.

  Find the PDF of Y and of X + Y.

- Example: Let X and Y be jointly normal random variables with parameters µ_X, σ_X², µ_Y, σ_Y², and ρ. Find the conditional distribution of Y given X = x.

Bivariate Normal Distribution

- Example: Let X and Y be jointly normal random variables with parameters µ_X = 1, σ_X² = 1, µ_Y = 0, σ_Y² = 4, and ρ = 1/2.
  1. Find P(2X + Y ≤ 3).
  2. Find Cov(X + Y, 2X − Y).
  3. Find P(Y > 1 | X = 2).

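As a numerical companion to this last example (not from the slides), the sketch below evaluates the three quantities directly from the bivariate normal facts stated earlier: linear combinations of jointly normal variables are normal, covariance is bilinear, and Y | X = x is normal with the mean and variance given in the theorem above.

```python
# Sketch (illustration only): the three quantities in the final example.
from math import sqrt
from scipy.stats import norm

mu_x, var_x, mu_y, var_y, rho = 1.0, 1.0, 0.0, 4.0, 0.5
sigma_x, sigma_y = sqrt(var_x), sqrt(var_y)
cov_xy = rho * sigma_x * sigma_y            # Cov(X, Y) = 1

# 1. 2X + Y is normal with mean 2*mu_x + mu_y and
#    variance 4*var_x + var_y + 2*2*Cov(X, Y).
m = 2 * mu_x + mu_y
v = 4 * var_x + var_y + 4 * cov_xy
print(norm.cdf(3, loc=m, scale=sqrt(v)))    # P(2X + Y <= 3), about 0.614

# 2. Bilinearity: Cov(X + Y, 2X - Y) = 2 Var(X) - Cov(X, Y) + 2 Cov(X, Y) - Var(Y).
print(2 * var_x + cov_xy - var_y)           # -1

# 3. Given X = 2: Y ~ N(mu_y + rho*sigma_y*(2 - mu_x)/sigma_x, (1 - rho^2)*var_y).
m_c = mu_y + rho * sigma_y * (2 - mu_x) / sigma_x
v_c = (1 - rho**2) * var_y
print(1 - norm.cdf(1, loc=m_c, scale=sqrt(v_c)))  # P(Y > 1 | X = 2) = 0.5
```
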
