
Lecture 17: Joint Distributions

Statistics 104

Colin Rundel

March 26, 2012


Section 5.1 Joint Distributions of Discrete RVs

Joint Distribution - Example

Draw two socks at random, without replacement, from a drawer full of twelve colored socks:

6 black, 4 white, 2 purple

Let B be the number of black socks and W the number of white socks drawn, then the distributions of B and W are given by:

            k = 0                      k = 1                       k = 2
P(B = k)    (6/12)(5/11) = 15/66      2 (6/12)(6/11) = 36/66      (6/12)(5/11) = 15/66
P(W = k)    (8/12)(7/11) = 28/66      2 (4/12)(8/11) = 32/66      (4/12)(3/11) = 6/66

Note - B ∼ HyperGeo(12, 6, 2): P(B = k) = C(6, k) C(6, 2 − k) / C(12, 2)
       W ∼ HyperGeo(12, 4, 2): P(W = k) = C(4, k) C(8, 2 − k) / C(12, 2)

Statistics 104 (Colin Rundel)    Lecture 17    March 26, 2012    1 / 32
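The slides state the hypergeometric forms without checking them against the table. As a quick sketch (not from the slides; the helper name `hyper_pmf` is mine), the table entries match the hypergeometric pmf exactly when computed with rational arithmetic:

```python
from fractions import Fraction
from math import comb

def hyper_pmf(N, K, n, k):
    # C(K, k) * C(N - K, n - k) / C(N, n): k successes in n draws
    # without replacement from N items of which K are successes
    return Fraction(comb(K, k) * comb(N - K, n - k), comb(N, n))

# Black: 6 of 12 socks, 2 draws; White: 4 of 12 socks, 2 draws
pB = [hyper_pmf(12, 6, 2, k) for k in range(3)]
pW = [hyper_pmf(12, 4, 2, k) for k in range(3)]
```

Comparing `pB` and `pW` against the sequential-draw products above confirms both routes give 15/66, 36/66, 15/66 and 28/66, 32/66, 6/66.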


Joint Distribution - Example, cont.

Let B be the number of black socks and W the number of white socks drawn, then the joint distribution of B and W is given by:

             W = 0    W = 1    W = 2
    B = 0     1/66     8/66     6/66   | 15/66
    B = 1    12/66    24/66     0      | 36/66
    B = 2    15/66     0        0      | 15/66
             28/66    32/66     6/66   | 66/66

P(B = b, W = w) = C(6, b) C(4, w) C(2, 2 − b − w) / C(12, 2), for 0 ≤ b, w ≤ 2 and b + w ≤ 2


Marginal Distributions

Note that the row and column sums are the distributions of B and W respectively:

P(B = b) = P(B = b, W = 0) + P(B = b, W = 1) + P(B = b, W = 2)

P(W = w) = P(B = 0, W = w) + P(B = 1, W = w) + P(B = 2, W = w)

These are the marginal distributions of B and W. In general,

P(X = x) = Σ_y P(X = x, Y = y) = Σ_y P(X = x | Y = y) P(Y = y)
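The table and its margins can be rebuilt mechanically from the counting formula. A short sketch (not part of the slides) that reconstructs the joint pmf and recovers both marginals as row and column sums:

```python
from fractions import Fraction
from math import comb

# P(B = b, W = w) = C(6,b) C(4,w) C(2, 2-b-w) / C(12,2), zero when b + w > 2
joint = {(b, w): Fraction(comb(6, b) * comb(4, w) * comb(2, 2 - b - w), comb(12, 2))
                 if b + w <= 2 else Fraction(0)
         for b in range(3) for w in range(3)}

pB = [sum(joint[b, w] for w in range(3)) for b in range(3)]  # row sums
pW = [sum(joint[b, w] for b in range(3)) for w in range(3)]  # column sums
```

The entries, the marginals, and the grand total 66/66 all agree with the table above.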
Conditional Distribution

Conditional distributions are defined as we have seen previously:

P(X = x | Y = y) = P(X = x, Y = y) / P(Y = y) = joint pmf / marginal pmf

Therefore the pmf for white socks given no black socks were drawn is

P(W = w | B = 0) = P(W = w, B = 0) / P(B = 0)
                 = (1/66) / (15/66) = 1/15   if W = 0
                 = (8/66) / (15/66) = 8/15   if W = 1
                 = (6/66) / (15/66) = 6/15   if W = 2


Section 5.1 Joint Distributions of Continuous RVs

Joint CDF

F(x, y) = P[X ≤ x, Y ≤ y]
        = P[(X, Y) lies south-west of the point (x, y)]

[Figure: the region of the plane south-west of a point (x, y).]
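Before moving to the continuous case, the discrete conditional above is easy to verify: divide the B = 0 row of the joint table by P(B = 0). A sketch (helper name `joint_pmf` is mine, not from the slides):

```python
from fractions import Fraction
from math import comb

def joint_pmf(b, w):
    # the sock-drawing joint pmf from the previous slides
    if b + w > 2:
        return Fraction(0)
    return Fraction(comb(6, b) * comb(4, w) * comb(2, 2 - b - w), comb(12, 2))

pB0 = sum(joint_pmf(0, w) for w in range(3))        # P(B = 0) = 15/66
cond = [joint_pmf(0, w) / pB0 for w in range(3)]    # P(W = w | B = 0)
```

The conditional pmf comes out to 1/15, 8/15, 6/15 and, like any pmf, sums to 1.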


Joint CDF, cont.

The joint cumulative distribution function follows the same rules as the univariate CDF.

Univariate definition:

F(x) = P(X ≤ x) = ∫_{−∞}^{x} f(z) dz

lim_{x→−∞} F(x) = 0        lim_{x→∞} F(x) = 1        x ≤ y ⇒ F(x) ≤ F(y)

Bivariate definition:

F(x, y) = P(X ≤ x, Y ≤ y) = ∫_{−∞}^{y} ∫_{−∞}^{x} f(x, y) dx dy

lim_{x,y→−∞} F(x, y) = 0        lim_{x,y→∞} F(x, y) = 1
x ≤ x′, y ≤ y′ ⇒ F(x, y) ≤ F(x′, y′)


Marginal Distributions

We can define marginal distributions based on the CDF by setting one of the values to infinity:

F(x, ∞) = P(X ≤ x, Y ≤ ∞) = ∫_{−∞}^{x} ∫_{−∞}^{∞} f(x, y) dy dx = P(X ≤ x) = F_X(x)

F(∞, y) = P(X ≤ ∞, Y ≤ y) = ∫_{−∞}^{y} ∫_{−∞}^{∞} f(x, y) dx dy = P(Y ≤ y) = F_Y(y)

Joint pdf

Similar to the CDF, the probability density function follows the same general rules, except in two dimensions.

Univariate definition:

f(x) ≥ 0 for all x        f(x) = d/dx F(x)        ∫_{−∞}^{∞} f(x) dx = 1

Bivariate definition:

f(x, y) ≥ 0 for all (x, y)        f(x, y) = ∂²/∂x∂y F(x, y)        ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1


Marginal pdfs

Marginal probability density functions are defined in terms of "integrating out" one of the random variables:

f_X(x) = ∫_{−∞}^{∞} f(x, y) dy

f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx

Previously we saw that independence of X and Y implies E(XY) = E(X)E(Y). In the joint setting there is a sharper characterization: X and Y are independent if and only if the joint density factors into the marginals,

f(x, y) = f_X(x) f_Y(y).


Probability and Expectation

Univariate definition:

P(X ∈ A) = ∫_A f(x) dx

E[g(X)] = ∫_{−∞}^{∞} g(x) · f(x) dx

Bivariate definition:

P(X ∈ A, Y ∈ B) = ∫_B ∫_A f(x, y) dx dy

E[g(X, Y)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y) · f(x, y) dx dy


Example 1 - Joint Uniforms

Let X, Y ∼ Unif(0, 1) be independent; it is straightforward to see graphically that

F(x, y) =  0    if x ∈ (−∞, 0) or y ∈ (−∞, 0)
           xy   if x ∈ (0, 1), y ∈ (0, 1)
           x    if x ∈ (0, 1), y ∈ (1, ∞)
           y    if x ∈ (1, ∞), y ∈ (0, 1)
           1    if x ∈ (1, ∞), y ∈ (1, ∞)
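The graphical argument can be backed up empirically: the joint CDF at (x, y) is just the fraction of points landing south-west of (x, y). A Monte Carlo sketch (not from the slides):

```python
import random

random.seed(1)
n = 200_000
# independent Unif(0,1) pairs
pts = [(random.random(), random.random()) for _ in range(n)]

def F_hat(x, y):
    # empirical fraction of sampled points south-west of (x, y)
    return sum(1 for u, v in pts if u <= x and v <= y) / n
```

`F_hat(0.5, 0.5)` lands near 0.25 = xy, and `F_hat(0.7, 2.0)` near 0.7 = x, matching the piecewise branches.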

Example 1, cont.

Based on the CDF we can calculate the pdf using the 2nd partial derivative with respect to x and y:

f(x, y) = ∂²/∂x∂y F(x, y)

        =  0   if x ∈ (−∞, 0) or y ∈ (−∞, 0)
           1   if x ∈ (0, 1), y ∈ (0, 1)
           0   if x ∈ (0, 1), y ∈ (1, ∞)
           0   if x ∈ (1, ∞), y ∈ (0, 1)
           0   if x ∈ (1, ∞), y ∈ (1, ∞)

        =  1   if x ∈ (0, 1), y ∈ (0, 1)
           0   otherwise


Example 1, cont.

Based on the pdf we can calculate the marginal densities:

f(x, y) = 1 if x ∈ (0, 1), y ∈ (0, 1); 0 otherwise

f_X(x) = ∫_{−∞}^{∞} f(x, y) dy
       = ∫_0^1 1 dy   if x ∈ (0, 1); ∫ 0 dy otherwise
       = 1 if x ∈ (0, 1); 0 otherwise

f_Y(y) = 1 if y ∈ (0, 1); 0 otherwise

Which should not be surprising...


Example 1, cont.

Expectation is also straightforward:

E(X) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x f(x, y) dx dy
     = ∫_0^1 ∫_0^1 x dx dy = ∫_0^1 (x²/2)|_0^1 dy
     = ∫_0^1 1/2 dy = (y/2)|_0^1 = 1/2

E(Y) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} y f(x, y) dx dy
     = ∫_0^1 ∫_0^1 y dx dy = ∫_0^1 (xy)|_0^1 dy
     = ∫_0^1 y dy = (y²/2)|_0^1 = 1/2


Example 1, cont.

E(XY) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} xy f(x, y) dx dy
      = ∫_0^1 ∫_0^1 xy dx dy = ∫_0^1 (x²y/2)|_{x=0}^{1} dy
      = ∫_0^1 y/2 dy = (y²/4)|_0^1 = 1/4

Note that E(XY) = E(X)E(Y); what does this tell us about X and Y?
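These expectations are easy to sanity-check by simulation (a sketch, not part of the slides):

```python
import random

random.seed(2)
n = 200_000
xs = [random.random() for _ in range(n)]  # X ~ Unif(0,1)
ys = [random.random() for _ in range(n)]  # Y ~ Unif(0,1), independent of X

EX = sum(xs) / n
EY = sum(ys) / n
EXY = sum(x * y for x, y in zip(xs, ys)) / n
```

`EX` and `EY` come out near 1/2 and `EXY` near 1/4, so E(XY) ≈ E(X)E(Y), consistent with X and Y being independent.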

Example 1, another way

If we did not feel comfortable coming up with the graphical arguments for F(x, y), we can also use the fact that the pdf is constant on (0, 1) × (0, 1) to derive the same distribution / density.

f(x, y) = c

1 = ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy
  = ∫_0^1 ∫_0^1 c dx dy
  = ∫_0^1 (cx)|_0^1 dy = ∫_0^1 c dy
  = (cy)|_0^1 = c


Example 2

Let X and Y be drawn uniformly from the triangle below (vertices (0, 0), (3, 0), and (0, 3)).

[Figure: the triangle bounded by the x-axis, the y-axis, and the line x + y = 3.]

Find the joint pdf, cdf, and marginals.



Example 2, cont.

Since the joint density is constant, then

f(x, y) = c = 2/9, for x, y ≥ 0 and x + y ≤ 3

based on the area of the triangle (9/2), but we need to be careful to define on what range. We can define the range in two ways since X and Y depend on each other, so we can define the range of X in terms of Y or Y in terms of X:

f(x, y) = 2/9 if y ∈ (0, 3), x ∈ (0, 3 − y); 0 otherwise

        = 2/9 if x ∈ (0, 3), y ∈ (0, 3 − x); 0 otherwise


Example 2, cont.

Depending on which range definition you choose, it makes life easier when evaluating the marginal densities.

f_X(x) = ∫_{−∞}^{∞} f(x, y) dy = ∫_0^{3−x} 2/9 dy = (2/9)(3 − x), for x ∈ (0, 3)

f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx = ∫_0^{3−y} 2/9 dx = (2/9)(3 − y), for y ∈ (0, 3)
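A marginal density must integrate to 1 over its range; a quick midpoint-rule sketch (not from the slides) confirms this for f_X:

```python
def fX(x):
    # marginal density of X for the uniform-on-triangle example
    return (2 / 9) * (3 - x)

# composite midpoint rule on (0, 3)
m = 10_000
h = 3 / m
area = sum(fX((i + 0.5) * h) for i in range(m)) * h
```

`area` comes out to 1 (to floating-point accuracy); the same check works for f_Y by symmetry.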

Example 2, cont.

Finding the CDF with calculus is hard in this case; it is still a pain with graphical approaches, but easier...

F(x, y) = ∫_{−∞}^{y} ∫_{−∞}^{x} f(x, y) dx dy

        =  0                                             if x ∈ (−∞, 0) or y ∈ (−∞, 0)
           (2/9) xy                                      if x, y ∈ (0, 3), x + y ∈ (0, 3)
           (2/9) [xy − (y − (3 − x))(x − (3 − y))/2]     if x, y ∈ (0, 3), x + y ∈ (3, 6)
           (2/9) (3x − x²/2)                             if x ∈ (0, 3), y ∈ (3, ∞)
           (2/9) (3y − y²/2)                             if x ∈ (3, ∞), y ∈ (0, 3)
           1                                             if x ∈ (3, ∞), y ∈ (3, ∞)


Example 3

Let f(x, y) = c x² y for x² ≤ y ≤ 1.

Find:

a) c

b) P[X ≥ Y]

c) f_X(x) and f_Y(y)
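Before tackling Example 3, the trickiest branch of the Example 2 CDF (the x + y ∈ (3, 6) case) can be spot-checked by Monte Carlo, sampling the triangle by rejection from the enclosing square (a sketch, not from the slides):

```python
import random

random.seed(3)
pts = []
while len(pts) < 100_000:
    x, y = 3 * random.random(), 3 * random.random()
    if x + y <= 3:          # rejection sample the triangle
        pts.append((x, y))

def F_upper(x, y):
    # the x + y in (3, 6) branch of the CDF above
    return (2 / 9) * (x * y - (y - (3 - x)) * (x - (3 - y)) / 2)

# empirical P(X <= 2, Y <= 2) vs the formula
emp = sum(1 for u, v in pts if u <= 2 and v <= 2) / len(pts)
```

`emp` lands near F(2, 2) = 7/9, and the branch correctly gives F(3, 3) = 1.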


Example 3 - Range

Let f(x, y) = c x² y for x² ≤ y ≤ 1; we can rewrite the bounds as

0 ≤ y ≤ 1,    −√y ≤ x ≤ √y

[Figure: the region between the parabola y = x² and the line y = 1, for x ∈ (−1, 1).]


Example 3.a

1 = ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy
  = ∫_0^1 ∫_{−√y}^{√y} c x² y dx dy
  = ∫_0^1 cy (x³/3)|_{x=−√y}^{√y} dy
  = ∫_0^1 (c y^{5/2}/3 + c y^{5/2}/3) dy
  = (4c/21) y^{7/2}|_{y=0}^{1}
  = 4c/21

c = 21/4
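The normalizing constant can be double-checked numerically: integrating x²y over the region x² ≤ y ≤ 1 should give 4/21. A midpoint-rule sketch (not from the slides):

```python
# double integral of x^2 * y over x^2 <= y <= 1, by nested midpoint rules
m = 400
total = 0.0
hx = 2 / m                      # x grid on (-1, 1)
for i in range(m):
    x = -1 + (i + 0.5) * hx
    lo = x * x                  # inner limits: y from x^2 to 1
    hy = (1 - lo) / m
    total += sum(x * x * (lo + (j + 0.5) * hy) for j in range(m)) * hy * hx
```

`total` comes out near 4/21 ≈ 0.1905, so c = 21/4 makes the density integrate to 1.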

Example 3 - pdf

[Figure: surface plot of f(x, y) = (21/4) x² y over the region x² ≤ y ≤ 1.]


Example 3.b

We need to integrate over the region where x² ≤ y ≤ 1 and x ≥ y, which is indicated in red below.

[Figure: the region where x² ≤ y ≤ 1 and y ≤ x, shaded in red.]
Statistics 104 (Colin Rundel) Lecture 17 March 26, 2012 24 / 32 Statistics 104 (Colin Rundel) Lecture 17 March 26, 2012 25 / 32


Example 3.b, cont.

Since y ≥ x² ≥ 0 on the support, X ≥ Y is only possible for x ∈ (0, 1), with x² ≤ y ≤ x:

P(X ≥ Y) = ∫_0^1 ∫_{x²}^{x} (21/4) x² y dy dx
         = (21/4) ∫_0^1 (x² y²/2)|_{y=x²}^{x} dx
         = (21/8) ∫_0^1 (x⁴ − x⁶) dx
         = (21/8) (x⁵/5 − x⁷/7)|_0^1
         = (21/8) (1/5 − 1/7)
         = (21/8) (2/35) = 3/20 = 0.15


Example 3.c

f_X(x) = ∫_{x²}^{1} (21/4) x² y dy
       = (21/4) (x² y²/2)|_{y=x²}^{1}
       = (21/8) (x² − x⁶), for x ∈ (−1, 1)

f_Y(y) = ∫_{−√y}^{√y} (21/4) x² y dx
       = (21/4) (x³ y/3)|_{x=−√y}^{√y}
       = (21/4) (2 y^{5/2}/3)
       = (7/2) y^{5/2}, for y ∈ (0, 1)
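The P(X ≥ Y) computation above can be replayed numerically over the region x ∈ (0, 1), x² ≤ y ≤ x (a midpoint-rule sketch, not from the slides):

```python
# P(X >= Y): integrate (21/4) x^2 y over x in (0, 1), x^2 <= y <= x
m = 400
p = 0.0
hx = 1 / m
for i in range(m):
    x = (i + 0.5) * hx
    lo, hi = x * x, x           # inner limits: y from x^2 to x
    hy = (hi - lo) / m
    p += sum((21 / 4) * x * x * (lo + (j + 0.5) * hy) for j in range(m)) * hy * hx
```

`p` comes out near 3/20 = 0.15, matching the closed-form answer.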

Example 3.c, cont.

It is always a good idea to check that the marginals are proper densities:

∫_{−1}^{1} f_X(x) dx = ∫_{−1}^{1} (21/8)(x² − x⁶) dx
                     = (21/8) (x³/3 − x⁷/7)|_{−1}^{1}
                     = (21/4) (1/3 − 1/7) = 1

∫_0^1 f_Y(y) dy = ∫_0^1 (7/2) y^{5/2} dy = (7/2)(2/7) y^{7/2}|_0^1 = 1


Example 4

Let Y be the rate of calls at a help desk and X the number of calls between 2 pm and 4 pm one day. Let's say that

f(x, y) = ((2y)^x / x!) e^{−3y}

for y > 0, x = 0, 1, 2, . . .

Find:

a) P(X = 0)

b) P(Y > 2)

c) P[X = x] for all x


Example 4.a

f(x, y) = ((2y)^x / x!) e^{−3y}, for y > 0, x = 0, 1, 2, . . .

P(X = 0) = ∫_0^∞ f(0, y) dy
         = ∫_0^∞ e^{−3y} dy
         = (−(1/3) e^{−3y})|_0^∞
         = 1/3


Example 4.b

f(x, y) = ((2y)^x / x!) e^{−3y}, for y > 0, x = 0, 1, 2, . . .

P(Y > 2) = ∫_2^∞ Σ_{x=0}^{∞} f(x, y) dy
         = ∫_2^∞ Σ_{x=0}^{∞} ((2y)^x / x!) e^{−3y} dy
         = ∫_2^∞ e^{−3y} (1 + 2y/1! + (2y)²/2! + ···) dy
         = ∫_2^∞ e^{−3y} e^{2y} dy = ∫_2^∞ e^{−y} dy
         = (−e^{−y})|_2^∞ = e^{−2} ≈ 0.1353

Example 4.c

f(x, y) = ((2y)^x / x!) e^{−3y}, for y > 0, x = 0, 1, 2, . . .

P(X = x) = ∫_0^∞ f(x, y) dy
         = ∫_0^∞ ((2y)^x / x!) e^{−3y} dy
         = (2^x / x!) ∫_0^∞ y^x e^{−3y} dy
         = (2^x / x!) (x! / 3^{x+1})
         = 2^x / 3^{x+1}

where we use the gamma integral ∫_0^∞ y^x e^{−3y} dy = x!/3^{x+1}.
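One way to read this joint density, an observation not stated on the slides: it factors as [e^{−2y} (2y)^x / x!] · e^{−y}, i.e. X | Y = y ∼ Poisson(2y) with Y ∼ Exp(1), and P(X = x) = 2^x/3^{x+1} = (1/3)(2/3)^x is a Geometric(1/3) pmf on {0, 1, 2, ...}. A simulation sketch checking all three answers (the helper `sample_poisson` is mine):

```python
import math
import random

random.seed(4)

def sample_poisson(lam):
    # inverse-CDF Poisson draw; fine for modest lam
    u, k = random.random(), 0
    p = math.exp(-lam)
    c = p
    while u > c and k < 1000:   # cap as a float-safety guard
        k += 1
        p *= lam / k
        c += p
    return k

n = 100_000
draws = []
for _ in range(n):
    y = random.expovariate(1.0)               # Y ~ Exp(1)
    draws.append((sample_poisson(2 * y), y))  # X | Y = y ~ Poisson(2y)

p_x0 = sum(1 for x, _ in draws if x == 0) / n   # ~ 1/3
p_x1 = sum(1 for x, _ in draws if x == 1) / n   # ~ 2/9
p_y2 = sum(1 for _, y in draws if y > 2) / n    # ~ e^{-2}
```

The empirical frequencies land on P(X = 0) = 1/3, P(X = 1) = 2/9, and P(Y > 2) = e^{−2}, matching parts a, b, and c.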
