Lec 17
Marginal distributions of B and W when drawing 2 socks from 12 (6 black, 4 white, 2 of another color):

k           0                          1                          2
P(B = k)    (6·5)/(12·11) = 15/66      2·(6·6)/(12·11) = 36/66    (6·5)/(12·11) = 15/66
P(W = k)    (8·7)/(12·11) = 28/66      2·(4·8)/(12·11) = 32/66    (4·3)/(12·11) = 6/66

Note - B \sim \text{HyperGeo}(12, 6, 2), so P(B = k) = \binom{6}{k}\binom{6}{2-k} \big/ \binom{12}{2}, and W \sim \text{HyperGeo}(12, 4, 2), so P(W = k) = \binom{4}{k}\binom{8}{2-k} \big/ \binom{12}{2}.
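As a quick sanity check, these marginal pmfs can be reproduced directly from binomial coefficients; the `hyper_pmf` helper below is our own sketch, not part of the lecture.

```python
from math import comb

def hyper_pmf(N, K, n, k):
    """P(X = k): k successes in n draws without replacement
    from a population of N items containing K successes."""
    return comb(K, k) * comb(N - K, n - k) / comb(N, n)

# B ~ HyperGeo(12, 6, 2): black socks among 2 draws (6 black, 6 other)
pB = [hyper_pmf(12, 6, 2, k) for k in range(3)]
# W ~ HyperGeo(12, 4, 2): white socks among 2 draws (4 white, 8 other)
pW = [hyper_pmf(12, 4, 2, k) for k in range(3)]
```

Both lists match the table above (15/66, 36/66, 15/66 and 28/66, 32/66, 6/66).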
Section 5.1 Joint Distributions of Discrete RVs

Let B be the number of black socks and W the number of white socks drawn; then the joint distribution of B and W is given by:

P(B = b, W = w) = \frac{\binom{6}{b} \binom{4}{w} \binom{2}{2 - b - w}}{\binom{12}{2}}, \quad \text{for } 0 \le b, w \le 2 \text{ and } b + w \le 2

Note that the column and row sums of the joint distribution table are the distributions of B and W respectively.

Statistics 104 (Colin Rundel), Lecture 17, March 26, 2012
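The joint pmf and the fact that its row and column sums recover the marginals can be verified numerically; `joint_pmf` is an illustrative helper of ours.

```python
from math import comb

def joint_pmf(b, w):
    """P(B = b, W = w): 2 socks drawn from 6 black, 4 white, 2 other."""
    if b < 0 or w < 0 or b + w > 2:
        return 0.0
    return comb(6, b) * comb(4, w) * comb(2, 2 - b - w) / comb(12, 2)

# Row sums give the marginal of B, column sums the marginal of W.
pB = [sum(joint_pmf(b, w) for w in range(3)) for b in range(3)]
pW = [sum(joint_pmf(b, w) for b in range(3)) for w in range(3)]
total = sum(joint_pmf(b, w) for b in range(3) for w in range(3))
```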
P(W = w \mid B = 0) = \frac{P(W = w, B = 0)}{P(B = 0)} = \begin{cases}
  \frac{1/66}{15/66} = \frac{1}{15} & \text{if } w = 0 \\
  \frac{8/66}{15/66} = \frac{8}{15} & \text{if } w = 1 \\
  \frac{6/66}{15/66} = \frac{6}{15} & \text{if } w = 2
\end{cases}

Section 5.1 Joint Distributions of Continuous RVs

[Figure: a point (x, y) marked in the plane, illustrating the event \{X \le x, Y \le y\}.]
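The same conditional probabilities fall out of the joint pmf by dividing by P(B = 0); a short sketch using our own helper names:

```python
from math import comb

def joint_pmf(b, w):
    """P(B = b, W = w): 2 socks drawn from 6 black, 4 white, 2 other."""
    if b < 0 or w < 0 or b + w > 2:
        return 0.0
    return comb(6, b) * comb(4, w) * comb(2, 2 - b - w) / comb(12, 2)

p_B0 = sum(joint_pmf(0, w) for w in range(3))      # P(B = 0) = 15/66
cond = [joint_pmf(0, w) / p_B0 for w in range(3)]  # P(W = w | B = 0)
```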
The joint cumulative distribution function follows the same rules as the univariate CDF.

Univariate definition:

F(x) = P(X \le x) = \int_{-\infty}^{x} f(z)\, dz

Bivariate definition:

F(x, y) = P(X \le x, Y \le y) = \int_{-\infty}^{y} \int_{-\infty}^{x} f(x, y)\, dx\, dy

We can define marginal distributions based on the CDF by setting one of the values to infinity:

F(x, \infty) = P(X \le x, Y \le \infty) = \int_{-\infty}^{x} \int_{-\infty}^{\infty} f(x, y)\, dy\, dx = P(X \le x) = F_X(x)

F(\infty, y) = P(X \le \infty, Y \le y) = \int_{-\infty}^{y} \int_{-\infty}^{\infty} f(x, y)\, dx\, dy = P(Y \le y) = F_Y(y)
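As a sketch of the bivariate CDF definition, F(x, y) can be approximated by a midpoint Riemann sum; the example density here (uniform on the unit square, a toy assumption of ours) makes the answers easy to verify by hand, since F(x, y) = xy on the square.

```python
def joint_cdf(x, y, f, n=400):
    """F(x, y) ~ midpoint Riemann sum of f over (0, x] x (0, y],
    assuming the support of f lies inside the unit square."""
    x, y = min(x, 1.0), min(y, 1.0)
    if x <= 0 or y <= 0:
        return 0.0
    hx, hy = x / n, y / n
    return sum(f((i + 0.5) * hx, (j + 0.5) * hy)
               for i in range(n) for j in range(n)) * hx * hy

uniform = lambda x, y: 1.0               # f = 1 on (0, 1) x (0, 1)

F_mid = joint_cdf(0.5, 0.5, uniform)     # F(0.5, 0.5) = 0.25
F_marg = joint_cdf(0.5, 10.0, uniform)   # F(0.5, "infinity") = F_X(0.5) = 0.5
```

Passing a very large y value illustrates the marginal-CDF identity F(x, ∞) = F_X(x).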
Similar to the CDF, the probability density function follows the same general rules, except in two dimensions.

Univariate definition:

f(x) \ge 0 \text{ for all } x, \qquad f(x) = \frac{d}{dx} F(x), \qquad \int_{-\infty}^{\infty} f(x)\, dx = 1

Bivariate definition:

f(x, y) \ge 0 \text{ for all } (x, y), \qquad f(x, y) = \frac{\partial^2}{\partial x\, \partial y} F(x, y), \qquad \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f(x, y)\, dx\, dy = 1

Marginal probability density functions are defined in terms of "integrating out" one of the random variables:

f_X(x) = \int_{-\infty}^{\infty} f(x, y)\, dy

f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\, dx

Previously we saw that independence gives E(XY) = E(X)E(Y). The density condition characterizes independence fully: f(x, y) = f_X(x) f_Y(y) for all (x, y) if and only if X and Y are independent.
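To illustrate both "integrating out" and the factorization test, here is a toy density of ours (not from the lecture): f(x, y) = x + y on (0, 1) × (0, 1). Integrating out y gives f_X(x) = x + 1/2, and by symmetry f_Y(y) = y + 1/2.

```python
# Toy example: f(x, y) = x + y on the unit square.
def f(x, y):  return x + y
def f_X(x):   return x + 0.5     # = integral of f(x, y) dy over (0, 1)
def f_Y(y):   return y + 0.5

# Numerically "integrate out" y at x = 0.3 with a midpoint sum.
n = 10_000
fX_approx = sum(f(0.3, (j + 0.5) / n) for j in range(n)) / n

# f(0.3, 0.7) = 1.0 but f_X(0.3) * f_Y(0.7) = 0.8 * 1.2 = 0.96, so the
# density does not factor: X and Y are NOT independent here.
gap = abs(f(0.3, 0.7) - f_X(0.3) * f_Y(0.7))
```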
The expected value of a function of two jointly distributed random variables is defined analogously to the univariate case:

E[g(X, Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y) \cdot f(x, y)\, dx\, dy
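A minimal Monte Carlo sketch of this expectation formula, under the assumption (ours, for illustration) that X and Y are independent Uniform(0, 1), in which case the double integral gives E[XY] = 1/4:

```python
import random

# Estimate E[XY] for X, Y independent Uniform(0, 1); exact value is 1/4.
random.seed(1)                 # fixed seed so the estimate is reproducible
n = 200_000
est = sum(random.random() * random.random() for _ in range(n)) / n
```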
If we did not feel comfortable coming up with the graphical arguments for F(x, y), we can also use the fact that the pdf is constant on (0, 1) × (0, 1) to derive the same distribution / density.

f(x, y) = c

1 = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f(x, y)\, dx\, dy = \int_0^1 \int_0^1 c\, dx\, dy = \int_0^1 cx \Big|_0^1\, dy = \int_0^1 c\, dy = cy \Big|_0^1 = c

so c = 1.

Let X and Y be drawn uniformly from the triangle below.

[Figure: the triangle with vertices (0, 0), (3, 0), and (0, 3), drawn on axes running from 0 to 4.]
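The same normalization argument applies on the triangle: a constant density must equal the reciprocal of the support's area. We can sketch this by estimating the triangle's area with Monte Carlo (our code, not the lecture's):

```python
import random

# Estimate the area of {x, y > 0, x + y < 3} by sampling its
# bounding box (0, 3) x (0, 3); the exact area is 9/2, so c = 2/9.
random.seed(7)
n = 200_000
hits = sum(1 for _ in range(n)
           if random.uniform(0, 3) + random.uniform(0, 3) < 3)
area = 9 * hits / n      # box area times fraction of hits
c = 1 / area
```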
Since the joint density is constant,

f(x, y) = c = \frac{2}{9}, \quad \text{for } x, y \ge 0 \text{ and } x + y \le 3

based on the area of the triangle (9/2), but we need to be careful about the range on which we define it. We can describe the range in two ways, since X and Y depend on each other: the range of X in terms of Y, or the range of Y in terms of X.

f(x, y) = \begin{cases} 2/9 & \text{if } y \in (0, 3),\ x \in (0, 3 - y) \\ 0 & \text{otherwise} \end{cases}
        = \begin{cases} 2/9 & \text{if } x \in (0, 3),\ y \in (0, 3 - x) \\ 0 & \text{otherwise} \end{cases}

Depending on which range definition you choose, one or the other marginal density is easier to evaluate:

f_X(x) = \int_{-\infty}^{\infty} f(x, y)\, dy = \int_0^{3 - x} \frac{2}{9}\, dy = \frac{2}{9}(3 - x), \quad \text{for } x \in (0, 3)

f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\, dx = \int_0^{3 - y} \frac{2}{9}\, dx = \frac{2}{9}(3 - y), \quad \text{for } y \in (0, 3)
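A numeric spot check of the marginal: integrating the constant joint density over y at a fixed x should reproduce (2/9)(3 − x). The helper names below are ours.

```python
# f(x, y) = 2/9 on the triangle {x, y > 0, x + y < 3}, else 0.
def f(x, y):
    return 2 / 9 if 0 < x and 0 < y and x + y < 3 else 0.0

def fX_numeric(x, n=30_000):
    h = 3.0 / n          # integrate y over (0, 3); f vanishes past 3 - x
    return sum(f(x, (j + 0.5) * h) for j in range(n)) * h

x0 = 1.2
fX_exact = (2 / 9) * (3 - x0)    # = 0.4
```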
Finding the CDF with calculus is hard in this case; it is still a pain with graphical approaches, but easier...

F(x, y) = \int_{-\infty}^{y} \int_{-\infty}^{x} f(x, y)\, dx\, dy

= \begin{cases}
    0 & \text{if } x \in (-\infty, 0) \text{ or } y \in (-\infty, 0) \\
    \frac{2}{9} xy & \text{if } x \in (0, 3),\ y \in (0, 3),\ x + y \in (0, 3) \\
    \frac{2}{9} \left[ xy - \frac{(y - (3 - x))(x - (3 - y))}{2} \right] & \text{if } x \in (0, 3),\ y \in (0, 3),\ x + y \in (3, 6) \\
    \frac{2}{9} \left( 3x - \frac{x^2}{2} \right) & \text{if } x \in (0, 3),\ y \in (3, \infty) \\
    \frac{2}{9} \left( 3y - \frac{y^2}{2} \right) & \text{if } x \in (3, \infty),\ y \in (0, 3) \\
    1 & \text{if } x \in (3, \infty),\ y \in (3, \infty)
  \end{cases}

Let f(x, y) = c x^2 y for x^2 \le y \le 1.

Find:
a) c
b) P[X \ge Y]
c) f_X(x) and f_Y(y)
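The piecewise CDF for the triangle can be transcribed into a function and spot-checked. Clamping x and y at 3 reproduces the outer cases, since F is flat beyond the support (this clamping trick is our implementation choice, not the lecture's).

```python
def F(x, y):
    """CDF of the uniform density on the triangle {x, y > 0, x + y < 3}."""
    if x <= 0 or y <= 0:
        return 0.0
    x, y = min(x, 3.0), min(y, 3.0)    # F is constant beyond the support
    if x + y <= 3:
        return (2 / 9) * x * y
    return (2 / 9) * (x * y - (y - (3 - x)) * (x - (3 - y)) / 2)

# Spot checks: one value per case of the piecewise definition.
vals = (F(1.0, 1.0), F(2.0, 2.0), F(2.0, 5.0), F(4.0, 4.0))
```

For example, F(2, 5) clamps to F(2, 3) = (2/9)(3·2 − 2²/2) = 8/9, matching the fourth case.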
[Figure: perspective and contour plots of the joint density over x ∈ (−1, 1), y ∈ (0, 1).]
From part a), normalizing the density gives c = 21/4, so f(x, y) = \frac{21}{4} x^2 y for x^2 \le y \le 1.

P(X \ge Y) = \int_{-1}^{1} \int_{x^2}^{|x|} \frac{21}{4} x^2 y\, dy\, dx
= 2 \int_{0}^{1} \int_{x^2}^{x} \frac{21}{4} x^2 y\, dy\, dx
= \frac{42}{4} \int_{0}^{1} \frac{x^2 y^2}{2} \Big|_{x^2}^{x}\, dx
= \frac{42}{4} \int_{0}^{1} \left( \frac{x^4}{2} - \frac{x^6}{2} \right) dx
= \frac{42}{4} \left( \frac{x^5}{10} - \frac{x^7}{14} \right) \Big|_0^1
= \frac{42}{4} \left( \frac{1}{10} - \frac{1}{14} \right)
= \frac{42}{4} \cdot \frac{2}{70} = \frac{3}{10} = 0.3

f_X(x) = \int_{x^2}^{1} \frac{21}{4} x^2 y\, dy = \frac{21}{4} \frac{x^2 y^2}{2} \Big|_{x^2}^{1} = \frac{21}{8} \left( x^2 - x^6 \right), \quad \text{for } x \in (-1, 1)

f_Y(y) = \int_{-\sqrt{y}}^{\sqrt{y}} \frac{21}{4} x^2 y\, dx = \frac{21}{4} \frac{x^3 y}{3} \Big|_{-\sqrt{y}}^{\sqrt{y}} = \frac{21}{4} \cdot \frac{2 y^{5/2}}{3} = \frac{7}{2} y^{5/2}, \quad \text{for } y \in (0, 1)
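The value P(X ≥ Y) = 0.3 can be confirmed numerically. Doing the inner y-integral in closed form (∫ from x² to x of y dy = (x² − x⁴)/2) leaves a one-dimensional midpoint sum; this is our sketch, not lecture code.

```python
# Integrand after integrating out y, doubled for the symmetric x < 0 half:
# 2 * (21/4) * x^2 * (x^2 - x^4) / 2 = (21/4) (x^4 - x^6)
def outer(x):
    return 2 * (21 / 4) * x * x * (x * x - x ** 4) / 2

n = 100_000
h = 1.0 / n
prob = sum(outer((i + 0.5) * h) for i in range(n)) * h   # should be 0.3
```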
It is always a good idea to check that the marginals are proper densities.

\int_{-1}^{1} f_X(x)\, dx = \int_{-1}^{1} \frac{21}{8} \left( x^2 - x^6 \right) dx = \frac{21}{8} \left( \frac{x^3}{3} - \frac{x^7}{7} \right) \Big|_{-1}^{1} = \frac{21}{4} \left( \frac{1}{3} - \frac{1}{7} \right) = 1

\int_{0}^{1} f_Y(y)\, dy = \int_{0}^{1} \frac{7}{2} y^{5/2}\, dy = \frac{7}{2} \cdot \frac{2}{7} y^{7/2} \Big|_0^1 = 1

Let Y be the rate of calls at a help desk, and X the number of calls between 2 pm and 4 pm one day. Let's say that:

f(x, y) = \frac{(2y)^x}{x!} e^{-3y}, \quad \text{for } y > 0,\ x = 0, 1, 2, \ldots

Find:
a) P(X = 0)
b) P(Y > 2)
c) P[X = x] for all x
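The marginal-density check above can be mirrored numerically with midpoint sums (our sketch):

```python
# Both marginals of f(x, y) = (21/4) x^2 y should integrate to 1.
def f_X(x): return 21 / 8 * (x ** 2 - x ** 6)   # x in (-1, 1)
def f_Y(y): return 7 / 2 * y ** 2.5             # y in (0, 1)

n = 100_000
total_X = sum(f_X(-1.0 + (i + 0.5) * 2.0 / n) for i in range(n)) * 2.0 / n
total_Y = sum(f_Y((i + 0.5) / n) for i in range(n)) / n
```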
Example 4.c

f(x, y) = \frac{(2y)^x}{x!} e^{-3y}, \quad \text{for } y > 0,\ x = 0, 1, 2, \ldots

P(X = x) = \int_0^{\infty} f(x, y)\, dy = \int_0^{\infty} \frac{(2y)^x}{x!} e^{-3y}\, dy = \frac{2^x}{x!} \int_0^{\infty} y^x e^{-3y}\, dy = \frac{2^x}{x!} \cdot \frac{x!}{3^{x+1}} = \frac{2^x}{3^{x+1}}

That is, P(X = x) = \frac{1}{3} \left( \frac{2}{3} \right)^x, a Geometric distribution with success probability 1/3 on x = 0, 1, 2, \ldots
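A numeric sketch of part c: integrating out y should reproduce 2^x / 3^(x+1), and the resulting pmf (Geometric with p = 1/3) should sum to 1. The helper names and the truncation at y = 40 are our choices.

```python
from math import exp, factorial

def integrand(x, y):
    return (2 * y) ** x * exp(-3 * y) / factorial(x)

def P(x, n=200_000, y_max=40.0):
    """Midpoint approximation of the y-integral; the e^(-3y) tail
    beyond y_max is negligible."""
    h = y_max / n
    return sum(integrand(x, (j + 0.5) * h) for j in range(n)) * h

pmf = [2 ** x / 3 ** (x + 1) for x in range(60)]   # closed-form answer
```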