
Section 5.1, 5.2 Joint Distributions of Continuous RVs

Lecture 18: Joint Distributions

Statistics 104

Colin Rundel

March 28, 2012

Example 5

Let θ ∼ Unif(0, 2π) and X = cos(θ), Y = sin(θ). Find:

a) P[X + Y > 1]

b) P[Y > 1/2]

c) Let θ ∼ Exp(1) and answer a) and b)

Statistics 104 (Colin Rundel) Lecture 17 March 28, 2012 1 / 10

Example 5.a

Let θ ∼ Unif(0, 2π) and X = cos(θ), Y = sin(θ).

fθ(z) = 1/(2π), for z ∈ (0, 2π)

Since cos(z) + sin(z) > 1 exactly when z ∈ (0, π/2) on (0, 2π),

P[X + Y > 1] = ∫_{z: X+Y>1} fθ(z) dz
             = ∫_0^{π/2} 1/(2π) dz
             = [z/(2π)]_0^{π/2}
             = 1/4

Example 5.b

Let θ ∼ Unif(0, 2π) and X = cos(θ), Y = sin(θ).

Since sin(z) > 1/2 exactly when z ∈ (π/6, 5π/6) on (0, 2π),

P[Y > 1/2] = ∫_{z: Y>1/2} fθ(z) dz
           = ∫_{π/6}^{5π/6} 1/(2π) dz
           = [z/(2π)]_{π/6}^{5π/6}
           = 5/12 − 1/12 = 1/3
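The two answers above (1/4 and 1/3) are easy to sanity-check by simulation; a minimal sketch, assuming NumPy is available:

```python
import numpy as np

# Draw theta ~ Unif(0, 2*pi) and map to (X, Y) on the unit circle
rng = np.random.default_rng(0)
theta = rng.uniform(0.0, 2.0 * np.pi, size=1_000_000)
x, y = np.cos(theta), np.sin(theta)

p_a = np.mean(x + y > 1)  # Example 5.a: exact value is 1/4
p_b = np.mean(y > 0.5)    # Example 5.b: exact value is 1/3
print(p_a, p_b)
```

With a million draws the Monte Carlo error is on the order of 0.001, so both estimates land well within 0.01 of the exact values.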
Example 5.c

Let θ ∼ Exp(1) and X = cos(θ), Y = sin(θ).

fθ(z) = e^(−z), for z ∈ (0, ∞)

On (0, ∞) the region {z : X + Y > 1} is now the union of the intervals (2πk, 2πk + π/2) for k = 0, 1, 2, . . ., so

P[X + Y > 1] = ∫_{z: X+Y>1} fθ(z) dz
             = ∫_0^{π/2} e^(−z) dz + ∫_{2π}^{2π+π/2} e^(−z) dz + · · ·
             = [−e^(−z)]_0^{π/2} + [−e^(−z)]_{2π}^{2π+π/2} + · · ·
             = e^0 − e^(−π/2) + e^(−2π) − e^(−(2π+π/2)) + · · ·
             ≈ 0.7936

Similarly, {z : Y > 1/2} is the union of the intervals (2πk + π/6, 2πk + 5π/6), so

P[Y > 1/2] = ∫_{z: Y>1/2} fθ(z) dz
           = ∫_{π/6}^{5π/6} e^(−z) dz + ∫_{2π+π/6}^{2π+5π/6} e^(−z) dz + · · ·
           = [−e^(−z)]_{π/6}^{5π/6} + [−e^(−z)]_{2π+π/6}^{2π+5π/6} + · · ·
           = e^(−π/6) − e^(−5π/6) + e^(−(2π+π/6)) − e^(−(2π+5π/6)) + · · ·
           ≈ 0.5204

Example 6 - 5.2.9

Consider a fishing experiment where we catch λ fish per hour.

Let X, Y be the times of the first and second fish-catch events respectively.

Find:

a) f(x, y)

b) fX(x), fY(y)

c) Are X and Y independent?

d) P[Y > X + 2]; what does this tell us about Y?
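The two infinite sums in Example 5.c are geometric series — each period of length 2π contributes the same mass, damped by a factor e^(−2π) — so both have closed forms. A quick check using only the standard library:

```python
import math

# Common ratio: each full period 2*pi damps the contribution by e^(-2*pi)
r = math.exp(-2.0 * math.pi)

# P[X + Y > 1] = sum_k [e^(-2*pi*k) - e^(-(2*pi*k + pi/2))]
p_a = (1.0 - math.exp(-math.pi / 2)) / (1.0 - r)

# P[Y > 1/2] = sum_k [e^(-(2*pi*k + pi/6)) - e^(-(2*pi*k + 5*pi/6))]
p_b = (math.exp(-math.pi / 6) - math.exp(-5 * math.pi / 6)) / (1.0 - r)

print(round(p_a, 4), round(p_b, 4))  # 0.7936 0.5204
```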
Example 6.a

Let X, Y be the times of the first and second fish-catch events respectively (λ = 1).

We can also think of this as M, N ∼ Exp(1) independent, where X = min(M, N) and Y = max(M, N).

P(X ∈ (x, x + ε), Y ∈ (y, y + ε))
  = P(M ∈ (x, x + ε), N ∈ (y, y + ε)) + P(N ∈ (x, x + ε), M ∈ (y, y + ε))
  = P(M ∈ (x, x + ε)) P(N ∈ (y, y + ε)) + P(N ∈ (x, x + ε)) P(M ∈ (y, y + ε))

f(x, y) = fM(x) fN(y) + fM(y) fN(x)
        = e^(−x) e^(−y) + e^(−y) e^(−x)
        = 2e^(−x−y), for 0 ≤ x ≤ y < ∞

Example 6.a, cont.

As always, it is good to check that this is a proper joint density:

∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = ∫_0^∞ ∫_0^y 2e^(−x−y) dx dy
  = ∫_0^∞ [−2e^(−x−y)]_0^y dy
  = ∫_0^∞ (2e^(−y) − 2e^(−2y)) dy
  = [−2e^(−y) + e^(−2y)]_0^∞
  = 2 − 1 = 1
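The same normalization check can be done numerically; a minimal sketch using a midpoint-rule double sum over the triangle 0 ≤ x ≤ y, with the assumption that truncating at y = 40 makes the neglected tail negligible:

```python
import numpy as np

h = 0.01                              # grid step
ys = np.arange(h / 2, 40.0, h)        # midpoints in y
total = 0.0
for yv in ys:
    xs = np.arange(h / 2, yv, h)      # midpoints in x over (0, yv)
    total += np.sum(2.0 * np.exp(-xs - yv)) * h * h

print(total)  # close to 1 (small error from the grid near the x = y boundary)
```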
Example 6.b

fX(x) = ∫_{−∞}^{∞} f(x, y) dy
      = ∫_x^∞ 2e^(−x−y) dy
      = [−2e^(−x−y)]_x^∞
      = 2e^(−2x)

fY(y) = ∫_{−∞}^{∞} f(x, y) dx
      = ∫_0^y 2e^(−x−y) dx
      = [−2e^(−x−y)]_0^y
      = 2(e^(−y) − e^(−2y))

Example 6.c

If X and Y are independent then f(x, y) = fX(x) fY(y). But

fX(x) fY(y) = 2e^(−2x) · 2(e^(−y) − e^(−2y)) = 4(e^(−2x−y) − e^(−2x−2y))
            ≠ 2e^(−x−y) = f(x, y)

Therefore X and Y are not independent.
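A simulation view of the marginals and the dependence, using the min/max construction from Example 6.a (a sketch, assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(1)
m = rng.exponential(1.0, size=500_000)
n = rng.exponential(1.0, size=500_000)
x, y = np.minimum(m, n), np.maximum(m, n)

# fX(x) = 2e^(-2x) is an Exp(2) density, so E[X] should be 1/2;
# integrating y against fY(y) = 2(e^(-y) - e^(-2y)) gives E[Y] = 3/2
print(x.mean(), y.mean())

# A clearly nonzero correlation is consistent with 6.c: X and Y are dependent
print(np.corrcoef(x, y)[0, 1])
```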


Example 6.d

P[Y > X + 2] = ∫∫_{Y>X+2} f(x, y) dx dy
  = ∫_2^∞ ∫_0^{y−2} 2e^(−x−y) dx dy
  = ∫_2^∞ [−2e^(−x−y)]_0^{y−2} dy
  = ∫_2^∞ (2e^(−y) − 2e^(−2y+2)) dy
  = [−2e^(−y) + e^(−2y+2)]_2^∞
  = 2e^(−2) − e^(−2)
  = e^(−2) ≈ 0.135
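The answer e^(−2) = P[Exp(1) > 2] hints at what this tells us about Y: by memorylessness, the gap Y − X between the first and second catch behaves like a fresh Exp(1) waiting time. A quick Monte Carlo check (a sketch, assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(2)
m = rng.exponential(1.0, size=1_000_000)
n = rng.exponential(1.0, size=1_000_000)
x, y = np.minimum(m, n), np.maximum(m, n)

# Should be close to e^(-2) ~ 0.1353, matching P[Exp(1) > 2]
print(np.mean(y - x > 2))
```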
