Chapter 6: Joint Distributions
Previous chapters dealt with univariate distributions. Many applications, however, involve distributions of two or more random variables.
In order to study distributions of two random variables, we have the following definitions.
We start with two discrete random variables.
Definition 6.1 Let X and Y be two discrete random variables defined over the same probability space. The joint probability mass function (joint pmf) of X and Y is the function
$$f_{X,Y}(x, y) = P[X = x, Y = y] = P[\{\omega \in \Omega : X(\omega) = x \text{ and } Y(\omega) = y\}].$$
We also define the joint range of X and Y to be the set
$$\operatorname{ran}(X, Y) = \{(x, y) \in \mathbb{R}^2 : f_{X,Y}(x, y) > 0\}.$$
Remark: The joint pmf of X and Y should satisfy the following properties:
1. $0 \le f_{X,Y}(x, y) \le 1$
2. $\displaystyle\sum_y \sum_x f_{X,Y}(x, y) = 1$
3. $\displaystyle P[(X, Y) \in A] = \sum_{(x, y) \in A \cap \operatorname{ran}(X, Y)} f_{X,Y}(x, y), \quad A \subseteq \mathbb{R}^2.$
Definition 6.2 Two random variables X and Y are said to be jointly continuous if there exists a non-negative function $f_{X,Y} : \mathbb{R}^2 \to \mathbb{R}$ such that for any subset $D \subseteq \mathbb{R}^2$,
$$P[(X, Y) \in D] = \iint_D f_{X,Y}(x, y)\, dA.$$
Such a function is called the joint probability density function (joint pdf) of X and Y. In this case, the joint range of X and Y is defined as
$$\operatorname{ran}(X, Y) = \{(x, y) \in \mathbb{R}^2 : f_{X,Y}(x, y) > 0\}.$$
Remark:
1. Just like in the discrete case, the joint pdf of X and Y must satisfy
$$\iint_{\mathbb{R}^2} f_{X,Y}(x, y)\, dA = \iint_{\operatorname{ran}(X,Y)} f_{X,Y}(x, y)\, dA = 1.$$
2. In general, two random variables do not both need to be continuous to have a continuous joint pdf. But for the sake of clarity and coherence, when we talk about a continuous joint distribution, we shall assume that both random variables are continuous.
Examples.
1. Suppose the random variable X represents the number of accidents in town A and Y the number of accidents in town B, with joint pmf
$$p(x, y) = \frac{e^{-2}}{x!\, y!}, \quad x = 0, 1, 2, \ldots;\ y = 0, 1, 2, \ldots$$
Verifying,
$$\sum_y \sum_x \frac{e^{-2}}{x!\, y!} = e^{-2} \sum_{y=0}^{\infty} \frac{1}{y!} \sum_{x=0}^{\infty} \frac{1}{x!} = e^{-2}(e)(e) = 1.$$
Suppose we want to find the probability that there will be 1 accident in town A and 2 accidents in town B. This is given by $p(1, 2) = \dfrac{e^{-2}}{1!\, 2!} = \dfrac{1}{2e^2}$.
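As a quick numerical sanity check, one can truncate the double series and evaluate p(1, 2) directly; the following Python sketch is illustrative only:

```python
import math

# Truncate the double series at N terms in each index; the tail is negligible.
N = 50
total = sum(math.exp(-2) / (math.factorial(x) * math.factorial(y))
            for x in range(N) for y in range(N))
print(total)  # ~1.0, confirming p is a valid joint pmf

p_12 = math.exp(-2) / (math.factorial(1) * math.factorial(2))
print(p_12, 1 / (2 * math.e**2))  # both ~0.0677
```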
2. Suppose that the joint pdf of X and Y is given by
$$f(x, y) = \begin{cases} x + cy^2, & 0 \le x, y \le 1 \\ 0, & \text{otherwise.} \end{cases}$$
For f to be a valid joint pdf we need $\displaystyle\int_0^1 \int_0^1 (x + cy^2)\, dx\, dy = \frac{1}{2} + \frac{c}{3} = 1$, so $c = \frac{3}{2}$.
3. Consider rolling a pair of unbiased dice. Each outcome has probability 1/36. Let X be the smaller and Y the larger outcome on a roll. The joint pmf of X and Y is given by
$$p(x, y) = \begin{cases} \dfrac{1}{36}, & 1 \le x = y \le 6 \\[6pt] \dfrac{2}{36}, & 1 \le x < y \le 6. \end{cases}$$
Y\X             1      2      3      4      5      6    row total
 1            1/36                                        1/36
 2            2/36   1/36                                 3/36
 3            2/36   2/36   1/36                          5/36
 4            2/36   2/36   2/36   1/36                   7/36
 5            2/36   2/36   2/36   2/36   1/36            9/36
 6            2/36   2/36   2/36   2/36   2/36   1/36    11/36
column total 11/36   9/36   7/36   5/36   3/36   1/36       1
The row and column totals of the values in the table above are also listed. The column totals are the respective probabilities that X = x, x = 1, 2, . . . , 6. Similarly, the row totals are the respective probabilities that Y = y, y = 1, 2, . . . , 6. Moreover, observe that all these values are at most 1 and that the row totals and the column totals each sum to 1. These give rise to the concept of marginal probabilities.
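These marginal probabilities can be checked by enumerating the 36 equally likely ordered rolls; the sketch below (illustrative) rebuilds the joint pmf and its row and column totals exactly:

```python
from collections import Counter
from fractions import Fraction

# Enumerate the 36 equally likely ordered rolls of two dice.
joint = Counter()
for a in range(1, 7):
    for b in range(1, 7):
        x, y = min(a, b), max(a, b)  # X = smaller, Y = larger outcome
        joint[(x, y)] += Fraction(1, 36)

# Column totals p_X(x) and row totals p_Y(y).
p_X = {x: sum(p for (xx, _), p in joint.items() if xx == x) for x in range(1, 7)}
p_Y = {y: sum(p for (_, yy), p in joint.items() if yy == y) for y in range(1, 7)}
print(p_X)                  # 11/36, 9/36, ..., 1/36 (in lowest terms)
print(p_Y)                  # 1/36, 3/36, ..., 11/36 (in lowest terms)
print(sum(joint.values()))  # 1
```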
The joint pmf/pdf of X and Y contains all information regarding the distributions of both
X and Y , i.e., we can obtain the pmf/pdf of X and Y from their joint pmf/pdf.
Definition 6.3 Let X and Y have the joint pmf pX,Y (x, y). The respective marginal
probability mass functions (marginal pmf ) of X and Y are defined as
$$p_X(x) = \sum_y p_{X,Y}(x, y) \quad \text{and} \quad p_Y(y) = \sum_x p_{X,Y}(x, y).$$
Definition 6.4 Let X and Y have the joint pdf fX,Y (x, y). The respective marginal
probability density functions (marginal pdf ) of X and Y are defined as
$$f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\, dy \quad \text{and} \quad f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\, dx.$$
Examples.
1. Given $p(x, y) = \dfrac{e^{-2}}{x!\, y!}$, x, y = 0, 1, 2, . . .,
$$p_X(x) = \sum_y \frac{e^{-2}}{x!\, y!} = \frac{e^{-2}}{x!} \sum_{y=0}^{\infty} \frac{1}{y!} = \frac{e^{-1}}{x!}, \quad x = 0, 1, \ldots$$
Similarly, $p_Y(y) = \dfrac{e^{-1}}{y!}$, y = 0, 1, . . . Hence $X, Y \sim \mathrm{Poi}(1)$.
2. For the pair of dice in the earlier example, the marginal pmfs are precisely the column and row totals:
$$p_X(x) = \frac{12 - (2x - 1)}{36}, \quad x = 1, \ldots, 6, \qquad p_Y(y) = \frac{2y - 1}{36}, \quad y = 1, \ldots, 6.$$
Definition 6.5 Let X and Y be random variables. The joint cumulative distribution function of X and Y, denoted by $F_{X,Y}(x, y)$ or simply F(x, y), is given by
$$F_{X,Y}(x, y) = P[X \le x, Y \le y].$$
Remark: If X and Y are continuous random variables with joint pdf f and joint cdf F, then
$$F(x, y) = \int_{-\infty}^{y} \int_{-\infty}^{x} f(s, t)\, ds\, dt$$
and
$$f(x, y) = \frac{\partial^2}{\partial x\, \partial y} F(x, y).$$
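To illustrate the remark, take as an assumed example the joint cdf $F(x, y) = (1 - e^{-x})(1 - e^{-2y})$ for x, y > 0 (two independent exponentials); a short sympy sketch confirms both identities:

```python
import sympy as sp

x, y = sp.symbols('x y', positive=True)
# Assumed joint cdf, for illustration: independent Exp(1) and Exp(2).
F = (1 - sp.exp(-x)) * (1 - sp.exp(-2*y))

f = sp.diff(F, x, y)   # f = d^2 F / (dx dy)
print(sp.simplify(f))  # 2*exp(-x - 2*y)

# Integrating f back over (0, x) x (0, y) recovers F.
F_back = sp.integrate(sp.integrate(f, (y, 0, y)), (x, 0, x))
print(sp.simplify(F_back - F))  # 0
```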
Theorem 6.6 The joint cdf F(x, y) of any two random variables X and Y satisfies the following properties:
1. $0 \le F(x, y) \le 1$.
2. $F$ is non-decreasing in each of $x$ and $y$.
3. $\lim_{x \to -\infty} F(x, y) = \lim_{y \to -\infty} F(x, y) = 0$ and $\lim_{x, y \to \infty} F(x, y) = 1$.
4. For $x_1 < x_2$ and $y_1 < y_2$,
$$P(x_1 < X \le x_2,\ y_1 < Y \le y_2) = F(x_2, y_2) - F(x_2, y_1) - F(x_1, y_2) + F(x_1, y_1) \ge 0.$$
5. $F$ is right-continuous in each variable:
$$\lim_{h \to 0^+} F(x + h, y) = \lim_{h \to 0^+} F(x, y + h) = F(x, y).$$
6. $F_X(x) = \lim_{y \to \infty} F(x, y)$ is the marginal cdf of X.
7. $F_Y(y) = \lim_{x \to \infty} F(x, y)$ is the marginal cdf of Y.
Examples:
1. Consider tossing a pair of fair tetrahedra (four-sided dice) with sides labelled 1 to 4. Let X denote the number on the down-turned face of the first tetrahedron and Y the larger of the numbers on the down-turned faces. Find the joint pmf of X and Y.
Solution: Assuming that the experiment is conducted fairly, each of the 16 possible pairs of down-turned faces has probability 1/16. For each pair (x, y) we count the outcomes wherein the number on the down-turned face of the first tetrahedron is x and the larger of the two numbers is y. This gives p(x, y) = 1/16 for 1 ≤ x < y ≤ 4, and p(x, x) = x/16 for x = 1, 2, 3, 4.
2. Let X and Y be jointly continuous with joint pdf $f(x, y) = 2e^{-x} e^{-2y}$, x, y > 0. Find P(X > 1, Y < 1), P(X < Y), and P(X < a), a > 0.
Solution:
$$P(X > 1, Y < 1) = \int_0^1 \int_1^{\infty} 2e^{-x} e^{-2y}\, dx\, dy = \int_0^1 2e^{-2y} \left[ -e^{-x} \right]_1^{\infty} dy = e^{-1} \int_0^1 2e^{-2y}\, dy = \frac{1 - e^{-2}}{e}.$$
$$P(X < Y) = \int_0^{\infty} \int_0^{y} 2e^{-x} e^{-2y}\, dx\, dy = \int_0^{\infty} 2e^{-2y} (1 - e^{-y})\, dy = \frac{1}{3}.$$
$$P(X < a) = \int_0^{a} \int_0^{\infty} 2e^{-x} e^{-2y}\, dy\, dx = \int_0^{a} e^{-x}\, dx = 1 - e^{-a}.$$
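All three values can be confirmed numerically; the following scipy sketch (illustrative, with a = 5 chosen arbitrarily for the last probability) agrees with the closed forms:

```python
import numpy as np
from scipy.integrate import dblquad

# dblquad(func, a, b, g, h) integrates func(inner, outer) for outer in (a, b)
# and inner in (g(outer), h(outer)).
f = lambda x, y: 2 * np.exp(-x) * np.exp(-2 * y)  # joint pdf

# P(X > 1, Y < 1): outer y in (0, 1), inner x in (1, inf).
p1, _ = dblquad(lambda x, y: f(x, y), 0, 1, 1, np.inf)
# P(X < Y): outer y in (0, inf), inner x in (0, y).
p2, _ = dblquad(lambda x, y: f(x, y), 0, np.inf, 0, lambda y: y)
# P(X < a) with a = 5: outer x in (0, a), inner y in (0, inf).
a = 5.0
p3, _ = dblquad(lambda y, x: f(x, y), 0, a, 0, np.inf)

print(p1, (1 - np.exp(-2)) / np.e)  # ~0.3181 both
print(p2, 1 / 3)                    # ~0.3333 both
print(p3, 1 - np.exp(-a))           # ~0.9933 both
```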
3. Let
$$f(x, y) = \begin{cases} K(x + y), & 0 < x < 1,\ 0 < y < 1 \\ 0, & \text{elsewhere.} \end{cases}$$
Solve for K, $f_X(x)$, $P\left(X < \tfrac{1}{2}\right)$, and $P(X + Y \le 1)$.
Solution:
$$1 = \int_0^1 \int_0^1 K(x + y)\, dx\, dy = K \int_0^1 \left( \frac{1}{2} + y \right) dy = K,$$
so K = 1.
$$f_X(x) = \int_0^1 (x + y)\, dy = \left[ xy + \frac{y^2}{2} \right]_0^1 = x + \frac{1}{2}, \quad 0 < x < 1.$$
$$P\left(X < \tfrac{1}{2}\right) = \int_0^{1/2} f_X(x)\, dx = \int_0^{1/2} \left( x + \frac{1}{2} \right) dx = \frac{3}{8}.$$
$$P(X + Y \le 1) = \int_0^1 \int_0^{1-x} (x + y)\, dy\, dx = \int_0^1 \left[ x(1 - x) + \frac{(1 - x)^2}{2} \right] dx = \frac{1}{3}.$$
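A symbolic check of all four computations (an illustrative sympy sketch):

```python
import sympy as sp

x, y, K = sp.symbols('x y K', positive=True)

f = K * (x + y)
# Total probability 1 determines K.
K_val = sp.solve(sp.Eq(sp.integrate(f, (x, 0, 1), (y, 0, 1)), 1), K)[0]
print(K_val)  # 1

f = f.subs(K, K_val)
fX = sp.integrate(f, (y, 0, 1))
print(sp.expand(fX))                                # x + 1/2
print(sp.integrate(fX, (x, 0, sp.Rational(1, 2))))  # 3/8
print(sp.integrate(f, (y, 0, 1 - x), (x, 0, 1)))    # 1/3, i.e. P(X + Y <= 1)
```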
Theorem 6.8 Let X and Y be independent random variables. Then the following are satisfied:
1. $F_{X,Y}(x, y) = F_X(x) \cdot F_Y(y)$, where $F_X$ and $F_Y$ are the marginal cdf's of X and Y, resp.
2. If $p_{X,Y}$ is the joint pmf of X and Y, then $p_{X,Y}(x, y) = P(X = x, Y = y) = p_X(x) \cdot p_Y(y)$.
3. If $f_{X,Y}$ is the joint pdf of X and Y, then $f_{X,Y}(x, y) = f_X(x) \cdot f_Y(y)$.
Example: Let X and Y be independent uniformly distributed random variables in (0, 60).
[Figure: the square (0, 60) × (0, 60), showing the triangular region R1 where x + 10 < y and the triangular region R2 where y + 10 < x.]
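The sketched regions suggest the classic question of how likely the two values are to fall within 10 units of each other. Assuming that is the intended computation, a Monte Carlo sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.uniform(0, 60, n)
y = rng.uniform(0, 60, n)

p_r1 = np.mean(x + 10 < y)  # region R1
p_r2 = np.mean(y + 10 < x)  # region R2
# Each region is a right triangle with legs 50, so the exact total is (50/60)^2.
print(p_r1 + p_r2, (50 / 60) ** 2)     # ~0.694
print(1 - (p_r1 + p_r2), 1 - 25 / 36)  # P(|X - Y| <= 10) ~ 0.306
```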
Exercises:
1. Identify whether X and Y are independent given the joint pmf/pdf f(x, y).
a. $f(x, y) = \dfrac{xy^2}{30}$, x = 1, 2, 3, y = 1, 2
b. $f(x, y) = \dfrac{xy^2}{13}$, (x, y) = (1, 1), (1, 2), (2, 2)
The definition of joint distributions and independence can be extended to n random variables.
Definition 6.10 If X and Y are both discrete or continuous random variables with joint pmf/pdf f(x, y), then the conditional probability mass/density function of Y given X = x is defined to be
$$f(y|x) = \frac{f(x, y)}{f_X(x)},$$
given $f_X(x) > 0$.
Similarly, the conditional pmf/pdf of X given Y = y is
$$f(x|y) = \frac{f(x, y)}{f_Y(y)},$$
given $f_Y(y) > 0$.
Remark. If X and Y are independent random variables, then the conditional density of Y given X = x is simply the marginal density of Y:
$$f(y|x) = \frac{f_X(x) f_Y(y)}{f_X(x)} = f_Y(y).$$
Remark:
1. The conditional cdf of Y given X = x is given by
$$F_{Y|X}(y|x) = \begin{cases} \displaystyle\sum_{\{j : y_j \le y\}} p(y_j|x), & X, Y \text{ discrete} \\[8pt] \displaystyle\int_{-\infty}^{y} f(z|x)\, dz, & X, Y \text{ continuous.} \end{cases}$$
2. $$P[a < Y < b \mid X = x] = \int_a^b f(y|x)\, dy$$
and
$$P[c < X < d \mid Y = y] = \int_c^d f(x|y)\, dx.$$
Examples:
2. If $f(x, y) = xe^{-x(y+1)}$, x, y ≥ 0, find f(y|x).
Solution:
$$f(y|x) = \frac{xe^{-x(y+1)}}{\displaystyle\int_0^{\infty} xe^{-x(y+1)}\, dy} = \frac{xe^{-x(y+1)}}{e^{-x}} = xe^{-xy}.$$
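A quick symbolic confirmation (illustrative):

```python
import sympy as sp

x = sp.symbols('x', positive=True)
y = sp.symbols('y', nonnegative=True)

joint = x * sp.exp(-x * (y + 1))
fX = sp.integrate(joint, (y, 0, sp.oo))  # marginal pdf of X
print(sp.simplify(fX))                   # exp(-x)
print(sp.simplify(joint / fX))           # x*exp(-x*y): Y | X = x is Exp(x)
```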
3. Suppose $f(x, y) = (x + y)\, I_{(0,1)}(x)\, I_{(0,1)}(y)$. Find f(y|x) and F(y|x).
Solution:
$$f(y|x) = \frac{x + y}{\displaystyle\int_0^1 (x + y)\, dy} = \frac{x + y}{x + \frac{1}{2}}.$$
Then
$$F(y|x) = \int_{-\infty}^{y} f(z|x)\, dz = \int_0^{y} \frac{x + z}{x + \frac{1}{2}}\, dz = \frac{1}{x + \frac{1}{2}} \left( xy + \frac{y^2}{2} \right), \quad 0 < y < 1.$$
Exercises:
1. Let
$$f(x, y) = \begin{cases} 6x^2 y, & 0 < x < 1,\ 0 < y < 1 \\ 0, & \text{elsewhere} \end{cases}$$
be the joint pdf of X and Y. Find $P(0 < X < \tfrac{3}{4},\ \tfrac{1}{3} < Y < 1)$.
2. Given joint pdf
$$f(x, y) = \begin{cases} 6(1 - x - y), & 0 < x, y < 1,\ 0 < x + y \le 1 \\ 0, & \text{o.w.,} \end{cases}$$
find $P(0 \le X \le \tfrac{1}{2})$.
3. Let X and Y have joint pdf $f(x, y) = ke^{-x}e^{-y}$ for $0 < x < y < \infty$. Find k and the marginal pdf of X, x > 0.
4. Suppose $f(x, y) = k(x^2 + y^2)$ is a joint pdf of continuous random variables X and Y over the unit square bounded by (0, 0), (1, 0), (0, 1), and (1, 1). Find $P(X + Y \ge 1)$.
5. The joint cdf of X and Y is $F(x, y) = \tfrac{1}{5}(3x^3 y + 2x^2 y^2)$, 0 < x, y < 1. Find $f\left(\tfrac{1}{2}, \tfrac{1}{2}\right)$.
[Figure: region sketch over 0 ≤ x ≤ 1 showing the line y = 1/2 − x.]
6.4 Expectation
Now we discuss the joint, marginal, and conditional expectations of several random variables.
Definition 6.12 Given a function g(X, Y) of jointly distributed random variables X and Y with defined joint density function, the expectation of g(X, Y) is defined as
$$E[g(X, Y)] = \begin{cases} \displaystyle\sum_y \sum_x g(x, y)\, p(x, y), & X, Y \text{ discrete} \\[8pt] \displaystyle\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y)\, f(x, y)\, dx\, dy, & X, Y \text{ continuous.} \end{cases}$$
To define marginal expectations, one makes use of the marginal density functions, defined as follows.
Definition 6.13 Given jointly distributed random variables X and Y with defined joint density function, the marginal expectation of X is
$$E[X] = \begin{cases} \displaystyle\sum_x x\, f_X(x), & X, Y \text{ discrete} \\[8pt] \displaystyle\int_{-\infty}^{\infty} x\, f_X(x)\, dx, & X, Y \text{ continuous.} \end{cases}$$
Similarly,
$$E[Y] = \begin{cases} \displaystyle\sum_y y\, f_Y(y), & X, Y \text{ discrete} \\[8pt] \displaystyle\int_{-\infty}^{\infty} y\, f_Y(y)\, dy, & X, Y \text{ continuous.} \end{cases}$$
Examples:
1. Given $f(x, y) = 6xy^2$, 0 < x < 1, 0 < y < 1, find E(XY).
Solution:
$$E(XY) = \int_0^1 \int_0^1 xy \cdot 6xy^2\, dx\, dy = \frac{1}{2}.$$
2. Let $f(x, y) = (x + y)\, I_{(0,1)}(x)\, I_{(0,1)}(y)$. Find E(XY), E(X + Y), E(X), and E(Y).
Solution:
$$E(XY) = \int_0^1 \int_0^1 xy(x + y)\, dx\, dy = \frac{1}{3}$$
$$E(X + Y) = \int_0^1 \int_0^1 (x + y)^2\, dx\, dy = \frac{7}{6}$$
$$E(X) = \int_0^1 x f_X(x)\, dx = \int_0^1 \int_0^1 x(x + y)\, dy\, dx = \frac{7}{12}$$
$$E(Y) = \int_0^1 y f_Y(y)\, dy = \int_0^1 \int_0^1 y(x + y)\, dx\, dy = \frac{7}{12}.$$
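All four expectations are easy to confirm symbolically (illustrative sketch):

```python
import sympy as sp

x, y = sp.symbols('x y')
f = x + y  # joint pdf on the unit square

E = lambda g: sp.integrate(g * f, (x, 0, 1), (y, 0, 1))
print(E(x * y))    # 1/3
print(E(x + y))    # 7/6
print(E(x), E(y))  # 7/12 and 7/12
```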
3. An accident occurs at a point X that is uniformly distributed on a road of
length L. At the time of the accident, an ambulance is at a location Y that is
also uniformly distributed on the road. Assuming X and Y are independent,
find the expected distance between the ambulance and the point of accident.
Solution: X ∼ U(0, L) and Y ∼ U(0, L). Since X and Y are independent,
$$f(x, y) = f_X(x) f_Y(y) = \frac{1}{L^2}.$$
So
$$E(|X - Y|) = \frac{1}{L^2} \int_0^L \int_0^L |x - y|\, dy\, dx = \frac{1}{L^2} \int_0^L \left[ \int_0^x (x - y)\, dy + \int_x^L (y - x)\, dy \right] dx = \frac{1}{L^2} \int_0^L \left( \frac{L^2}{2} + x^2 - Lx \right) dx = \frac{L}{3}.$$
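A Monte Carlo check of E|X − Y| = L/3, with an arbitrary road length (illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
L = 7.5  # any road length; the expected distance should be L/3
x = rng.uniform(0, L, 1_000_000)
y = rng.uniform(0, L, 1_000_000)
print(np.mean(np.abs(x - y)), L / 3)  # both ~2.5
```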
Expectation acts as a linear operator on jointly distributed random variables as well. We look at the case when X and Y are jointly continuous. Indeed, for a, b ∈ ℝ,
$$E(aX + bY) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} (ax + by)\, f(x, y)\, dx\, dy = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} ax\, f(x, y)\, dy\, dx + \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} by\, f(x, y)\, dx\, dy = a \int_{-\infty}^{\infty} x\, f_X(x)\, dx + b \int_{-\infty}^{\infty} y\, f_Y(y)\, dy = aE(X) + bE(Y).$$
Theorem 6.14 Let X and Y be jointly distributed random variables. If X and Y are independent, then
$$E[g(X)h(Y)] = E[g(X)]\, E[h(Y)]$$
for any functions g and h for which the expectations above exist.
Proof: We prove only the continuous case. Let f(x, y) be the joint pdf of X and Y. Since X and Y are independent, $f(x, y) = f_X(x) f_Y(y)$. Then
$$E[g(X)h(Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x) h(y)\, f_X(x) f_Y(y)\, dx\, dy = \int_{-\infty}^{\infty} \left[ \int_{-\infty}^{\infty} g(x) f_X(x)\, dx \right] h(y) f_Y(y)\, dy = \int_{-\infty}^{\infty} E[g(X)]\, h(y) f_Y(y)\, dy = E[g(X)]\, E[h(Y)].$$
Suppose now that, with X and Y jointly distributed, we want the expectation of, say, Y given a fixed value of X, and vice versa. This gives rise to the notion of conditional expectation.
Definition 6.15 Let X and Y be jointly distributed random variables and g(Y) a function of Y. Then the conditional expectation of g(Y) given X = x is defined as
$$E[g(Y) \mid X = x] = \begin{cases} \displaystyle\sum_y g(y)\, p(y|x), & X, Y \text{ discrete} \\[8pt] \displaystyle\int_{-\infty}^{\infty} g(y)\, f(y|x)\, dy, & X, Y \text{ continuous.} \end{cases}$$
Examples:
1. Suppose the joint pdf of X and Y is f(x, y) = k, −1 < x < 1, x² < y < 1, with k constant. Find E(Y | X = x).
Solution: Solving for the marginal density function of X,
$$f_X(x) = \int_{x^2}^{1} k\, dy = k(1 - x^2), \quad -1 < x < 1.$$
Then,
$$E(Y \mid X = x) = \int_{x^2}^{1} y\, \frac{f(x, y)}{f_X(x)}\, dy = \int_{x^2}^{1} \frac{ky}{k - kx^2}\, dy = \frac{1 - x^4}{2(1 - x^2)} = \frac{1 + x^2}{2}.$$
2. Let X and Y be jointly distributed with pdf
$$f(x, y) = \begin{cases} 8xy, & 0 < x < y < 1 \\ 0, & \text{otherwise.} \end{cases}$$
Find E(X² | Y = y).
Solution: The marginal of Y is $f_Y(y) = \int_0^y 8xy\, dx = 4y^3$, so $f(x|y) = \dfrac{2x}{y^2}$ for 0 < x < y. Then,
$$E(X^2 \mid Y = y) = \int_0^y x^2 \cdot \frac{2x}{y^2}\, dx = \frac{y^2}{2}, \quad 0 < y < 1.$$
Now we give a theorem involving conditional expectations that will be helpful in our
discussion of the conditional variance. The proof is left as an exercise for the reader.
Theorem 6.17 Let X and Y be jointly distributed random variables, and g a function of Y. Then
$$E\big[E[g(Y) \mid X]\big] = E[g(Y)].$$
Definition 6.18 Let X and Y be jointly distributed. Then the variance of Y given X = x is defined as
$$\mathrm{Var}[Y \mid X = x] = \begin{cases} \displaystyle\sum_y [y - E(Y|x)]^2\, p(y|x), & X, Y \text{ discrete} \\[8pt] \displaystyle\int_{-\infty}^{\infty} [y - E(Y|x)]^2\, f(y|x)\, dy, & X, Y \text{ continuous.} \end{cases}$$
Alternatively, the conditional variance can be computed via the following.
Theorem 6.19 $\mathrm{Var}[Y \mid X = x] = E[Y^2 \mid X = x] - \big(E[Y \mid X = x]\big)^2.$
The proof is again left as an exercise but will use the same argument as the proof of the previous theorem.
Example: Let X and Y be jointly distributed with pdf $f(x, y) = \dfrac{1}{y}$, 0 < x < y < 1, so that $f(x|y) = \dfrac{1}{y}$ on (0, y). Then
$$E(X \mid Y = y) = \int_0^y x \cdot \frac{1}{y}\, dx = \frac{y}{2}, \quad 0 < y < 1,$$
and
$$\mathrm{Var}(X \mid Y = y) = \int_0^y \left( x - \frac{y}{2} \right)^2 \frac{1}{y}\, dx = \frac{y^2}{12}, \quad 0 < y < 1.$$
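Since f(x|y) = 1/y on (0, y) says that X | Y = y is uniform on (0, y), both formulas can be checked by simulation at any fixed y (illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
y = 0.8                           # condition on a fixed value Y = y
x = rng.uniform(0, y, 1_000_000)  # X | Y = y ~ U(0, y)
print(x.mean(), y / 2)            # both ~0.4
print(x.var(), y**2 / 12)         # both ~0.0533
```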
Exercises:
a. Find k such that f is a joint pdf of X and Y.
b. Find the marginal pdf of X and of Y.
c. Find the marginal means of X and of Y.
d. Find the marginal variances of X and of Y.
e. Find the conditional pdfs of X and Y.
6.5 Covariance and Correlation
Covariance and correlation both measure how much two random variables change together.
Definition 6.20 Let two random variables X and Y have means $\mu_X$ and $\mu_Y$, respectively, and be jointly distributed with pdf f(x, y). The covariance of X and Y, denoted $\sigma_{X,Y} = \mathrm{cov}(X, Y)$, is defined as
$$\mathrm{cov}(X, Y) = E[(X - \mu_X)(Y - \mu_Y)] = E[XY] - \mu_X \mu_Y.$$
Remark 6.21 Let X and Y be jointly distributed. We then have the following.
1. cov(X, Y ) = cov(Y, X)
Now we state a theorem that relates the variances of two random variables to their covariance.
Theorem 6.22 $\mathrm{Var}(X + Y) = \mathrm{Var}(X) + \mathrm{Var}(Y) + 2\,\mathrm{cov}(X, Y)$.
Proof: Expanding,
$$\mathrm{Var}(X + Y) = E[(X + Y)^2] - (E[X + Y])^2 = E[X^2] + 2E[XY] + E[Y^2] - (E[X])^2 - 2E[X]E[Y] - (E[Y])^2 = \mathrm{Var}(X) + \mathrm{Var}(Y) + 2\,\mathrm{cov}(X, Y).$$
Definition 6.24 The correlation coefficient of two random variables X and Y, denoted $\rho_{X,Y}$, is defined as
$$\rho_{X,Y} = \frac{\mathrm{cov}(X, Y)}{\sigma_X \sigma_Y},$$
where $\sigma_X$ and $\sigma_Y$ are the standard deviations of X and Y, respectively.
Examples:
1. Find cov(X, Y ) given the joint and marginal distribution values of X and Y .
Y\X       −1      0      1    f_Y(y)
 1       1/18   2/18   3/18    6/18
 0       2/18    0     3/18    5/18
−1       3/18   2/18   2/18    7/18
f_X(x)   6/18   4/18   8/18      1
Solution:
$$E(XY) = 1\left( -1 \cdot \tfrac{1}{18} + 0 + 1 \cdot \tfrac{3}{18} \right) + 0 + (-1)\left( -1 \cdot \tfrac{3}{18} + 0 + 1 \cdot \tfrac{2}{18} \right) = \frac{3}{18}.$$
$$E(X) = \sum_{x=-1}^{1} x f_X(x) = -1 \cdot \frac{6}{18} + 0 + 1 \cdot \frac{8}{18} = \frac{2}{18}.$$
$$E(Y) = \sum_{y=-1}^{1} y f_Y(y) = -1 \cdot \frac{7}{18} + 0 + 1 \cdot \frac{6}{18} = -\frac{1}{18}.$$
Then,
$$\mathrm{cov}(X, Y) = \frac{3}{18} - \left( \frac{2}{18} \right)\left( -\frac{1}{18} \right) = \frac{3}{18} + \frac{1}{162} = \frac{14}{81}.$$
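The computation can be replicated directly from the table, which also confirms the value 14/81 (an illustrative numpy sketch):

```python
import numpy as np

xs = np.array([-1, 0, 1])
ys = np.array([1, 0, -1])         # row order as in the table
p = np.array([[1, 2, 3],
              [2, 0, 3],
              [3, 2, 2]]) / 18    # p[i, j] = P(Y = ys[i], X = xs[j])

E_XY = (ys[:, None] * xs[None, :] * p).sum()
E_X = (xs * p.sum(axis=0)).sum()  # column totals give f_X
E_Y = (ys * p.sum(axis=1)).sum()  # row totals give f_Y
print(E_XY, E_X, E_Y)             # 1/6, 1/9, -1/18
print(E_XY - E_X * E_Y, 14 / 81)  # ~0.1728 both
```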
2. Let X and Y be continuous random variables with joint pdf
$$f(x, y) = 2, \quad 0 < y < 1 - x,\ 0 < x < 1.$$
Find cov(X, Y).
Solution:
$$E(XY) = \int_0^1 \int_0^{1-x} 2xy\, dy\, dx = \frac{1}{12}.$$
$$E(X) = \int_0^1 x \int_0^{1-x} 2\, dy\, dx = \int_0^1 2x(1 - x)\, dx = \frac{1}{3}.$$
$$E(Y) = \int_0^1 y \int_0^{1-y} 2\, dx\, dy = \frac{1}{3}.$$
Therefore,
$$\mathrm{cov}(X, Y) = E(XY) - E(X)E(Y) = \frac{1}{12} - \frac{1}{9} = -\frac{1}{36}.$$
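The same answer falls out of a short symbolic computation over the triangle (illustrative):

```python
import sympy as sp

x, y = sp.symbols('x y')
f = 2  # joint pdf on the triangle 0 < y < 1 - x, 0 < x < 1

E = lambda g: sp.integrate(g * f, (y, 0, 1 - x), (x, 0, 1))
print(E(x * y), E(x), E(y))    # 1/12, 1/3, 1/3
print(E(x * y) - E(x) * E(y))  # -1/36
```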
Exercises:
1. An insurance policy pays a total medical benefit consisting of two parts for each
claim. Let X represent the part of the benefit that is to be paid to the doctor,
and let Y represent the part to be paid to the hospital. The variance of X is
5,000, while the variance of Y is 10,000, and the variance of X + Y is 17,000.
Due to increasing medical costs, the insurance company decides to increase X
by a flat amount of 100 and increase Y by 10%. Calculate the variance of the
total benefit after the revisions.
2. Suppose X ∼ U(a, b), with b > a. Find the values of a and b if E[X] = 2 and $\mathrm{cov}(X, X^2) = \dfrac{16}{3}$.
3. Suppose that X and Y are jointly continuous random variables with joint pdf
$$f(x, y) = \begin{cases} k, & x \le y \le 2 - x,\ x \in [0, 1] \\ 0, & \text{elsewhere} \end{cases}$$
and marginal pdf
$$f_X(x) = \begin{cases} 2x, & x \in [0, 1] \\ 0, & \text{elsewhere.} \end{cases}$$