
Chapter 4: Pairs of Random Variables

NGUYỄN THỊ THU THỦY

SCHOOL OF APPLIED MATHEMATICS AND INFORMATICS
HANOI UNIVERSITY OF SCIENCE AND TECHNOLOGY

HANOI – 2019

Email: thuy.nguyenthithu2@hust.edu.vn

Nguyễn Thị Thu Thủy (SAMI-HUST) Pairs of Random Variables HANOI – 2019 1 / 74
Introduction

This chapter analyzes experiments that produce two random variables, X and Y:
1. FX,Y (x, y), the joint cumulative distribution function of two random variables.
2. PX,Y (x, y), the joint probability mass function of two discrete random variables.
3. fX,Y (x, y), the joint probability density function of two continuous random variables.
4. Functions of two random variables.
5. Conditional probability and independence.
4.1 Joint Cumulative Distribution Function

4.1.1 Definitions

Definition 4.1 (Joint Cumulative Distribution Function – CDF)

The joint cumulative distribution function of random variables X and Y is

FX,Y (x, y) = P [X < x, Y < y], x, y ∈ R. (4.1)
4.1.2 Properties

Remark 4.1
The joint CDF has properties that are direct consequences of the definition. For example,
the event {X < x} allows Y to take any value so long as the condition on X is met. This
corresponds to the joint event {X < x, Y < ∞}. Therefore,

FX (x) = P [X < x] = P [X < x, Y < ∞] = lim_{y→∞} FX,Y (x, y) = FX,Y (x, ∞). (4.2)
Theorem 4.1. For any pair of random variables X, Y,

(a) 0 ≤ FX,Y (x, y) ≤ 1,
(b) FX (x) = FX,Y (x, +∞),
(c) FY (y) = FX,Y (+∞, y),
(d) FX,Y (−∞, y) = FX,Y (x, −∞) = 0,
(e) if x < x1 and y < y1, then FX,Y (x, y) ≤ FX,Y (x1, y1),
(f) FX,Y (+∞, +∞) = 1,
(g) if x1 < x2 and y1 < y2, then

P [x1 ≤ X < x2, y1 ≤ Y < y2] = FX,Y (x2, y2) − FX,Y (x2, y1) − FX,Y (x1, y2) + FX,Y (x1, y1).
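Properties (b) and (g) can be sanity-checked numerically. The sketch below assumes a hypothetical joint CDF of two independent Exp(1) random variables, F(x, y) = (1 − e^{−x})(1 − e^{−y}) for x, y > 0; this example is not from the text, only an illustration of the identities.

```python
import math

def F(x, y):
    # Hypothetical joint CDF of two independent Exp(1) variables (illustration only).
    if x <= 0 or y <= 0:
        return 0.0
    return (1 - math.exp(-x)) * (1 - math.exp(-y))

# Property (b): the marginal CDF is the limit as y -> infinity;
# a very large argument stands in for +infinity here.
F_X = lambda x: F(x, 1e9)

# Property (g): probability of the rectangle [x1, x2) x [y1, y2).
def rect_prob(x1, x2, y1, y2):
    return F(x2, y2) - F(x2, y1) - F(x1, y2) + F(x1, y1)

p = rect_prob(0.5, 1.5, 0.5, 1.5)  # must lie in [0, 1] by (a)
```

For this product-form CDF, F_X(1) should equal 1 − e^{−1}, and the rectangle probability is automatically non-negative, in line with property (e).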
4.2 Joint Probability Mass Function

4.2.1 Definitions

Definition 4.2 (Joint Probability Mass Function – PMF)

The joint probability mass function of discrete random variables X and Y is

PX,Y (x, y) = P [X = x, Y = y]. (4.3)

Note
Corresponding to SX, the range of a single discrete random variable, we use the
notation SX,Y to denote the set of possible values of the pair (X, Y ). That is,

SX,Y = {(x, y) | PX,Y (x, y) > 0}. (4.4)
Definition 4.3 (Joint Probability Distribution Table – PDT)

X \ Y    y1           ...   yj           ...   yn           Σj
x1       p11          ...   p1j          ...   p1n          P [X = x1]
...      ...          ...   ...          ...   ...          ...
xi       pi1          ...   pij          ...   pin          P [X = xi]
...      ...          ...   ...          ...   ...          ...
xm       pm1          ...   pmj          ...   pmn          P [X = xm]
Σi       P [Y = y1]   ...   P [Y = yj]   ...   P [Y = yn]   1

where

pij = P [X = xi, Y = yj], i = 1, . . . , m, j = 1, . . . , n,

P [X = xi] = Σ_{j=1}^{n} pij, i = 1, . . . , m,    P [Y = yj] = Σ_{i=1}^{m} pij, j = 1, . . . , n.
Theorem 4.2
(a) 0 ≤ pij ≤ 1 for all i = 1, . . . , m, j = 1, . . . , n.
(b) Σ_{i=1}^{m} Σ_{j=1}^{n} pij = 1.
Remark 4.3
From (4.1) and Definition 4.3, the joint cumulative distribution function of discrete
random variables X and Y is

FX,Y (x, y) = Σ_{xi<x} Σ_{yj<y} pij. (4.5)
Example 4.1
Test two integrated circuits one after the other. On each test, the possible outcomes are
a (accept) and r (reject). Assume that each circuit is acceptable with probability 0.9
and that the outcomes of successive tests are independent. Count the number of
acceptable circuits X and count the number of successful tests Y before you observe the
first reject. (If both tests are successful, let Y = 2.)
(a) Find the joint PMF of X and Y.
(b) Find the joint PDT of X and Y.
Example 4.1 Solution

The sample space of the experiment is

S = {aa, ar, ra, rr}.

We compute

P [aa] = 0.81, P [ar] = P [ra] = 0.09, P [rr] = 0.01.

Each outcome specifies a pair of values of X and Y. Let g(s) be the function that
transforms each outcome s in the sample space S into the pair of random variables
(X, Y ). Then

g(aa) = (2, 2), g(ar) = (1, 1), g(ra) = (1, 0), g(rr) = (0, 0).
Example 4.1 Solution (continued)

For each pair of values x, y, PX,Y (x, y) is the sum of the probabilities of the outcomes
for which X = x and Y = y. For example, PX,Y (1, 1) = P [ar]. The joint PMF can be
given as a set of labeled points in the x, y plane where each point is a possible value
(probability > 0) of the pair (x, y), or as a simple list:

PX,Y (x, y) =
    0.81, x = 2, y = 2,
    0.09, x = 1, y = 1,
    0.09, x = 1, y = 0,
    0.01, x = 0, y = 0,
    0,    otherwise.
Example 4.1 Solution (continued)

A second representation of PX,Y (x, y) is the matrix:

PX,Y (x, y)   y = 0   y = 1   y = 2
x = 0         0.01    0       0
x = 1         0.09    0.09    0
x = 2         0       0       0.81

Note
All of the probabilities add up to 1. This reflects the second axiom of probability,
which states that P [S] = 1. Using the notation of random variables, we write this as

Σ_{x∈SX} Σ_{y∈SY} PX,Y (x, y) = 1. (4.6)
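The construction in Example 4.1 can be sketched directly: map each outcome through g and accumulate probabilities into a joint PMF, then verify the normalization (4.6). The outcome probabilities are taken from the example itself.

```python
# Sketch of Example 4.1: map each outcome s to (X, Y) = g(s) and accumulate the joint PMF.
outcomes = {"aa": 0.81, "ar": 0.09, "ra": 0.09, "rr": 0.01}
g = {"aa": (2, 2), "ar": (1, 1), "ra": (1, 0), "rr": (0, 0)}

P = {}
for s, prob in outcomes.items():
    xy = g[s]
    P[xy] = P.get(xy, 0.0) + prob

# Second axiom / Eq. (4.6): the joint PMF sums to 1 over S_{X,Y}.
total = sum(P.values())
```

The dictionary P plays the role of the PDT: its keys are exactly S_{X,Y}, the points of positive probability.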
4.2.2 Properties

Note
We represent an event B as a region in the X, Y plane. Figure 4.2 shows two examples
of events.
Theorem 4.3
For discrete random variables X and Y and any set B in the X, Y plane, the probability
of the event {(X, Y ) ∈ B} is

P [B] = Σ_{(x,y)∈B} PX,Y (x, y). (4.7)
Example 4.2
Continuing Example 4.1, find the probability of the event B that X, the number of
acceptable circuits, equals Y, the number of tests before observing the first failure.
Example 4.2 Solution

Mathematically, B is the event {X = Y }. The elements of B with nonzero probability are

B ∩ SX,Y = {(0, 0), (1, 1), (2, 2)}.

Therefore,

P [B] = PX,Y (0, 0) + PX,Y (1, 1) + PX,Y (2, 2) = 0.01 + 0.09 + 0.81 = 0.91.
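Theorem 4.3 turns this into a one-line computation: sum the joint PMF over the region B. A sketch using the PMF values of Example 4.1:

```python
# Joint PMF of Example 4.1, stored as {(x, y): probability}.
P = {(0, 0): 0.01, (1, 0): 0.09, (1, 1): 0.09, (2, 2): 0.81}

# Theorem 4.3 with B = {X = Y}: sum P_{X,Y}(x, y) over pairs in B.
p_B = sum(p for (x, y), p in P.items() if x == y)
```

Membership in B is just a predicate on (x, y), so any event region can be handled the same way by changing the condition in the filter.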
4.2.3 Marginal PMF

Theorem 4.4
For discrete random variables X and Y with joint PMF PX,Y (x, y),

PX (x) = Σ_{y∈SY} PX,Y (x, y),    PY (y) = Σ_{x∈SX} PX,Y (x, y). (4.8)
Remark 4.4. From Definition 4.3,

(a) Marginal PDT of X:
X   x1           x2           ...   xm
p   P [X = x1]   P [X = x2]   ...   P [X = xm]

(b) Marginal PDT of Y:
Y   y1           y2           ...   yn
p   P [Y = y1]   P [Y = y2]   ...   P [Y = yn]
Example 4.3
In Example 4.1, we found the joint PMF of X and Y to be

PX,Y (x, y)   y = 0   y = 1   y = 2
x = 0         0.01    0       0
x = 1         0.09    0.09    0
x = 2         0       0       0.81

(a) Find the marginal PMFs of the random variables X and Y.
(b) Find the marginal PDTs of the random variables X and Y.
Example 4.3 Solution

To find PX (x), we note that both X and Y have range {0, 1, 2}. Theorem 4.4 gives

PX (0) = Σ_{y=0}^{2} PX,Y (0, y) = 0.01,    PX (1) = Σ_{y=0}^{2} PX,Y (1, y) = 0.18,
PX (2) = Σ_{y=0}^{2} PX,Y (2, y) = 0.81,    PX (x) = 0, x ≠ 0, 1, 2.
Example 4.3 Solution (continued)

For the PMF of Y, we obtain

PY (0) = Σ_{x=0}^{2} PX,Y (x, 0) = 0.10,    PY (1) = Σ_{x=0}^{2} PX,Y (x, 1) = 0.09,
PY (2) = Σ_{x=0}^{2} PX,Y (x, 2) = 0.81,    PY (y) = 0, y ≠ 0, 1, 2.
Example 4.3 Solution (continued)

We display PX (x) and PY (y) by rewriting the matrix in Example 4.1 and placing the
row sums and column sums in the margins.

PX,Y (x, y)   y = 0   y = 1   y = 2   PX (x)
x = 0         0.01    0       0       0.01
x = 1         0.09    0.09    0       0.18
x = 2         0       0       0.81    0.81
PY (y)        0.10    0.09    0.81

PX (x) =
    0.01, x = 0,
    0.18, x = 1,
    0.81, x = 2,
    0,    otherwise.

PY (y) =
    0.10, y = 0,
    0.09, y = 1,
    0.81, y = 2,
    0,    otherwise.
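The row and column sums of Theorem 4.4 are easy to automate. A sketch over the joint PMF of Example 4.1:

```python
# Joint PMF of Example 4.1.
P = {(0, 0): 0.01, (1, 0): 0.09, (1, 1): 0.09, (2, 2): 0.81}

# Theorem 4.4: marginals are obtained by summing out the other variable.
P_X = {}
P_Y = {}
for (x, y), p in P.items():
    P_X[x] = P_X.get(x, 0.0) + p  # row sums
    P_Y[y] = P_Y.get(y, 0.0) + p  # column sums
```

Each marginal is itself a valid PMF, so both P_X and P_Y sum to 1.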
4.3 Joint Probability Density Function

4.3.1 Definitions

Definition 4.4 (Joint Probability Density Function – PDF)

The joint PDF of the continuous random variables X and Y is a function fX,Y (x, y)
with the property

FX,Y (x, y) = ∫_{−∞}^{x} ∫_{−∞}^{y} fX,Y (u, v) dv du. (4.8)

Theorem 4.5

fX,Y (x, y) = ∂²FX,Y (x, y) / ∂x∂y. (4.9)
4.3.2 Properties

Theorem 4.7
A joint PDF fX,Y (x, y) has the following properties, corresponding to the first and
second axioms of probability:
(a) fX,Y (x, y) ≥ 0 for all (x, y) ∈ R²,
(b) ∫_{−∞}^{+∞} ∫_{−∞}^{+∞} fX,Y (x, y) dx dy = 1.

Theorem 4.8
The probability that the continuous random variables (X, Y ) lie in A is

P [A] = ∫∫_A fX,Y (x, y) dx dy. (4.10)
Example 4.4
Random variables X and Y have joint PDF

fX,Y (x, y) =
    c, 0 ≤ x ≤ 5, 0 ≤ y ≤ 3,
    0, otherwise.

Find the constant c and P [A] = P [2 ≤ X < 3, 1 ≤ Y < 3].
Example 4.4 Solution

The large rectangle 0 ≤ x ≤ 5, 0 ≤ y ≤ 3 is the area of nonzero probability. By
Theorem 4.7(b), the integral of the joint PDF over this rectangle is 1:

1 = ∫_{0}^{5} ∫_{0}^{3} c dy dx = 15c.

Therefore, c = 1/15. The small rectangle A = {2 ≤ X < 3, 1 ≤ Y < 3} lies inside the
support, and P [A] is the integral of the PDF over it:

P [A] = ∫_{2}^{3} ∫_{1}^{3} (1/15) dv du = 2/15.
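Because the density in Example 4.4 is constant on its support, both integrals reduce to areas, which the following sketch exploits: c is the reciprocal of the support area, and P[A] is c times the area of A.

```python
# Sketch of Example 4.4: for a constant density on a rectangle,
# probabilities are proportional to area.
support_area = 5 * 3          # 0 <= x <= 5, 0 <= y <= 3
c = 1 / support_area          # Theorem 4.7(b): c * area = 1, so c = 1/15

A_area = (3 - 2) * (3 - 1)    # A = {2 <= X < 3, 1 <= Y < 3}
p_A = c * A_area              # Theorem 4.8: integral of c over A = 2/15
```

For non-constant densities this shortcut fails, and the double integral of Theorem 4.8 must be evaluated directly.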
4.3.3 Marginal PDF

Theorem 4.9
If X and Y are random variables with joint PDF fX,Y (x, y),

fX (x) = ∫_{−∞}^{+∞} fX,Y (x, y) dy,    fY (y) = ∫_{−∞}^{+∞} fX,Y (x, y) dx. (4.11)
Example 4.5
The joint PDF of X and Y is

fX,Y (x, y) =
    5y/4, −1 ≤ x ≤ 1, x² ≤ y ≤ 1,
    0,    otherwise.

Find the marginal PDFs fX (x) and fY (y).
Example 4.5 Solution

We use Theorem 4.9 to find the marginal PDF fX (x).
When x < −1 or x > 1, fX,Y (x, y) = 0 for all y, and therefore fX (x) = 0.
For −1 ≤ x ≤ 1,

fX (x) = ∫_{x²}^{1} (5y/4) dy = 5(1 − x⁴)/8.

The complete expression for the marginal PDF of X is

fX (x) =
    5(1 − x⁴)/8, −1 ≤ x ≤ 1,
    0,           otherwise.
Example 4.5 Solution (continued)

For the marginal PDF of Y, we note that for y < 0 or y > 1, fY (y) = 0.
For 0 ≤ y ≤ 1, we integrate over the horizontal bar Y = y. The boundaries of the bar
are x = −√y and x = √y. Therefore, for 0 ≤ y ≤ 1,

fY (y) = ∫_{−√y}^{√y} (5y/4) dx = (5y/4) x |_{x=−√y}^{x=√y} = 5y^{3/2}/2.

The complete marginal PDF of Y is

fY (y) =
    5y^{3/2}/2, 0 ≤ y ≤ 1,
    0,          otherwise.
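Both marginals of Example 4.5 can be checked numerically. The sketch below integrates the joint PDF with a simple midpoint rule and compares the result with the closed forms 5(1 − x⁴)/8 and 5y^{3/2}/2; the sample points x = 0.5 and y = 0.5 are arbitrary choices, not from the text.

```python
# Joint PDF of Example 4.5.
def f(x, y):
    return 5 * y / 4 if -1 <= x <= 1 and x * x <= y <= 1 else 0.0

def marginal_fX(x, n=10000):
    # Theorem 4.9: integrate out y over the bar [x^2, 1] (midpoint rule).
    lo, hi = x * x, 1.0
    h = (hi - lo) / n
    return sum(f(x, lo + (i + 0.5) * h) for i in range(n)) * h

def marginal_fY(y, n=10000):
    # Theorem 4.9: integrate out x over [-1, 1] (midpoint rule).
    lo, hi = -1.0, 1.0
    h = (hi - lo) / n
    return sum(f(lo + (i + 0.5) * h, y) for i in range(n)) * h

x0 = 0.5
approx = marginal_fX(x0)
exact = 5 * (1 - x0 ** 4) / 8
```

The midpoint rule is exact for the linear integrand in y, so `approx` matches `exact` up to floating-point error; the x-integral has a small boundary error from the indicator cutoff at ±√y.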
Example 4.6
The joint PDF of X and Y is

fX,Y (x, y) =
    kx, 0 < y < x < 1,
    0,  otherwise.

(a) Find the constant k.
(b) Find the marginal PDFs fX (x) and fY (y).
Example 4.6 Solution

Set D = {(x, y) ∈ R² : 0 < x < 1, 0 < y < x}.
(a) From Theorem 4.7, k ≥ 0 and

1 = ∫_{−∞}^{+∞} ∫_{−∞}^{+∞} fX,Y (x, y) dx dy = ∫_{0}^{1} dx ∫_{0}^{x} kx dy = k ∫_{0}^{1} x² dx = k/3.

Hence k = 3 and

fX,Y (x, y) =
    3x, (x, y) ∈ D,
    0,  otherwise.
Example 4.6 Solution (continued)

(b) From Theorem 4.9,

fX (x) = ∫_{−∞}^{+∞} fX,Y (x, y) dy = ∫_{0}^{x} 3x dy = 3x² for 0 < x < 1, and fX (x) = 0 otherwise;

fY (y) = ∫_{−∞}^{+∞} fX,Y (x, y) dx = ∫_{y}^{1} 3x dx = (3/2)(1 − y²) for 0 < y < 1, and fY (y) = 0 otherwise.
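The normalization and one marginal of Example 4.6 can be verified with a midpoint rule. The inner integral ∫_0^x x dy equals x², so the normalization reduces to k ∫_0^1 x² dx = k/3; the sample point y = 0.5 used for the fY check is an arbitrary choice.

```python
# Sketch of Example 4.6(a): the inner integral of the (unnormalized) density is
# inner(x) = ∫_0^x x dy = x^2, so 1 = k * ∫_0^1 x^2 dx = k/3 and k = 3.
def inner(x):
    return x * x

n = 100000
h = 1.0 / n
integral = sum(inner((i + 0.5) * h) for i in range(n)) * h  # ≈ 1/3
k = 1 / integral                                            # ≈ 3

# Part (b) check at one point: fY(y) = ∫_y^1 3x dx = (3/2)(1 - y^2).
y0 = 0.5
m = 10000
hy = (1 - y0) / m
fY_num = sum(3 * (y0 + (i + 0.5) * hy) for i in range(m)) * hy
fY_exact = 1.5 * (1 - y0 ** 2)
```

Since the midpoint rule is exact for linear integrands, the fY check agrees to floating-point precision; the x² integral carries only an O(h²) error.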
4.4 Expected Values

4.4.1 Expected Values

Theorem 4.10
For random variables X and Y, the expected value of W = g(X, Y ) is

Discrete: E[W ] = Σ_{x∈SX} Σ_{y∈SY} g(x, y) PX,Y (x, y),
or E[W ] = Σ_i Σ_j g(xi, yj) P [X = xi, Y = yj];

Continuous: E[W ] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y) fX,Y (x, y) dx dy.
Theorem 4.11

E[g1 (X, Y ) + . . . + gn (X, Y )] = E[g1 (X, Y )] + . . . + E[gn (X, Y )].

Theorem 4.12
For any two random variables X and Y,

E[X + Y ] = E[X] + E[Y ].

Theorem 4.13
The variance of the sum of two random variables is

Var[X + Y ] = Var[X] + Var[Y ] + 2E[(X − µX)(Y − µY)].
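Theorem 4.13 can be checked exactly on a discrete pair using Theorem 4.10's expectation formula. The sketch below reuses the joint PMF of Example 4.1 and evaluates both sides of the identity.

```python
# Joint PMF of Example 4.1.
P = {(0, 0): 0.01, (1, 0): 0.09, (1, 1): 0.09, (2, 2): 0.81}

# Theorem 4.10 (discrete case): E[g(X, Y)] = sum over S_{X,Y} of g(x, y) * P(x, y).
E = lambda g: sum(g(x, y) * p for (x, y), p in P.items())

mx, my = E(lambda x, y: x), E(lambda x, y: y)
var_x = E(lambda x, y: (x - mx) ** 2)
var_y = E(lambda x, y: (y - my) ** 2)
cov = E(lambda x, y: (x - mx) * (y - my))

# Left-hand side of Theorem 4.13, computed directly from the PMF of X + Y.
m_s = E(lambda x, y: x + y)
var_sum = E(lambda x, y: (x + y - m_s) ** 2)
```

Both sides agree up to floating-point rounding, which also illustrates Theorem 4.12 (m_s equals mx + my).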
4.4.2 Covariance

(a) Covariance

Definition 4.5 (Covariance)

The covariance of two random variables X and Y is

Cov[X, Y ] = E[(X − E[X])(Y − E[Y ])].

Remark 4.6
From the properties of the expected value,

Cov[X, Y ] = E[XY ] − E[X]E[Y ], (4.12)

where

E[XY ] = Σ_i Σ_j xi yj pij (discrete random variables),
E[XY ] = ∫_{−∞}^{+∞} ∫_{−∞}^{+∞} xy fX,Y (x, y) dx dy (continuous random variables).
Properties

Theorem 4.14
(a) Cov[X, Y ] = E[XY ] − E[X]E[Y ] = rX,Y − µX µY.
(b) Var[X + Y ] = Var[X] + Var[Y ] + 2Cov[X, Y ].
(c) Var[X] = Cov[X, X], Var[Y ] = Cov[Y, Y ].
4.4 Expected Values 4.4.2 Covariance

(b) Covariance Matrix

Definition 4.6 (Covariance Matrix)


The covariance matrix of two random variables X and Y is
   
Cov(X, X) Cov(X, Y ) V ar(X) Cov(X, Y )
Γ= = (4.13)
Cov(Y, X) Cov(Y, Y ) Cov(X, Y ) V ar(Y )


(c) Correlation

Definition 4.7 (Correlation)


The correlation of X and Y is r_{X,Y} = E[XY], where

E[XY] = Σ_{x∈S_X} Σ_{y∈S_Y} x y P_{X,Y}(x, y)   (discrete random variables),

E[XY] = ∫_{−∞}^{+∞} ∫_{−∞}^{+∞} x y f_{X,Y}(x, y) dx dy   (continuous random variables).


Examples

Example 4.7
For the integrated circuits tests in Example 4.1, we found in Example 4.3 that the
probability model for X and Y is given by the following matrix.

P_{X,Y}(x, y)   y = 0   y = 1   y = 2   P_X(x)
x = 0            0.01     0       0      0.01
x = 1            0.09    0.09     0      0.18
x = 2             0       0      0.81    0.81
P_Y(y)           0.10    0.09    0.81

Find rX,Y and Cov[X, Y ].

Nguyễn Thị Thu Thủy (SAMI-HUST) Pairs of Random Variables HANOI – 2019 41 / 74
4.4 Expected Values 4.4.2 Covariance

Examples

Example 4.7 Solution


By Definition 4.7,

r_{X,Y} = E[XY] = Σ_{x=0}^{2} Σ_{y=0}^{2} x y P_{X,Y}(x, y) = 1 × 1 × 0.09 + 2 × 2 × 0.81 = 3.33.

To use Theorem 4.14(a) to find the covariance, we find

E[X] = 1 × 0.18 + 2 × 0.81 = 1.80


E[Y ] = 1 × 0.09 + 2 × 0.81 = 1.71.

Therefore, by Theorem 4.14(a), Cov[X, Y ] = 3.33 − 1.80 × 1.71 = 0.252.

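These moment calculations are easy to mechanize. Below is a minimal Python sketch; the `pmf` dictionary and the helper `expect` are ad-hoc names introduced here, and the dictionary simply transcribes the nonzero entries of Example 4.7's table:

```python
# Joint PMF of Example 4.7, nonzero entries only, as {(x, y): probability}.
pmf = {(0, 0): 0.01, (1, 0): 0.09, (1, 1): 0.09, (2, 2): 0.81}

def expect(g, pmf):
    """Expected value of g(x, y) under a joint PMF."""
    return sum(g(x, y) * p for (x, y), p in pmf.items())

r_xy = expect(lambda x, y: x * y, pmf)   # correlation r_{X,Y} = E[XY]
mu_x = expect(lambda x, y: x, pmf)       # E[X]
mu_y = expect(lambda x, y: y, pmf)       # E[Y]
cov = r_xy - mu_x * mu_y                 # Theorem 4.14(a)

print(r_xy, cov)  # 3.33 and 0.252, up to floating-point rounding
```

Running it reproduces r_{X,Y} = 3.33 and Cov[X, Y] = 0.252 from the solution above.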

Example 4.8
The joint probability distribution table (PDT) of X and Y is

X \ Y    −1      0      1
 −1     4/15   1/15   4/15
  0     1/15   2/15   1/15
  1      0     2/15    0

(a) Find the marginal PDTs of X and Y.
(b) Find E[X], E[Y], Cov[X, Y].


Example 4.8 Solution


(a) The marginal PDTs of X and Y are

X    −1     0     1          Y    −1     0     1
P   9/15  4/15  2/15         P   5/15  5/15  5/15


Example 4.8 Solution (continued)

(b) We have

E[X] = (−1) × 9/15 + 0 × 4/15 + 1 × 2/15 = −7/15,
E[Y] = (−1) × 5/15 + 0 × 5/15 + 1 × 5/15 = 0,
E[XY] = (−1)(−1) × 4/15 + (−1)(1) × 4/15 + (1)(−1) × 0 + (1)(1) × 0 = 0.

Hence Cov[X, Y] = E[XY] − E[X] × E[Y] = 0.


(d) Uncorrelated Random Variables

Definition 4.6 (Orthogonal Random Variables)


Random variables X and Y are orthogonal if rX,Y = 0.

Definition 4.7 (Uncorrelated Random Variables)


Random variables X and Y are uncorrelated if Cov[X, Y ] = 0.

4.4 Expected Values 4.4.3 Correlation Coefficient

4.4.3 Correlation Coefficient

Definition 4.8 (Correlation Coefficient)


The correlation coefficient of two random variables X and Y is

ρ_{X,Y} = Cov[X, Y] / √(Var[X] Var[Y]) = Cov[X, Y] / (σ_X σ_Y).   (4.14)

Theorem 4.17

−1 ≤ ρX,Y ≤ 1.

Theorem 4.18
If X and Y are random variables such that Y = aX + b, then

ρ_{X,Y} = −1 if a < 0;   ρ_{X,Y} = 0 if a = 0;   ρ_{X,Y} = 1 if a > 0.

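Theorem 4.18 can be checked numerically from exact moments. A sketch, assuming an arbitrary non-degenerate PMF for X; the names `px` and `rho_linear` are introduced here for illustration:

```python
import math

# An arbitrary PMF for X; any non-degenerate choice works.
px = {-1: 0.2, 0: 0.5, 2: 0.3}

def rho_linear(a, b):
    """Correlation coefficient of X and Y = aX + b, computed from exact moments."""
    ex = sum(x * p for x, p in px.items())
    var_x = sum((x - ex) ** 2 * p for x, p in px.items())
    cov = a * var_x                      # Cov[X, aX + b] = a Var[X]
    sigma_x = math.sqrt(var_x)
    sigma_y = abs(a) * sigma_x           # sigma of aX + b is |a| sigma_X
    return cov / (sigma_x * sigma_y)

print(rho_linear(-2.0, 5.0), rho_linear(3.0, -1.0))  # -1.0 and 1.0, up to rounding
```

Note the a = 0 case cannot be checked this way, since σ_Y = 0 makes (4.14) undefined.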
4.5 Conditioning by an Event 4.5.1 Conditional Joint PMF

4.5.1 Conditional Joint PMF

Definition 4.9 (Conditional Joint PMF)


For discrete random variables X and Y and an event B with P[B] > 0, the conditional joint PMF of X and Y given B is

P_{X,Y|B}(x, y) = P[X = x, Y = y|B].   (4.15)

Theorem 4.19
For any event B, a region of the X, Y plane with P[B] > 0,

P_{X,Y|B}(x, y) = P_{X,Y}(x, y) / P[B] for (x, y) ∈ B, and 0 otherwise.   (4.16)

4.5 Conditioning by an Event 4.5.2 Conditional Joint PDF

4.5.2 Conditional Joint PDF

Definition 4.10 (Conditional Joint PDF)


Given an event B with P [B] > 0, the conditional joint probability density function of X
and Y is

f_{X,Y|B}(x, y) = f_{X,Y}(x, y) / P[B] for (x, y) ∈ B, and 0 otherwise.   (4.17)


Example 4.9
X and Y are random variables with joint PDF

1, 0 ≤ x ≤ 5, 0 ≤ y ≤ 3,
fX,Y (x, y) = 15
0, otherwise.

Find the conditional PDF of X and Y given the event B = {X + Y ≥ 4}.


Example 4.9 Solution


We calculate P[B] by integrating f_{X,Y}(x, y) over the region B:

P[B] = ∫_0^3 ∫_{4−y}^5 (1/15) dx dy = (1/15) ∫_0^3 (1 + y) dy = 1/2.

Definition 4.10 leads to the conditional joint PDF

f_{X,Y|B}(x, y) = 2/15 for 0 ≤ x ≤ 5, 0 ≤ y ≤ 3, x + y ≥ 4, and 0 otherwise.

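The value P[B] = 1/2 can be sanity-checked with a crude midpoint Riemann sum over the rectangle; the function `prob_B` is an ad-hoc illustration, not part of the original example:

```python
def prob_B(n=400):
    """Riemann-sum estimate of P[X + Y >= 4] for the uniform PDF 1/15 on [0,5]x[0,3]."""
    dx, dy = 5.0 / n, 3.0 / n
    total = 0.0
    for i in range(n):
        for j in range(n):
            x = (i + 0.5) * dx          # midpoint of the grid cell
            y = (j + 0.5) * dy
            if x + y >= 4:              # cell midpoint lies in B
                total += (1.0 / 15.0) * dx * dy
    return total

print(prob_B())  # close to 0.5
```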
4.5 Conditioning by an Event 4.5.3 Conditional Expected Value

4.5.3 Conditional Expected Value

Theorem 4.20 (Conditional Expected Value)


For random variables X and Y and an event B of nonzero probability, the conditional
expected value of W = g(X, Y ) given B is
Discrete:    E[W|B] = Σ_{x∈S_X} Σ_{y∈S_Y} g(x, y) P_{X,Y|B}(x, y),

Continuous:  E[W|B] = ∫_{−∞}^{+∞} ∫_{−∞}^{+∞} g(x, y) f_{X,Y|B}(x, y) dx dy.


Example 4.10
Continuing Example 4.9, find the conditional expected value of W = XY given the
event B = {X + Y ≥ 4}.

Example 4.10 Solution


From Theorem 4.20,

E[XY|B] = ∫_0^3 ∫_{4−y}^5 (2/15) x y dx dy = (1/15) ∫_0^3 [x² y]_{x=4−y}^{x=5} dy
        = (1/15) ∫_0^3 (9y + 8y² − y³) dy = 123/20.

4.5 Conditioning by an Event 4.5.4 Conditional Variance

4.5.4 Conditional Variance

Definition 4.11 (Conditional Variance)


The conditional variance of the random variable W = g(X, Y) is

Var[W|B] = E[(W − E[W|B])² | B].

Theorem 4.21

Var[W|B] = E[W²|B] − (E[W|B])².

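Continuing Example 4.10, Theorem 4.21 yields Var[XY | B] once E[(XY)² | B] is known. A numerical sketch using midpoint Riemann sums over the region B, with the conditional density 2/15 found in Example 4.9; `cond_moment` is an ad-hoc name:

```python
def cond_moment(k, n=400):
    """Midpoint Riemann-sum estimate of E[(XY)^k | B] under f_{X,Y|B} = 2/15 on B."""
    dx, dy = 5.0 / n, 3.0 / n
    total = 0.0
    for i in range(n):
        for j in range(n):
            x, y = (i + 0.5) * dx, (j + 0.5) * dy  # cell midpoint in [0,5]x[0,3]
            if x + y >= 4:                         # restrict to the region B
                total += (x * y) ** k * (2.0 / 15.0) * dx * dy
    return total

m1 = cond_moment(1)               # approximates E[XY|B] = 123/20 = 6.15
var_w = cond_moment(2) - m1 ** 2  # Theorem 4.21: Var[W|B] = E[W^2|B] - (E[W|B])^2
print(m1, var_w)
```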
4.6 Conditioning by a Random Variable 4.6.1 Conditional PMF

4.6.1 Conditional PMF

Definition 4.12 (Conditional PMF)


For any event Y = y such that PY (y) > 0, the conditional PMF of X given Y = y is

PX|Y (x|y) = P [X = x|Y = y]. (4.18)

Theorem 4.22
For random variables X and Y with joint PMF P_{X,Y}(x, y), and x and y such that P_X(x) > 0 and P_Y(y) > 0,

P_{X,Y}(x, y) = P_{X|Y}(x|y) P_Y(y) = P_{Y|X}(y|x) P_X(x).

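Theorem 4.22 is the multiplication rule for PMFs. A sketch that verifies it on the table of Example 4.7; the variable names are ad-hoc:

```python
from collections import defaultdict

# Joint PMF of Example 4.7, transcribed as {(x, y): probability}.
pmf = {(0, 0): 0.01, (1, 0): 0.09, (1, 1): 0.09, (2, 2): 0.81}

# Marginal PMF of Y.
py = defaultdict(float)
for (x, y), p in pmf.items():
    py[y] += p

# Conditional PMF P_{X|Y}(x|y) = P_{X,Y}(x, y) / P_Y(y).
cond = {(x, y): p / py[y] for (x, y), p in pmf.items()}

# Each conditional PMF must sum to 1 over x, and the factorization
# P_{X,Y}(x, y) = P_{X|Y}(x|y) P_Y(y) must reproduce the joint PMF.
sums = defaultdict(float)
for (x, y), q in cond.items():
    sums[y] += q
ok = (all(abs(s - 1.0) < 1e-12 for s in sums.values())
      and all(abs(cond[(x, y)] * py[y] - p) < 1e-12 for (x, y), p in pmf.items()))
print(ok)  # True
```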
4.6 Conditioning by a Random Variable 4.6.2 Conditional Expected Value of a Function

4.6.2 Conditional Expected Value of a Function

Theorem 4.23 (Conditional Expected Value of a Function)


X and Y are discrete random variables. For any y ∈ S_Y, the conditional expected value of g(X, Y) given Y = y is

E[g(X, Y)|Y = y] = Σ_{x∈S_X} g(x, y) P_{X|Y}(x|y).

Note
The conditional expected value of X given Y = y is a special case of Theorem 4.23:

E[X|Y = y] = Σ_{x∈S_X} x P_{X|Y}(x|y).

4.6 Conditioning by a Random Variable 4.6.3 Conditional PDF

4.6.3 Conditional PDF

Definition 4.13 (Conditional PDF)


For y such that f_Y(y) > 0, the conditional PDF of X given {Y = y} is

f_{X|Y}(x|y) = f_{X,Y}(x, y) / f_Y(y).   (4.19)

Note
Definition 4.13 implies

f_{Y|X}(y|x) = f_{X,Y}(x, y) / f_X(x).   (4.20)


Example 4.11
Random variables X and Y have joint PDF
f_{X,Y}(x, y) = 2 for 0 < y < x < 1, and 0 otherwise.

For 0 < x < 1, find the conditional PDF fY |X (y|x). For 0 < y < 1, find the conditional
PDF fX|Y (x|y).


Example 4.11 Solution


For 0 < x < 1, Theorem 4.8 implies

f_X(x) = ∫_{−∞}^{+∞} f_{X,Y}(x, y) dy = ∫_0^x 2 dy = 2x.

The conditional PDF of Y given X is

f_{Y|X}(y|x) = f_{X,Y}(x, y) / f_X(x) = 1/x for 0 < y < x, and 0 otherwise.


Example 4.11 Solution (continued)

Given X = x, we see that Y is the uniform (0, x) random variable. For 0 < y < 1, Theorem 4.8 implies

f_Y(y) = ∫_{−∞}^{+∞} f_{X,Y}(x, y) dx = ∫_y^1 2 dx = 2(1 − y).

Furthermore,

f_{X|Y}(x|y) = f_{X,Y}(x, y) / f_Y(y) = 1/(1 − y) for y < x < 1, and 0 otherwise.

Conditioned on Y = y, we see that X is the uniform (y, 1) random variable.

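A quick numeric sanity check of Example 4.11: each conditional PDF must integrate to 1 over its support. A sketch with an ad-hoc midpoint integrator; `integrate`, `x0`, and `y0` are illustrative names:

```python
def integrate(f, a, b, n=10000):
    """Midpoint-rule integral of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

x0, y0 = 0.4, 0.4
mass_y_given_x = integrate(lambda y: 1.0 / x0, 0.0, x0)        # f_{Y|X}(y|x0) = 1/x0 on (0, x0)
mass_x_given_y = integrate(lambda x: 1.0 / (1 - y0), y0, 1.0)  # f_{X|Y}(x|y0) = 1/(1-y0) on (y0, 1)
print(mass_y_given_x, mass_x_given_y)  # both close to 1.0
```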

Theorem 4.24

f_{X,Y}(x, y) = f_{Y|X}(y|x) f_X(x) = f_{X|Y}(x|y) f_Y(y).

4.6 Conditioning by a Random Variable 4.6.4 Conditional Expected Value of a Function

4.6.4 Conditional Expected Value of a Function

Definition 4.14 (Conditional Expected Value of a Function)


For continuous random variables X and Y and any y such that f_Y(y) > 0, the conditional expected value of g(X, Y) given Y = y is

E[g(X, Y)|Y = y] = ∫_{−∞}^{+∞} g(x, y) f_{X|Y}(x|y) dx.

Note
The conditional expected value of X given Y = y is a special case of Definition 4.14:

E[X|Y = y] = ∫_{−∞}^{+∞} x f_{X|Y}(x|y) dx.

4.6 Conditioning by a Random Variable 4.6.5 Conditional Expected Value

4.6.5 Conditional Expected Value

Definition 4.15 (Conditional Expected Value)


The conditional expected value E[X|Y ] is a function of random variable Y such that if
Y = y then E[X|Y ] = E[X|Y = y].

Example 4.12
For random variables X and Y in Example 4.11, we found that the conditional PDF of X given Y is

f_{X|Y}(x|y) = f_{X,Y}(x, y) / f_Y(y) = 1/(1 − y) for y < x < 1, and 0 otherwise.

Find the conditional expected values E[X|Y = y] and E[X|Y].


Example 4.12 Solution


Given the conditional PDF f_{X|Y}(x|y), we perform the integration

E[X|Y = y] = ∫_{−∞}^{+∞} x f_{X|Y}(x|y) dx = ∫_y^1 x/(1 − y) dx
           = [x²/(2(1 − y))]_{x=y}^{x=1} = (1 + y)/2.

Since E[X|Y = y] = (1 + y)/2, E[X|Y] = (1 + Y)/2.

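The closed form E[X|Y = y] = (1 + y)/2 can be checked numerically; `cond_mean` is an ad-hoc sketch, not part of the original example:

```python
def cond_mean(y, n=20000):
    """Midpoint-rule estimate of E[X | Y = y] = integral of x/(1-y) over [y, 1]."""
    h = (1.0 - y) / n
    return sum((y + (i + 0.5) * h) / (1.0 - y) for i in range(n)) * h

for y in (0.0, 0.25, 0.5, 0.9):
    assert abs(cond_mean(y) - (1 + y) / 2) < 1e-6  # matches the closed form (1 + y)/2
print("E[X|Y=y] = (1+y)/2 confirmed on test points")
```

The midpoint rule is exact for the linear integrand here, so the agreement is essentially to machine precision.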
4.7 Independent Random Variables 4.7.1 Definitions

4.7.1 Definitions

Definition 4.16 (Independent Random Variables)


Random variables X and Y are independent if and only if

Discrete:    P_{X,Y}(x, y) = P_X(x) P_Y(y),
Continuous:  f_{X,Y}(x, y) = f_X(x) f_Y(y).

Note
Theorem 4.22 implies that if X and Y are independent discrete random variables, then

P_{X|Y}(x|y) = P_X(x),   P_{Y|X}(y|x) = P_Y(y).

Theorem 4.24 implies that if X and Y are independent continuous random variables, then

f_{X|Y}(x|y) = f_X(x),   f_{Y|X}(y|x) = f_Y(y).


Example 4.13
f_{X,Y}(x, y) = 4xy for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, and 0 otherwise.
Are X and Y independent?

Example 4.13 Solution


The marginal PDFs of X and Y are

f_X(x) = 2x for 0 ≤ x ≤ 1 (0 otherwise),   f_Y(y) = 2y for 0 ≤ y ≤ 1 (0 otherwise).

It is easily verified that fX,Y (x, y) = fX (x)fY (y) for all pairs (x, y) and so we conclude
that X and Y are independent.

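The marginal PDFs in Example 4.13 can be recovered by numerical integration of the joint PDF; the sketch below uses ad-hoc names (`f_joint`, `marginal_x`):

```python
def f_joint(x, y):
    """Joint PDF of Example 4.13."""
    return 4 * x * y if 0 <= x <= 1 and 0 <= y <= 1 else 0.0

def marginal_x(x, n=20000):
    """f_X(x) = integral over y of f_joint(x, y), by the midpoint rule on [0, 1]."""
    h = 1.0 / n
    return sum(f_joint(x, (j + 0.5) * h) for j in range(n)) * h

for x in (0.2, 0.5, 0.8):
    assert abs(marginal_x(x) - 2 * x) < 1e-6  # marginal f_X(x) = 2x
print("f_X(x) = 2x confirmed; by symmetry f_Y(y) = 2y")
```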
4.7 Independent Random Variables 4.7.2 Properties

4.7.2 Properties

Theorem 4.27
For independent random variables X and Y,
(a) E[g(X)h(Y)] = E[g(X)]E[h(Y)],
(b) r_{X,Y} = E[XY] = E[X]E[Y],
(c) Cov[X, Y] = ρ_{X,Y} = 0,
(d) Var[X + Y] = Var[X] + Var[Y],
(e) E[X|Y = y] = E[X] for all y ∈ S_Y,
(f) E[Y|X = x] = E[Y] for all x ∈ S_X.


Example 4.14
Random variables X and Y have a joint PMF given by the following matrix

P_{X,Y}(x, y)   y = −1   y = 0   y = 1
x = −1             0      0.25     0
x = 1            0.25     0.25    0.25

Are X and Y independent? Are X and Y uncorrelated?


Example 4.14 Solution


For the marginal PMFs, we have P_X(−1) = 0.25 and P_Y(−1) = 0.25. Thus

P_X(−1) P_Y(−1) = 0.0625 ≠ P_{X,Y}(−1, −1) = 0,

and we conclude that X and Y are not independent.

To find Cov[X, Y], we calculate

E[X] = 0.5,   E[Y] = 0,   E[XY] = 0.

Therefore, Theorem 4.14(a) implies

Cov[X, Y] = E[XY] − E[X]E[Y] = 0, and hence ρ_{X,Y} = 0,

and by definition X and Y are uncorrelated.

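Example 4.14 is the standard cautionary pair: uncorrelated yet dependent. A sketch that recomputes both facts; the variable names are ad-hoc:

```python
from collections import defaultdict

# Joint PMF of Example 4.14, nonzero entries only.
pmf = {(-1, 0): 0.25, (1, -1): 0.25, (1, 0): 0.25, (1, 1): 0.25}

px, py = defaultdict(float), defaultdict(float)
for (x, y), p in pmf.items():
    px[x] += p
    py[y] += p

def expect(g):
    return sum(g(x, y) * p for (x, y), p in pmf.items())

cov = expect(lambda x, y: x * y) - expect(lambda x, y: x) * expect(lambda x, y: y)

# Independence would require P_{X,Y}(x, y) = P_X(x) P_Y(y) for every pair.
independent = all(abs(pmf.get((x, y), 0.0) - px[x] * py[y]) < 1e-12
                  for x in px for y in py)
print(cov, independent)  # covariance 0, yet the pair is not independent
```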

Practice Test

Question 4.1
A code word contains 5 digits, each either 0 or 1; to be valid, the word must contain exactly three 1's and two 0's. One word is selected at random from the valid code words. Define X to be the first (leftmost) digit and let Y be the second digit in the word selected.
(a) Define the joint PMF and PDT for (X, Y).
(b) Find the marginal probabilities of X and Y.
(c) Are X and Y independent?
(d) Find E[X], E[Y], Var[X], Var[Y].


Question 4.2
A certain drugstore has three public telephone booths. The probability that a booth is occupied at a given time is identical for each booth and is denoted by p, which is equal to 0.2. Let X and Y denote the numbers of booths that are occupied at two independent times.
(a) Define the joint PMF and PDT for (X, Y).
(b) Find P[X = Y].
(c) Find P[X > Y].


Question 4.3
Suppose that in a certain drug, the concentration of a particular chemical is a random variable with a continuous distribution whose PDF g is as follows:

g(x) = (3/8)x² for 0 ≤ x ≤ 2, and 0 otherwise.

Suppose the concentrations X and Y of the chemical in two separate batches of the drug are independent random variables, each with PDF g. Determine
(a) the joint PDF of X and Y;
(b) P[X > Y].


Question 4.4
Suppose that two persons make an appointment to meet between 5 pm and 6 pm at a certain location, and they agree that neither person will wait more than 10 minutes for the other. If they arrive independently at random times between 5 pm and 6 pm, what is the probability that they meet?
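Before working Question 4.4 analytically, the answer can be estimated with a short Monte Carlo simulation; this is an illustrative sketch added here, not part of the exercise, and `meet_probability` is an ad-hoc name:

```python
import random

def meet_probability(trials=200_000, wait=10.0):
    """Monte Carlo estimate of P(|X - Y| <= wait) for X, Y independent uniform on [0, 60] minutes."""
    random.seed(0)  # fixed seed for reproducibility
    hits = sum(abs(random.uniform(0, 60) - random.uniform(0, 60)) <= wait
               for _ in range(trials))
    return hits / trials

print(meet_probability())  # compare with the exact answer from the geometric argument
```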


Question 4.5
Suppose that a person's score X on a mathematics aptitude test is a number between 0 and 1, and that his score Y on a music aptitude test is a number between 0 and 1. Suppose further that in the population of college students in the United States, the scores X and Y are distributed according to the following joint PDF:

f_{X,Y}(x, y) = k(2x + 3y) for 0 < x < 1, 0 < y < 1, and 0 otherwise.

(a) Find k.
(b) What proportion of college students obtain a score greater than 0.8 on the mathematics test?
(c) If a student's score on the music test is 0.3, what is the probability that his score on the mathematics test will be greater than 0.8?
(d) If a student's score on the mathematics test is 0.3, what is the probability that his score on the music test will be greater than 0.8?
