
MULTIPLE RANDOM VARIABLES

Joint CDF
Joint PMF / PDF
Joint MGF / CF
Two-Dimensional Random Variables

 So far we have considered only one-dimensional random variables.


 In many situations, we are interested in recording two or more
outcomes of a random experiment. For example, both voltage and
current might be of interest in a certain experiment.

Definition: Let Ω be a sample space associated with a random


experiment 𝐸. Let 𝑋 = 𝑋(ω) and 𝑌 = 𝑌(ω) be two functions each
assigning a real number to each outcome ω ∈ Ω. Then (𝑋, 𝑌) is called
a two-dimensional random variable.
Types of two-dimensional random variables
 If the possible values of (𝑋, 𝑌) are finite or countably infinite, then
(𝑋, 𝑌) is called a two-dimensional discrete random variable. When
(𝑋, 𝑌) is a two-dimensional discrete random variable, its possible
values may be represented as (𝑥𝑖, 𝑦𝑗), 𝑖 = 1, 2, …, 𝑚; 𝑗 = 1, 2, …, 𝑛.

 If (𝑋, 𝑌) can assume all values in a specified region 𝑅 of the 𝑥𝑦-plane,
then (𝑋, 𝑌) is called a two-dimensional continuous random variable.
Example of a two-dimensional random variable
Example: two fair dice are rolled; let 𝑋 denote the number shown on the first
die and 𝑌 the number shown on the second. Then (𝑋, 𝑌) is a two-dimensional
discrete random variable taking the 36 values (𝑖, 𝑗), 𝑖, 𝑗 = 1, 2, …, 6.
Joint Cumulative Distribution Function (Joint CDF)

For a random variable 𝑋, we define the CDF as 𝐹𝑋(𝑥) = 𝑃(𝑋 ≤ 𝑥).


Now, if we have two random variables 𝑋 and 𝑌 and we would like
to study them jointly, we can define the joint cumulative function as
follows:
The joint cumulative distribution function of two random
variables 𝑋 and 𝑌 is defined as 𝐹𝑋𝑌(𝑥, 𝑦) = 𝑃(𝑋 ≤ 𝑥, 𝑌 ≤ 𝑦).
As usual, the comma means "and," so we can write
𝐹𝑋𝑌(𝑥, 𝑦) = 𝑃(𝑋 ≤ 𝑥, 𝑌 ≤ 𝑦) = 𝑃((𝑋 ≤ 𝑥) and (𝑌 ≤ 𝑦))
= 𝑃((𝑋 ≤ 𝑥) ∩ (𝑌 ≤ 𝑦)).
Note that the above definition of joint CDF is a general definition
and is applicable to discrete and continuous random variables.
Since the joint CDF refers to the probability of an event, we must
have 0 ≤ 𝐹𝑋𝑌(𝑥, 𝑦) ≤ 1.
Note that,
1. 𝐹𝑋𝑌(+∞, +∞) = 1.
2. 𝐹𝑋𝑌(−∞, −∞) = 0.
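The defining sum behind the joint CDF of a discrete pair can be checked numerically. A minimal Python sketch (the two-coin joint distribution below is an assumed illustration, not from the notes):

```python
# Assumed example: toss two fair coins; X = 1 if the first toss is heads,
# Y = 1 if the second toss is heads. Each of the 4 outcomes has probability 1/4.
joint_pmf = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}

def joint_cdf(x, y):
    """F_XY(x, y) = P(X <= x, Y <= y): sum the joint PMF over the quadrant."""
    return sum(p for (xi, yi), p in joint_pmf.items() if xi <= x and yi <= y)

print(joint_cdf(0, 0))    # P(X<=0, Y<=0) = 0.25
print(joint_cdf(1, 0))    # 0.5
print(joint_cdf(1, 1))    # plays the role of F_XY(+inf, +inf) = 1.0
print(joint_cdf(-1, -1))  # plays the role of F_XY(-inf, -inf) = 0
```

Note how the two boundary properties above (value 1 at (+∞, +∞), value 0 at (−∞, −∞)) fall out of the summation automatically.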
Marginal Distributions:
Marginal Cumulative Distribution Function (Marginal CDF)
When 𝑋 and 𝑌 are jointly distributed random variables, we may want to
consider only one of them, say 𝑋. In that case we need to find the CDF (or
PMF or PDF) of 𝑋 without 𝑌. This is called a marginal CDF (or PMF or PDF).
How do we find the Marginal CDF?

If we know the joint CDF of 𝑋 and 𝑌, then we can find the marginal CDFs,
𝐹𝑋 𝑥 and 𝐹𝑌 𝑦 .
For any 𝑥 ∈ ℝ, we have 𝐹𝑋𝑌(𝑥, ∞) = 𝑃(𝑋 ≤ 𝑥, 𝑌 ≤ ∞) = 𝑃(𝑋 ≤ 𝑥) = 𝐹𝑋(𝑥).
Here, 𝐹𝑋𝑌(𝑥, ∞) = lim_{𝑦→∞} 𝐹𝑋𝑌(𝑥, 𝑦).
Similarly, for any 𝑦 ∈ ℝ, 𝐹𝑋𝑌(∞, 𝑦) = 𝑃(𝑋 ≤ ∞, 𝑌 ≤ 𝑦) = 𝑃(𝑌 ≤ 𝑦) = 𝐹𝑌(𝑦).
Here, 𝐹𝑋𝑌(∞, 𝑦) = lim_{𝑥→∞} 𝐹𝑋𝑌(𝑥, 𝑦).
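The limit 𝐹𝑋(𝑥) = lim_{𝑦→∞} 𝐹𝑋𝑌(𝑥, 𝑦) can be illustrated numerically. A sketch under an assumed joint CDF (two independent Exponential(1) variables, chosen only for illustration):

```python
import math

# Assumed joint CDF: for two independent Exponential(1) variables,
# F_XY(x, y) = (1 - e^-x)(1 - e^-y) for x, y >= 0, and 0 otherwise.
def joint_cdf(x, y):
    if x < 0 or y < 0:
        return 0.0
    return (1 - math.exp(-x)) * (1 - math.exp(-y))

def marginal_cdf_x(x, big=1e9):
    # Approximate the limit y -> infinity by plugging in a very large y.
    return joint_cdf(x, big)

# For Exponential(1), the exact marginal is F_X(x) = 1 - e^-x.
print(abs(marginal_cdf_x(2.0) - (1 - math.exp(-2.0))) < 1e-12)  # True
```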
Joint Probability Mass Function (Joint PMF)

For a discrete random variable 𝑋, we define the PMF as
𝑝𝑋(𝑥) = 𝑃(𝑋 = 𝑥).
Now, if we have two random variables 𝑋 and 𝑌 and we would like
to study them jointly, we can define the joint probability mass
function as follows:
The joint probability mass function of two random variables 𝑋 and
𝑌 is defined as 𝑝𝑋𝑌(𝑥, 𝑦) = 𝑃(𝑋 = 𝑥, 𝑌 = 𝑦).
As usual, the comma means "and," so we can write
𝑝𝑋𝑌(𝑥, 𝑦) = 𝑃(𝑋 = 𝑥, 𝑌 = 𝑦) = 𝑃((𝑋 = 𝑥) and (𝑌 = 𝑦)).
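A joint PMF can be built by enumerating equally likely outcomes. A sketch with an assumed experiment (two fair dice, 𝑋 = smaller face, 𝑌 = larger face; exact fractions keep the arithmetic clean):

```python
from fractions import Fraction
from itertools import product

# Assumed example: roll two fair dice; X = min(faces), Y = max(faces).
# Enumerate the 36 equally likely outcomes and accumulate p_XY(x, y).
joint_pmf = {}
for d1, d2 in product(range(1, 7), repeat=2):
    key = (min(d1, d2), max(d1, d2))
    joint_pmf[key] = joint_pmf.get(key, Fraction(0)) + Fraction(1, 36)

print(joint_pmf[(2, 5)])        # P(X=2, Y=5): outcomes (2,5) and (5,2) -> 1/18
print(joint_pmf[(3, 3)])        # P(X=3, Y=3): only (3,3) -> 1/36
print(sum(joint_pmf.values()))  # total probability 1
```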
Marginal Probability Mass Function (Marginal PMF)
The joint PMF contains all the information regarding the distributions
of 𝑋 and 𝑌. This means that, for example, we can obtain the PMF of 𝑋 from
its joint PMF with 𝑌. In fact, we can write

𝑝𝑋(𝑥) = 𝑃(𝑋 = 𝑥) = ∑_{𝑦=−∞}^{∞} 𝑃(𝑋 = 𝑥, 𝑌 = 𝑦) = ∑_{𝑦=−∞}^{∞} 𝑝𝑋𝑌(𝑥, 𝑦).

Here, 𝑝𝑋 (𝑥) is called the marginal PMF of 𝑋.


Similarly, the marginal PMF of 𝑌 is
𝑝𝑌(𝑦) = 𝑃(𝑌 = 𝑦) = ∑_{𝑥=−∞}^{∞} 𝑃(𝑋 = 𝑥, 𝑌 = 𝑦) = ∑_{𝑥=−∞}^{∞} 𝑝𝑋𝑌(𝑥, 𝑦).
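Both marginalization sums can be sketched directly in code. The joint distribution below is an assumed illustration (values on {0,1} × {0,1,2}, chosen to sum to 1):

```python
from fractions import Fraction

# Assumed joint PMF, for illustration only.
p_xy = {(0, 0): Fraction(1, 6), (0, 1): Fraction(1, 6), (0, 2): Fraction(1, 6),
        (1, 0): Fraction(1, 4), (1, 1): Fraction(1, 8), (1, 2): Fraction(1, 8)}

def marginal_x(x):
    # p_X(x) = sum over all y of p_XY(x, y)
    return sum(p for (xi, _), p in p_xy.items() if xi == x)

def marginal_y(y):
    # p_Y(y) = sum over all x of p_XY(x, y)
    return sum(p for (_, yi), p in p_xy.items() if yi == y)

print(marginal_x(0))  # 1/6 + 1/6 + 1/6 = 1/2
print(marginal_x(1))  # 1/4 + 1/8 + 1/8 = 1/2
print(marginal_y(0))  # 1/6 + 1/4 = 5/12
```

Each marginal PMF sums to 1 on its own, as it must.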
Joint Probability Density Functions (Joint PDF)
For a continuous random variable 𝑋, we define the PDF 𝑓𝑋(𝑥) through
𝑃(𝑎 ≤ 𝑋 ≤ 𝑏) = ∫_{𝑥=𝑎}^{𝑏} 𝑓𝑋(𝑥) 𝑑𝑥.
Now, if we have two random variables 𝑋 and 𝑌 and we would like to study them
jointly, we can define the joint probability density function as follows:
Consider two random variables 𝑋 and 𝑌.
The joint probability density function 𝑓𝑋𝑌(𝑥, 𝑦) of 𝑋 and 𝑌 is defined through
𝑃(𝑎 ≤ 𝑋 ≤ 𝑏, 𝑐 ≤ 𝑌 ≤ 𝑑) = ∫_{𝑥=𝑎}^{𝑏} ∫_{𝑦=𝑐}^{𝑑} 𝑓𝑋𝑌(𝑥, 𝑦) 𝑑𝑦 𝑑𝑥.
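The double integral can be approximated with a midpoint Riemann sum. A sketch under an assumed density 𝑓𝑋𝑌(𝑥, 𝑦) = 𝑥 + 𝑦 on the unit square (which integrates to 1, so it is a valid joint PDF):

```python
# Assumed joint PDF for illustration: f_XY(x, y) = x + y on [0,1]^2, 0 elsewhere.
def f_xy(x, y):
    return x + y if 0 <= x <= 1 and 0 <= y <= 1 else 0.0

def prob(a, b, c, d, n=400):
    """Midpoint Riemann sum for P(a <= X <= b, c <= Y <= d)."""
    hx, hy = (b - a) / n, (d - c) / n
    total = 0.0
    for i in range(n):
        for j in range(n):
            total += f_xy(a + (i + 0.5) * hx, c + (j + 0.5) * hy)
    return total * hx * hy

print(round(prob(0, 1, 0, 1), 6))      # total probability: 1.0
print(round(prob(0, 0.5, 0, 0.5), 6))  # analytic value is 0.125
```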
Marginal Probability Density Functions (Marginal PDF)
Consider the joint PDF of 𝑋 and 𝑌, that is 𝑓𝑋𝑌(𝑥, 𝑦). It is easy to see that
∫_{𝑦=−∞}^{∞} 𝑓𝑋𝑌(𝑥, 𝑦) 𝑑𝑦 = 𝑓𝑋(𝑥),
∫_{𝑥=−∞}^{∞} 𝑓𝑋𝑌(𝑥, 𝑦) 𝑑𝑥 = 𝑓𝑌(𝑦).
The PDFs 𝑓𝑋(𝑥) and 𝑓𝑌(𝑦) are called the marginal PDFs of 𝑋 and 𝑌.
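Integrating out one variable can also be checked numerically. A sketch with the same kind of assumed density, 𝑓𝑋𝑌(𝑥, 𝑦) = 𝑥 + 𝑦 on the unit square, whose marginal is known analytically:

```python
# Assumed joint PDF for illustration: f_XY(x, y) = x + y on [0,1]^2, 0 elsewhere.
def f_xy(x, y):
    return x + y if 0 <= x <= 1 and 0 <= y <= 1 else 0.0

def marginal_fx(x, n=1000):
    """f_X(x) = integral of f_XY(x, y) dy, via a midpoint Riemann sum on [0, 1]."""
    h = 1.0 / n
    return sum(f_xy(x, (j + 0.5) * h) for j in range(n)) * h

# Analytically, f_X(x) = x + 1/2 for 0 <= x <= 1.
print(round(marginal_fx(0.3), 6))  # 0.8
```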
Joint Moment Generating Functions (Joint MGF)

Consider two random variables 𝑋 and 𝑌. The joint MGF is defined as
𝑀𝑋𝑌(𝑠, 𝑡) = 𝐸[𝑒^{𝑠𝑋+𝑡𝑌}] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} 𝑒^{𝑠𝑥+𝑡𝑦} 𝑓𝑋𝑌(𝑥, 𝑦) 𝑑𝑥 𝑑𝑦,
where the integral form applies when (𝑋, 𝑌) is continuous with joint PDF 𝑓𝑋𝑌(𝑥, 𝑦).
For multiple random variables 𝑋1, 𝑋2, …, 𝑋𝑁, the joint MGF is the similarly
defined function 𝑀_{𝑋1,…,𝑋𝑁}(𝑠1, …, 𝑠𝑁).
Consider the joint MGF of 𝑋 and 𝑌, that is 𝑀𝑋𝑌(𝑠, 𝑡). It is easy to see that
𝑀𝑋𝑌(𝑠, 𝑡)|_{𝑠=0} = 𝑀𝑌(𝑡), 𝑀𝑋𝑌(𝑠, 𝑡)|_{𝑡=0} = 𝑀𝑋(𝑠).
The MGFs 𝑀𝑋(𝑠) and 𝑀𝑌(𝑡) are called the marginal MGFs of 𝑋 and 𝑌.
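For a discrete pair the joint MGF is a finite sum, 𝑀𝑋𝑌(𝑠, 𝑡) = ∑ 𝑒^{𝑠𝑥+𝑡𝑦} 𝑝𝑋𝑌(𝑥, 𝑦), and setting 𝑡 = 0 should recover the marginal MGF of 𝑋. A sketch with an assumed two-coin joint distribution:

```python
import math

# Assumed example: two fair coins, X and Y each Bernoulli(1/2), independent.
p_xy = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}

def joint_mgf(s, t):
    # M_XY(s, t) = E[e^{sX + tY}], a finite sum in the discrete case.
    return sum(math.exp(s * x + t * y) * p for (x, y), p in p_xy.items())

def mgf_x(s):
    # Marginal MGF of X ~ Bernoulli(1/2): M_X(s) = 0.5 + 0.5*e^s.
    return 0.5 + 0.5 * math.exp(s)

# Setting t = 0 in the joint MGF recovers the marginal MGF of X.
s = 0.7
print(abs(joint_mgf(s, 0.0) - mgf_x(s)) < 1e-12)  # True
print(joint_mgf(0.0, 0.0))                        # M_XY(0, 0) = 1.0
```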
Joint Characteristic Functions (Joint CF)

Consider two random variables 𝑋 and 𝑌. The joint CF is defined as
𝜙𝑋𝑌(𝑠, 𝑡) = 𝐸[𝑒^{𝑖(𝑠𝑋+𝑡𝑌)}] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} 𝑒^{𝑖(𝑠𝑥+𝑡𝑦)} 𝑓𝑋𝑌(𝑥, 𝑦) 𝑑𝑥 𝑑𝑦,
where the integral form applies when (𝑋, 𝑌) is continuous with joint PDF 𝑓𝑋𝑌(𝑥, 𝑦).
For multiple random variables 𝑋1, 𝑋2, …, 𝑋𝑁, the joint CF is the similarly
defined function 𝜙_{𝑋1,…,𝑋𝑁}(𝑠1, …, 𝑠𝑁).
Consider the joint CF of 𝑋 and 𝑌, that is 𝜙𝑋𝑌(𝑠, 𝑡). It is easy to see that
𝜙𝑋𝑌(𝑠, 𝑡)|_{𝑠=0} = 𝜙𝑌(𝑡), 𝜙𝑋𝑌(𝑠, 𝑡)|_{𝑡=0} = 𝜙𝑋(𝑠).
The CFs 𝜙𝑋(𝑠) and 𝜙𝑌(𝑡) are called the marginal CFs of 𝑋 and 𝑌.
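The joint CF differs from the joint MGF only by the factor 𝑖 in the exponent, so it is complex-valued. A sketch with the same assumed two-coin distribution, using Python's complex exponential:

```python
import cmath

# Assumed example: two fair coins, X and Y each Bernoulli(1/2), independent.
p_xy = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}

def joint_cf(s, t):
    # phi_XY(s, t) = E[e^{i(sX + tY)}], a finite complex-valued sum here.
    return sum(cmath.exp(1j * (s * x + t * y)) * p for (x, y), p in p_xy.items())

print(joint_cf(0.0, 0.0))              # phi_XY(0, 0) = 1 (total probability)
print(abs(joint_cf(3.0, 4.0)) <= 1.0)  # |phi_XY| <= 1 everywhere: True
```

Unlike the MGF, the CF exists for every distribution, since |𝑒^{𝑖𝑢}| = 1 keeps the expectation finite.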
