6.1.1 Joint Distributions and Independence


For three or more random variables, the joint PDF, joint PMF, and joint CDF are defined in a similar way to
what we have already seen for the case of two random variables. Let X1 , X2 , ⋯, Xn be n discrete random
variables. The joint PMF of X1 , X2 , ⋯, Xn is defined as

$$P_{X_1, X_2, \dots, X_n}(x_1, x_2, \dots, x_n) = P(X_1 = x_1, X_2 = x_2, \dots, X_n = x_n).$$


For n jointly continuous random variables X1, X2, ⋯, Xn, the joint PDF is defined to be the function
$f_{X_1 X_2 \cdots X_n}(x_1, x_2, \dots, x_n)$ such that the probability of any set A ⊂ ℝⁿ is given by the integral of the PDF
over the set A. In particular, for a set A ⊂ ℝⁿ, we can write

$$P\big((X_1, X_2, \cdots, X_n) \in A\big) = \int \cdots \int_{A} f_{X_1 X_2 \cdots X_n}(x_1, x_2, \cdots, x_n) \, dx_1 \, dx_2 \cdots dx_n.$$

The marginal PDF of Xi can be obtained by integrating out all of the other Xj's. For example,

$$f_{X_1}(x_1) = \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} f_{X_1 X_2 \dots X_n}(x_1, x_2, \dots, x_n) \, dx_2 \cdots dx_n.$$

The joint CDF of n random variables X1 , X2 ,...,Xn is defined as

$$F_{X_1, X_2, \dots, X_n}(x_1, x_2, \dots, x_n) = P(X_1 \leq x_1, X_2 \leq x_2, \dots, X_n \leq x_n).$$
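
As a concrete illustration of these definitions in the discrete case, the short Python sketch below stores a joint PMF of three binary random variables as a NumPy array and reads off a joint probability, a marginal PMF, and a joint CDF value. The particular PMF (three independent fair coin flips) is an assumption made only for this example; it is not taken from the text.

```python
import numpy as np

# Hypothetical joint PMF of three binary random variables X1, X2, X3,
# stored as a 2x2x2 array indexed by (x1, x2, x3); the entries must sum to 1.
# Here we assume three independent fair coin flips, so every outcome has mass 1/8.
p = np.full((2, 2, 2), 1 / 8)

print(p[1, 0, 1])            # P(X1 = 1, X2 = 0, X3 = 1) = 0.125
print(p.sum(axis=(1, 2)))    # marginal PMF of X1, obtained by summing out X2 and X3
print(p[:1, :, :1].sum())    # joint CDF at (0, 1, 0): P(X1 <= 0, X2 <= 1, X3 <= 0) = 0.25
```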

Example 6.1
Let X, Y and Z be three jointly continuous random variables with joint PDF

$$f_{XYZ}(x, y, z) = \begin{cases} c(x + 2y + 3z) & 0 \leq x, y, z \leq 1 \\ 0 & \text{otherwise} \end{cases}$$
1. Find the constant c .
2. Find the marginal PDF of X .

• Solution
◦ 1.


$$\begin{aligned}
1 &= \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_{XYZ}(x, y, z) \, dx \, dy \, dz \\
  &= \int_{0}^{1} \int_{0}^{1} \int_{0}^{1} c(x + 2y + 3z) \, dx \, dy \, dz \\
  &= \int_{0}^{1} \int_{0}^{1} c\left(\frac{1}{2} + 2y + 3z\right) dy \, dz \\
  &= \int_{0}^{1} c\left(\frac{3}{2} + 3z\right) dz \\
  &= 3c.
\end{aligned}$$

Thus, $c = \frac{1}{3}$.
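
The value of c can also be sanity-checked numerically. The Python sketch below (an addition, not part of the original solution) uses SciPy to integrate x + 2y + 3z over the unit cube; the normalizing constant is the reciprocal of the result.

```python
from scipy import integrate

# Integrate (x + 2y + 3z) over the unit cube [0, 1]^3.
# tplquad expects the integrand as f(z, y, x), innermost variable first;
# the remaining arguments are the x, y, and z limits.
total, _ = integrate.tplquad(lambda z, y, x: x + 2 * y + 3 * z,
                             0, 1,   # x limits
                             0, 1,   # y limits
                             0, 1)   # z limits

print(total)      # ~3.0
print(1 / total)  # ~0.3333, matching c = 1/3
```
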
2. To find the marginal PDF of X, we note that R_X = [0, 1]. For 0 ≤ x ≤ 1, we can write
$$\begin{aligned}
f_X(x) &= \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_{XYZ}(x, y, z) \, dy \, dz \\
  &= \int_{0}^{1} \int_{0}^{1} \frac{1}{3}(x + 2y + 3z) \, dy \, dz \\
  &= \int_{0}^{1} \frac{1}{3}(x + 1 + 3z) \, dz \\
  &= \frac{1}{3}\left(x + \frac{5}{2}\right).
\end{aligned}$$
Thus,

$$f_X(x) = \begin{cases} \frac{1}{3}\left(x + \frac{5}{2}\right) & 0 \leq x \leq 1 \\ 0 & \text{otherwise} \end{cases}$$
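
As a quick numerical cross-check (again an addition, not from the text), the marginal PDF of X at a chosen point can be obtained by integrating the joint PDF over y and z and compared with the closed form above.

```python
from scipy import integrate

def f_X(x):
    """Marginal PDF of X at x, obtained by numerically integrating out y and z."""
    val, _ = integrate.dblquad(lambda z, y: (x + 2 * y + 3 * z) / 3, 0, 1, 0, 1)
    return val

x0 = 0.4                   # an arbitrary point in [0, 1]
print(f_X(x0))             # numerical marginal, ~0.9667
print((x0 + 5 / 2) / 3)    # closed form (1/3)(x + 5/2) = 0.9666...
```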

Independence: The idea of independence is exactly the same as what we have seen before. We restate it here in
terms of the joint PMF, joint PDF, and joint CDF. Random variables X1, X2, ..., Xn are independent if, for
all (x1, x2, ..., xn) ∈ ℝⁿ,

$$F_{X_1, X_2, \dots, X_n}(x_1, x_2, \dots, x_n) = F_{X_1}(x_1) F_{X_2}(x_2) \cdots F_{X_n}(x_n).$$
Equivalently, if X1, X2, ..., Xn are discrete, then they are independent if, for all (x1, x2, ..., xn) ∈ ℝⁿ, we
have

$$P_{X_1, X_2, \dots, X_n}(x_1, x_2, \dots, x_n) = P_{X_1}(x_1) P_{X_2}(x_2) \cdots P_{X_n}(x_n).$$
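
One way to check this criterion in the discrete case, shown in the hypothetical Python sketch below (reusing the 2x2x2 coin-flip example from above), is to rebuild the joint PMF from the marginals and compare it with the original array.

```python
import numpy as np

# Hypothetical joint PMF of three binary random variables as a 2x2x2 array
# (independent fair coin flips, assumed for the example).
p = np.full((2, 2, 2), 1 / 8)

# Marginal PMFs, obtained by summing out the other two variables.
p1 = p.sum(axis=(1, 2))
p2 = p.sum(axis=(0, 2))
p3 = p.sum(axis=(0, 1))

# Product of the marginals, broadcast back to a 2x2x2 array.
product = p1[:, None, None] * p2[None, :, None] * p3[None, None, :]

print(np.allclose(p, product))   # True: the PMF factorizes, so X1, X2, X3 are independent
```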


If X1, X2, ..., Xn are continuous, then they are independent if, for all (x1, x2, ..., xn) ∈ ℝⁿ, we have
$$f_{X_1, X_2, \dots, X_n}(x_1, x_2, \dots, x_n) = f_{X_1}(x_1) f_{X_2}(x_2) \cdots f_{X_n}(x_n).$$
If random variables X1 , X2 , ..., Xn are independent, then we have

$$E[X_1 X_2 \cdots X_n] = E[X_1] E[X_2] \cdots E[X_n].$$
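
This product rule is easy to verify by Monte Carlo simulation. In the sketch below, the three distributions are arbitrary choices made for illustration; any independent random variables would do.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Three independent random variables with arbitrarily chosen distributions.
x = rng.uniform(0, 1, n)
y = rng.exponential(2.0, n)
z = rng.normal(3.0, 1.0, n)

print(np.mean(x * y * z))                    # Monte Carlo estimate of E[X1 X2 X3]
print(np.mean(x) * np.mean(y) * np.mean(z))  # E[X1] E[X2] E[X3]; the two should be close
```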


In some situations we are dealing with random variables that are independent and are also identically
distributed, i.e., they have the same CDFs. It is usually easier to deal with such random variables, since
independence and being identically distributed often simplify the analysis. We will see examples of such
analyses shortly.
Definition 6.1. Random variables X1 , X2 , ..., Xn are said to be independent and identically distributed (i.i.d.)
if they are independent, and they have the same marginal distributions:

$$F_{X_1}(x) = F_{X_2}(x) = \dots = F_{X_n}(x), \quad \text{for all } x \in \mathbb{R}.$$

For example, if random variables X1 , X2 , ..., Xn are i.i.d., they will have the same means and variances, so
we can write

$$\begin{aligned}
E[X_1 X_2 \cdots X_n] &= E[X_1] E[X_2] \cdots E[X_n] && \text{(because the } X_i \text{'s are independent)} \\
&= E[X_1] E[X_1] \cdots E[X_1] && \text{(because the } X_i \text{'s are identically distributed)} \\
&= E[X_1]^n.
\end{aligned}$$
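
The short simulation below illustrates this collapse to a power of a single mean; the Uniform(0, 1) choice and the number of variables are assumptions made only for the example.

```python
import numpy as np

rng = np.random.default_rng(1)
n_vars, n_samples = 4, 1_000_000

# Each row is one realization of (X1, ..., X4), drawn i.i.d. Uniform(0, 1).
samples = rng.uniform(0, 1, size=(n_samples, n_vars))

print(np.mean(np.prod(samples, axis=1)))   # Monte Carlo estimate of E[X1 X2 X3 X4]
print(0.5 ** n_vars)                       # E[X1]^n = (1/2)^4 = 0.0625
```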

