
JOINT PROBABILITY DISTRIBUTION OF TWO OR

MORE RANDOM VARIABLES


Intended Learning Outcomes
At the end of the discussion, the learner should be able to:
• Enumerate the types of random variables and their probability distributions
• Calculate probabilities of events on each type of distribution
• Construct the probability distribution of two random variables
• Calculate the marginal and conditional distributions of each random variable
• Solve for the cumulative and probability density functions of functions of random variables
• Find the density functions of random variables
Joint Probability Distribution
• Given random variables X, Y, … that are defined on a probability space, the joint probability
distribution for X, Y, … is a probability distribution that gives the probability that each of X, Y, … falls in any
particular range or discrete set of values specified for that variable. In the case of only two random
variables, this is called a bivariate distribution, but the concept generalizes to any number of
random variables, giving a multivariate distribution.

• The joint probability distribution can be expressed either in terms of a joint cumulative distribution
function or in terms of a joint probability density function (in the case of continuous variables) or
joint probability mass function (in the case of discrete variables). These in turn can be used to find
two other types of distributions: the marginal distribution giving the probabilities for any one of the
variables with no reference to any specific ranges of values for the other variables, and
the conditional probability distribution giving the probabilities for any subset of the variables
conditional on particular values of the remaining variables.
Joint Probability Distribution
Example 1: Suppose each of two urns contains twice as many red balls as blue balls, and no others,
and suppose one ball is randomly selected from each urn, with the two draws independent of each
other. Let A and B be discrete random variables associated with the outcomes of the draw from the first
urn and second urn respectively. The probability of drawing a red ball from either of the urns is 2/3,
and the probability of drawing a blue ball is 1/3. We can present the joint probability distribution as
the following table:

           A=Red              A=Blue             P(B)
B=Red      (2/3)(2/3)=4/9     (1/3)(2/3)=2/9     4/9 + 2/9 = 2/3
B=Blue     (2/3)(1/3)=2/9     (1/3)(1/3)=1/9     2/9 + 1/9 = 1/3
P(A)       4/9 + 2/9 = 2/3    2/9 + 1/9 = 1/3
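The table above can be reproduced in a few lines of code. The sketch below (using exact fractions) builds the joint pmf as the product of the two marginals, which is valid here because the two draws are independent, and then recovers the marginals by summing the rows and columns:

```python
from fractions import Fraction

# Colour probabilities for a single draw: P(red) = 2/3, P(blue) = 1/3.
p = {"red": Fraction(2, 3), "blue": Fraction(1, 3)}

# Independence lets us build the joint pmf as a product of marginals.
joint = {(a, b): p[a] * p[b] for a in p for b in p}

# Recover the marginals by summing rows/columns of the joint table.
p_A = {a: sum(joint[(a, b)] for b in p) for a in p}
p_B = {b: sum(joint[(a, b)] for a in p) for b in p}

print(joint[("red", "red")])    # 4/9
print(p_A["red"], p_B["blue"])  # 2/3 1/3
```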
Joint Probability Distribution
Example 2: Consider the flip of two fair coins; let A and B be discrete random variables associated with
the outcomes of the first and second coin flips respectively. Each coin flip is a Bernoulli trial and has
a Bernoulli distribution. If a coin displays "heads" then the associated random variable takes the value
1, and it takes the value 0 otherwise. The probability of each of these outcomes is 1/2, so the marginal
(unconditional) density functions are:

Possible Outcomes P(A) P(B)


O1 0 0
O2 1 0
O3 0 1
O4 1 1

P(A, B) = P(A) × P(B) = (1/2) × (1/2) = 1/4
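The table and product rule above can be checked by enumerating the four equally likely outcomes O1 through O4:

```python
from itertools import product
from fractions import Fraction

# Two fair coin flips; A and B are indicator variables (1 = heads, 0 = tails).
outcomes = list(product([0, 1], repeat=2))  # O1..O4 in the table above

# Independence gives each joint probability as P(A) * P(B) = 1/2 * 1/2.
joint = {o: Fraction(1, 2) * Fraction(1, 2) for o in outcomes}

# Every one of the four outcomes has probability 1/4.
assert all(pr == Fraction(1, 4) for pr in joint.values())
```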
Joint Probability Distribution
Example 3: Consider the roll of a fair die and let A = 1 if the number is even (i.e. 2, 4, or 6)
and A = 0 otherwise. Furthermore, let B = 1 if the number is prime (i.e. 2, 3, or 5) and B = 0 otherwise.

1 2 3 4 5 6
A 0 1 0 1 0 1
B 0 1 1 0 1 0

P(A = 0, B = 0) = P({1}) = 1/6        P(A = 1, B = 0) = P({4, 6}) = 2/6 = 1/3
P(A = 1, B = 1) = P({2}) = 1/6        P(A = 0, B = 1) = P({3, 5}) = 2/6 = 1/3
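These four joint probabilities can also be obtained by brute-force enumeration of the six equally likely faces:

```python
from fractions import Fraction

# Fair die: A = 1 if the face is even, B = 1 if the face is prime (2, 3, 5).
faces = range(1, 7)
A = {f: int(f % 2 == 0) for f in faces}
B = {f: int(f in (2, 3, 5)) for f in faces}

# Joint pmf: accumulate 1/6 for every face landing in each (a, b) cell.
joint = {}
for f in faces:
    key = (A[f], B[f])
    joint[key] = joint.get(key, Fraction(0)) + Fraction(1, 6)

print(joint[(0, 0)])  # 1/6  (only face 1)
print(joint[(1, 0)])  # 1/3  (faces 4 and 6)
```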
Joint Probability Mass Function
Let X and Y be two discrete rv's defined on the sample space of an
experiment. The joint probability mass function p(x, y) is defined
for each pair of numbers (x, y) by

p(x, y) = P(X = x and Y = y)

Let A be the set consisting of pairs of (x, y) values; then

P[(X, Y) ∈ A] = Σ Σ(x, y) ∈ A p(x, y)

Joint Probability Mass Function
The joint probability mass function of the discrete random
variables X and Y, denoted as pXY(x, y), satisfies:

(1) pXY(x, y) ≥ 0
(2) Σx Σy pXY(x, y) = 1
(3) pXY(x, y) = P(X = x, Y = y)
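A small helper can verify that a candidate table of values is a valid joint pmf, i.e. every entry is nonnegative and the entries sum to 1. The numbers below are the die example's joint pmf, used purely for illustration:

```python
def is_valid_joint_pmf(p):
    """Check the defining properties of a joint pmf:
    every value is nonnegative and all values sum to 1."""
    values = list(p.values())
    return all(v >= 0 for v in values) and abs(sum(values) - 1) < 1e-12

# Die example: A = even indicator, B = prime indicator.
joint = {(0, 0): 1/6, (1, 0): 1/3, (1, 1): 1/6, (0, 1): 1/3}
print(is_valid_joint_pmf(joint))  # True
```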
Joint Probability Mass Function
Example: In the development of a new receiver for the
transmission of digital information, each received bit is rated as
acceptable, suspect, or unacceptable, depending on the quality of
the received signal, with probabilities 0.9, 0.08, and 0.02,
respectively. Assume that the ratings of each bit are independent.
In the first four bits transmitted, let

X denote the number of acceptable bits


Y denote the number of suspect bits
Joint Probability Mass Function
Determine the following:
Joint Probability Mass Function
Solution:
Joint Probability Mass Function

Joint probability distribution of X and Y in the Example


Marginal Probability Mass Function
If more than one random variable is defined in a random experiment,
it is important to distinguish between the joint probability distribution
of X and Y and the probability distribution of each variable
individually. The individual probability distribution of a random
variable is referred to as its marginal probability distribution.
The marginal probability mass functions of X and Y, denoted pX(x)
and pY(y), are given by

pX(x) = Σy pXY(x, y)        pY(y) = Σx pXY(x, y)
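Both marginal pmfs can be computed in one pass over the joint pmf by summing out the other variable; the example values below come from the earlier die illustration:

```python
from fractions import Fraction

def marginals(joint):
    """Marginal pmfs pX(x) = sum over y of p(x, y)
    and pY(y) = sum over x of p(x, y)."""
    p_X, p_Y = {}, {}
    for (x, y), pr in joint.items():
        p_X[x] = p_X.get(x, 0) + pr
        p_Y[y] = p_Y.get(y, 0) + pr
    return p_X, p_Y

# Die example: X = even indicator, Y = prime indicator.
joint = {(0, 0): Fraction(1, 6), (1, 0): Fraction(1, 3),
         (1, 1): Fraction(1, 6), (0, 1): Fraction(1, 3)}
p_X, p_Y = marginals(joint)
print(p_X)  # {0: Fraction(1, 2), 1: Fraction(1, 2)}
```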
Marginal Probability Mass Function
Joint Probability Mass Function
From the previous problem, determine the following:
Marginal Probability Mass Function
Solution:
Marginal Probability Mass Function

Marginal probability distributions of X and Y from the Example


Mean and Variance from Joint Distribution
Mean and Variance from Joint Distribution
Example: From the previous problem; find:
1.The mean
2.The variance
Solution
Mean and Variance from Joint Distribution
Example: From the previous problem; find:
1.The mean
2.The variance
Solution
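As a minimal sketch of the calculation these slides call for, the snippet below computes E(X) and V(X) directly from a joint pmf via E(X) = Σ x·p(x, y) and V(X) = E(X²) − E(X)²; the die example's joint table is used for concreteness, since the slides' own table is not reproduced here:

```python
from fractions import Fraction

def mean_and_variance_X(joint):
    """E(X) = sum of x * p(x, y) over all cells, and
    V(X) = E(X^2) - E(X)^2, both taken straight from the joint pmf."""
    mean = sum(x * pr for (x, _), pr in joint.items())
    mean_sq = sum(x * x * pr for (x, _), pr in joint.items())
    return mean, mean_sq - mean * mean

# Die example: X = even indicator, Y = prime indicator.
joint = {(0, 0): Fraction(1, 6), (1, 0): Fraction(1, 3),
         (1, 1): Fraction(1, 6), (0, 1): Fraction(1, 3)}
mu, var = mean_and_variance_X(joint)
print(mu, var)  # 1/2 1/4
```

Swapping the roles of x and y in the two sums gives E(Y) and V(Y) the same way.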
Conditional Probability Distribution
Let X and Y be two continuous rv’s with joint pdf f (x, y) and marginal
X pdf fX(x). Then for any X value x for which fX(x) > 0, the
conditional probability density function of Y given that X = x is
𝒇 𝑿𝒀 ( 𝒙𝒚 )
𝒇 𝒀 | 𝑿 = ( 𝒚 | 𝒙 )= − ∞<𝒚 <∞
𝒇 𝑿 ( 𝒙)

If X and Y are discrete, replacing pdf’s by pmf’s gives the conditional


probability mass function of Y when X = x.
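In the discrete case, the conditional pmf is simply the x-row of the joint pmf rescaled by the marginal pX(x). A minimal sketch, again using the die example's joint table:

```python
from fractions import Fraction

def conditional_Y_given_X(joint, x):
    """pY|X(y | x) = p(x, y) / pX(x), defined whenever pX(x) > 0."""
    p_x = sum(pr for (xi, _), pr in joint.items() if xi == x)
    if p_x == 0:
        raise ValueError("pX(x) must be positive")
    return {y: pr / p_x for (xi, y), pr in joint.items() if xi == x}

# Die example: X = even indicator, Y = prime indicator.
joint = {(0, 0): Fraction(1, 6), (1, 0): Fraction(1, 3),
         (1, 1): Fraction(1, 6), (0, 1): Fraction(1, 3)}
print(conditional_Y_given_X(joint, 1))
# {0: Fraction(2, 3), 1: Fraction(1, 3)}  -- rescaled by pX(1) = 1/2
```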
Conditional Probability Distribution
Conditional Probability Distribution
From the previous problem, determine the following:
Conditional Probability Distribution
Solution:
Conditional Probability Distribution

Conditional probability distributions for the given example


Conditional Probability Distribution
Conditional Probability Distribution
From the previous problem, determine the following:
1. Conditional mean

2. Conditional variance
Conditional Probability Distribution
From the previous problem, determine the following:
Solution:
Independence
In some random experiments, knowledge of the values of X does not change
any of the probabilities associated with the values for Y.

By analogy with independent events, we define two random variables to be


independent whenever for all x and y. Notice that independence implies that for
all x and y. If we find one pair of x and y in which the equality fails, X and Y
are not independent. If two random variables are independent,
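The all-pairs check described above translates directly to code. The urn draws pass it, while the die's even/prime indicators fail already at (0, 0), since 1/6 ≠ (1/2)(1/2):

```python
from fractions import Fraction

def are_independent(joint):
    """X and Y are independent iff p(x, y) = pX(x) * pY(y) for ALL x, y;
    a single failing pair is enough to conclude dependence."""
    p_X, p_Y = {}, {}
    for (x, y), pr in joint.items():
        p_X[x] = p_X.get(x, 0) + pr
        p_Y[y] = p_Y.get(y, 0) + pr
    return all(joint.get((x, y), 0) == p_X[x] * p_Y[y]
               for x in p_X for y in p_Y)

# Urn draws (independent by construction) vs. the die's even/prime indicators.
urn = {("r", "r"): Fraction(4, 9), ("r", "b"): Fraction(2, 9),
       ("b", "r"): Fraction(2, 9), ("b", "b"): Fraction(1, 9)}
die = {(0, 0): Fraction(1, 6), (1, 0): Fraction(1, 3),
       (1, 1): Fraction(1, 6), (0, 1): Fraction(1, 3)}
print(are_independent(urn), are_independent(die))  # True False
```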
Independence
Independence
From the previous problem, determine whether the two variables X and Y are
independent:
Solution:

Therefore, the two variables are dependent on each other.


MULTIPLE DISCRETE RANDOM VARIABLES
Joint Probability Distribution

Marginal Probability Distribution


EXERCISES
Show that the following function satisfies the properties of a joint
probability mass function, then:
1. Determine the following probabilities:
   a.
   b.
   c.
   d.
2. Determine the means and variances of X and Y
3. Determine the marginal probability distribution of the random variable X
4. Determine the conditional probability distribution of Y given that X = 1.5
5. Determine the conditional probability distribution of X given that Y = 2
6.
EXERCISES
Show that the following function satisfies the properties of a joint
probability mass function, then:
1. Determine the following probabilities:
   a.
   b.
   c.
   d.
2. Determine the means and variances of X and Y
3. Determine the marginal probability distribution of the random variable X
4. Determine the conditional probability distribution of Y given that X = 1
5. Determine the conditional probability distribution of X given that Y = 1
6.
EXERCISES
In the transmission of digital information, the probability that a bit has high, moderate, and
low distortion is 0.01, 0.04, and 0.95, respectively. Suppose that three bits are transmitted and
that the amount of distortion of each bit is assumed to be independent. Let X and Y denote the
number of bits with high and moderate distortion out of the three, respectively. Determine:
1. The Joint Probability Mass Function
2. The Marginal Mass Function
3. The Mean and Variance of the Joint Distribution
4. The Conditional Probability Distribution
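Since the three distortion categories partition each bit's outcome and the bits are independent, the joint pmf of (X, Y) is trinomial. A hedged sketch follows; p_mod = 0.04 is an assumed value chosen so that, together with p_high = 0.01 and p_low = 0.95, the three category probabilities sum to 1:

```python
from math import comb

def trinomial_pmf(x, y, n, p_high, p_mod):
    """Joint pmf of (X, Y) = (# high-, # moderate-distortion bits) out of n
    independent bits; the remaining n - x - y bits have low distortion."""
    if x < 0 or y < 0 or x + y > n:
        return 0.0
    p_low = 1.0 - p_high - p_mod
    return (comb(n, x) * comb(n - x, y)
            * p_high ** x * p_mod ** y * p_low ** (n - x - y))

# n = 3 bits, as in the exercise statement.
print(trinomial_pmf(0, 0, 3, 0.01, 0.04))  # p_low**3 ≈ 0.857
```

Summing this pmf over y (or x) yields the marginal binomial distributions asked for in part 2.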
EXERCISES
A manufacturing company employs two inspecting devices to sample a fraction of
their output for quality control purposes. The first inspection monitor is able to
accurately detect 99.3% of the defective items it receives, whereas the second is able to
do so in 99.7% of the cases. Assume that four defective items are produced and sent out
for inspection. Let X and Y denote the number of items that will be identified as
defective by inspecting devices 1 and 2, respectively. Assume the devices are
independent. Determine:
1. The Joint Probability Mass Function
2. The Marginal Mass Function
3. The Mean and Variance of the Joint Distribution
4. The Conditional Probability Distribution
5. The Mean and Variance of the Conditional Probability Distribution
6. Are the Variables X and Y independent?
MULTIPLE DISCRETE RANDOM VARIABLES
Example: Suppose the random variables X, Y, and Z have the following joint
probability distribution.

Find:
1.
2. The conditional probability distribution of X given that Y = 1 and Z = 2.
TWO CONTINUOUS RANDOM VARIABLES
Joint Probability Distribution
TWO CONTINUOUS RANDOM VARIABLES
Joint Probability Distribution

EXAMPLE: Let the random variable X denote the time until a computer server
connects to your machine (in milliseconds), and let Y denote the time until the
server authorizes you as a valid user (in milliseconds). Each of these random
variables measures the wait from a common starting time, and X < Y. Assume that
the joint probability density function for X and Y is:

The probability that is?
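The slide's joint density and the event of interest were given in a figure not reproduced here, so the sketch below uses an illustrative density, f(x, y) = 2e^(−x−y) on the region 0 < x < y (which integrates to 1 over that region), and approximates probabilities of rectangles P(X ≤ a, Y ≤ b) with a midpoint Riemann sum:

```python
from math import exp

def f(x, y):
    """Illustrative joint pdf on 0 < x < y (an assumption standing in for
    the density shown on the slide): f(x, y) = 2 * exp(-x - y)."""
    return 2.0 * exp(-x - y) if 0 < x < y else 0.0

def prob(x_max, y_max, n=400):
    """Approximate P(X <= x_max, Y <= y_max) with a midpoint Riemann sum
    over an n-by-n grid on the rectangle [0, x_max] x [0, y_max]."""
    dx, dy = x_max / n, y_max / n
    total = 0.0
    for i in range(n):
        for j in range(n):
            total += f((i + 0.5) * dx, (j + 0.5) * dy)
    return total * dx * dy

print(prob(10.0, 10.0))  # close to 1: nearly all of the probability mass
```

For the actual exam-style answer one would integrate the given density in closed form over the event region; the numeric sum is just a quick sanity check.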


END OF PRESENTATION
