
Chapter 5: Joint probability distributions
1
5.1 Jointly distributed r.v.s
• X, Y discrete r.v.s. The joint pmf is defined for each pair (x, y) by

p(x, y) = P(X = x, Y = y)

with p(x, y) ≥ 0 and Σx Σy p(x, y) = 1

An event A is associated with pairs of values (x, y) according to some rule (e.g. A = {(x, y): x + y ≤ c}, or A = {(x, y): x = y}).
Then: P[(X, Y) ∈ A] = Σ Σ(x,y)∈A p(x, y)

2
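A minimal numerical sketch of these definitions, using a hypothetical joint pmf table (the values and the event A are illustrative assumptions, not the slides' example):

```python
# Hypothetical joint pmf of discrete r.v.s X and Y (illustrative values).
p = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.25, (1, 1): 0.15,
    (2, 0): 0.10, (2, 1): 0.20,
}

# A valid joint pmf is non-negative and sums to 1 over all pairs.
assert all(v >= 0 for v in p.values())
assert abs(sum(p.values()) - 1.0) < 1e-12

# P[(X, Y) in A] for the event A = {(x, y): x + y <= 1}.
prob_A = sum(v for (x, y), v in p.items() if x + y <= 1)
print(round(prob_A, 2))  # 0.55
```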
5.1 Jointly distributed r.v.s
• Example: discrete r.v.s X and Y

3
5.1 Jointly distributed r.v.s
• Marginal probabilities
The marginal pmf of X, denoted by pX(x), is given by

pX(x) = Σy p(x, y)

(similarly for the marginal pmf of Y, pY(y) = Σx p(x, y))

Marginal pmf of X: it is the pmf of X, if we do not consider (we disregard) the values of Y.

4
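Summing the joint pmf over the disregarded variable can be sketched directly (the joint pmf table below is an illustrative assumption):

```python
# Hypothetical joint pmf (illustrative values).
p = {(0, 0): 0.10, (0, 1): 0.20,
     (1, 0): 0.25, (1, 1): 0.15,
     (2, 0): 0.10, (2, 1): 0.20}

# Marginal pmf of X: sum the joint pmf over all y values (and vice versa for Y).
p_X, p_Y = {}, {}
for (x, y), v in p.items():
    p_X[x] = p_X.get(x, 0.0) + v
    p_Y[y] = p_Y.get(y, 0.0) + v

print({x: round(v, 2) for x, v in sorted(p_X.items())})  # {0: 0.3, 1: 0.4, 2: 0.3}
print({y: round(v, 2) for y, v in sorted(p_Y.items())})  # {0: 0.45, 1: 0.55}
```

Each marginal is itself a valid pmf: it is non-negative and sums to 1.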
5.1 Jointly distributed r.v.s
• Marginal probabilities example (cont.)

5
5.1 Jointly distributed r.v.s
• Continuous r.v.s
X, Y continuous r.v.s. The joint pdf is a function f(x, y) with

f(x, y) ≥ 0 and ∫∫ f(x, y) dx dy = 1 (integrating over all x, y)

If A = {(x, y): a ≤ x ≤ b, c ≤ y ≤ d}, then

P[(X, Y) ∈ A] = ∫_a^b ∫_c^d f(x, y) dy dx

6
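Numerically, the double integral can be sketched with a midpoint-rule sum. The pdf f(x, y) = x + y on the unit square is a hypothetical example (it is non-negative and integrates to 1, but it is not the slides' example):

```python
# Hypothetical joint pdf on 0 <= x, y <= 1 (an illustrative assumption).
def f(x, y):
    return x + y

# Midpoint-rule approximation of a double integral over a rectangle.
def integrate2d(f, x0, x1, y0, y1, n=400):
    hx, hy = (x1 - x0) / n, (y1 - y0) / n
    total = 0.0
    for i in range(n):
        for j in range(n):
            total += f(x0 + (i + 0.5) * hx, y0 + (j + 0.5) * hy)
    return total * hx * hy

# A pdf must integrate to 1 over its whole support.
print(round(integrate2d(f, 0, 1, 0, 1), 4))      # 1.0
# P(0 <= X <= 0.5, 0 <= Y <= 0.5): integrate over the sub-rectangle.
print(round(integrate2d(f, 0, 0.5, 0, 0.5), 4))  # 0.125
```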
5.1 Jointly distributed r.v.s
• Example: A bank has drive-up and walk-up services
X: percentage of time the drive-up service is used (out of 8 hours)
Y: percentage of time the walk-up service is used (out of 8 hours)

• The joint pdf f(x, y) is given for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1

7
5.1 Jointly distributed r.v.s
• Example (cont.):

8
5.1 Jointly distributed r.v.s
• Marginal pdf
The marginal pdf of X, denoted by fX(x), is given by

fX(x) = ∫ f(x, y) dy (integrating over all y)

(similarly for the marginal pdf of Y, fY(y) = ∫ f(x, y) dx)

Marginal pdf of X: it is the pdf of X, if we do not consider the values of Y.

9
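A sketch of the marginal pdf: for the hypothetical joint pdf f(x, y) = x + y on the unit square (an illustrative assumption, not the bank example), integrating out y gives fX(x) = x + 1/2:

```python
# Hypothetical joint pdf on the unit square (illustrative assumption).
def f(x, y):
    return x + y

# Marginal pdf of X: integrate the joint pdf over y (midpoint rule here).
def marginal_X(x, n=1000):
    h = 1.0 / n
    return sum(f(x, (j + 0.5) * h) for j in range(n)) * h

# Matches the analytic marginal fX(x) = x + 1/2 on this example.
for x in (0.0, 0.3, 0.8):
    assert abs(marginal_X(x) - (x + 0.5)) < 1e-6
print(round(marginal_X(0.3), 3))  # 0.8
```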
5.1 Jointly distributed r.v.s
• Marginal pdf example (cont. – bank):

10
5.1 Jointly distributed r.v.s
• Independent r.v.s
X and Y are independent iff
• Discrete case: p(x, y) = pX(x) · pY(y)
• Continuous case: f(x, y) = fX(x) · fY(y)
for all pairs (x, y)

• If independent, then there is ABSOLUTELY no relationship between them: knowing the value of one tells us nothing about the other

11
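The criterion can be checked mechanically: compute both marginals and test p(x, y) = pX(x)·pY(y) for every pair. Both tables below are hypothetical illustrations:

```python
# Check independence of discrete r.v.s: p(x, y) must equal pX(x) * pY(y)
# for ALL pairs (hypothetical tables for illustration).
def is_independent(p, tol=1e-12):
    xs = {x for x, _ in p}
    ys = {y for _, y in p}
    p_X = {x: sum(p.get((x, y), 0.0) for y in ys) for x in xs}
    p_Y = {y: sum(p.get((x, y), 0.0) for x in xs) for y in ys}
    return all(abs(p.get((x, y), 0.0) - p_X[x] * p_Y[y]) <= tol
               for x in xs for y in ys)

# Product-form table: independent by construction.
indep = {(x, y): px * py
         for x, px in [(0, 0.4), (1, 0.6)]
         for y, py in [(0, 0.5), (1, 0.5)]}
# Here p(0, 0) = 0.3 but pX(0) * pY(0) = 0.5 * 0.5 = 0.25: dependent.
dep = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.2, (1, 1): 0.3}

print(is_independent(indep), is_independent(dep))  # True False
```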
5.1 Jointly distributed r.v.s
• Independent r.v.s example (cont. – bank):

• Compute fX(x) and fY(y), and compare f(x, y) with fX(x) · fY(y)

• Since f(x, y) ≠ fX(x) · fY(y), the r.v.s are dependent

12
5.1 Jointly distributed r.v.s
• 
Example:
A machine has 2 components that have independent lifetimes (denoted as and ), both
following exponential distributions with parameters. The joint pdf (the pdf of the machine
life) is

 
Let (expected lifetimes hours and hours respectively).
 

13
5.1 Jointly distributed r.v.s
• Probability distributions for more than 2 variables
• Discrete case: joint pmf p(x1, ..., xn) = P(X1 = x1, ..., Xn = xn)
• Continuous case: joint pdf f(x1, ..., xn), such that
P(a1 ≤ X1 ≤ b1, ..., an ≤ Xn ≤ bn) = ∫_a1^b1 ... ∫_an^bn f(x1, ..., xn) dxn ... dx1

Independence
• R.v.s X1, ..., Xn are independent if for every subset of the variables the joint pmf or pdf is equal to the product of the marginal pmfs or pdfs

14
5.1 Jointly distributed r.v.s
• Multinomial experiment (generalization of the binomial)
There are n independent trials, each with r possible outcomes (not just 2), outcome i occurring with probability pi.
Xi: r.v. denoting the number of trials resulting in outcome i (i = 1, ..., r).
It can be shown that:

p(x1, ..., xr) = [n! / (x1! x2! ... xr!)] · p1^x1 · p2^x2 ··· pr^xr, for x1 + ... + xr = n

15
5.1 Jointly distributed r.v.s
• Multinomial experiment example
25% of the students are from Mathematics, 25% from Mechanical Engineering, and 50% from Computer Science.
I pick 10 students at random.

16
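The multinomial pmf can be sketched directly. The setup (n = 10, probabilities 0.25/0.25/0.50) is from the slide; the particular counts queried below are an assumed illustration, since the slide's question is not shown:

```python
from math import factorial

# Multinomial pmf: p(x1, ..., xr) = n!/(x1!...xr!) * p1^x1 * ... * pr^xr.
def multinomial_pmf(counts, probs):
    n = sum(counts)
    coef = factorial(n)
    for x in counts:
        coef //= factorial(x)   # multinomial coefficient (exact integer)
    prob = float(coef)
    for x, p in zip(counts, probs):
        prob *= p ** x
    return prob

# P(2 from Math, 3 from MechEng, 5 from CS) out of 10 students.
p = multinomial_pmf((2, 3, 5), (0.25, 0.25, 0.50))
print(round(p, 4))  # 0.0769
```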
5.1 Jointly distributed r.v.s
• Conditional distributions
X, Y continuous r.v.s with joint pdf f(x, y), and marginal pdfs fX(x), fY(y).
For any value x of X with fX(x) > 0, the conditional probability density function of Y, given that X = x, is:

fY|X(y|x) = f(x, y) / fX(x)

• Discrete case: replace pdfs with pmfs: pY|X(y|x) = p(x, y) / pX(x)

• Conditional expectation: E[Y | X = x] = ∫ y · fY|X(y|x) dy

17
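A sketch of the conditional pdf and conditional expectation, on the hypothetical joint pdf f(x, y) = x + y over the unit square, for which fX(x) = x + 1/2 (an illustrative assumption, not the bank example):

```python
# Conditional pdf fY|X(y|x) = f(x, y) / fX(x) for the hypothetical pdf
# f(x, y) = x + y on the unit square, where fX(x) = x + 1/2.
def f_cond(y, x):
    return (x + y) / (x + 0.5)

# Midpoint-rule integral over y in [0, 1].
def integrate(g, n=2000):
    h = 1.0 / n
    return sum(g((j + 0.5) * h) for j in range(n)) * h

x = 0.5
total = integrate(lambda y: f_cond(y, x))       # a pdf must integrate to 1
e_cond = integrate(lambda y: y * f_cond(y, x))  # E[Y | X = x]
print(round(total, 6), round(e_cond, 6))  # 1.0 0.583333
```

On this example the numeric E[Y | X = 0.5] matches the analytic value (x/2 + 1/3)/(x + 1/2) = 7/12.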
5.1 Jointly distributed r.v.s
• Example: Proportion of time drive-up or walk-up service is used (see slide 7)
X: percentage of time the drive-up service is used (out of 8 hours)
Y: percentage of time the walk-up service is used (out of 8 hours)

The conditional pdf of Y, given X = x, is:

fY|X(y|x) = f(x, y) / fX(x), for 0 ≤ y ≤ 1
18
5.1 Jointly distributed r.v.s
• Example (cont.):

From the marginal we would get

19
5.1 Jointly distributed r.v.s
• Example (cont.):
• From the marginal

• From the conditional

• The 2 r.v.s are not independent

20
5.2 Expected values, covariance, correlation
• X, Y jointly distributed r.v.s with pmf p(x, y), or pdf f(x, y). Then the expected value of a function h(X, Y) is

E[h(X, Y)] = Σx Σy h(x, y) · p(x, y) (discrete case)
E[h(X, Y)] = ∫∫ h(x, y) · f(x, y) dx dy (continuous case)

21
5.2 Expected values, covariance, correlation
• COVARIANCE
2 r.v.s X, Y. Covariance is a measure of their relationship:

Cov(X, Y) = E[(X − μX)(Y − μY)]

• X − μX and Y − μY are deviations from the mean, so covariance is the expected value of the product of deviations

• Shortcut formula: Cov(X, Y) = E[XY] − μX · μY

22
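The shortcut formula can be checked against the definition on a small hypothetical joint pmf (values are illustrative assumptions):

```python
# Hypothetical joint pmf (illustrative values).
p = {(0, 0): 0.2, (0, 1): 0.1,
     (1, 0): 0.1, (1, 1): 0.2,
     (2, 0): 0.1, (2, 1): 0.3}

EX  = sum(x * v for (x, y), v in p.items())      # mu_X
EY  = sum(y * v for (x, y), v in p.items())      # mu_Y
EXY = sum(x * y * v for (x, y), v in p.items())  # E[XY]

# Definition: expected product of deviations from the means.
cov_def = sum((x - EX) * (y - EY) * v for (x, y), v in p.items())
# Shortcut formula: Cov(X, Y) = E[XY] - mu_X * mu_Y.
cov_shortcut = EXY - EX * EY

print(round(cov_def, 6), round(cov_shortcut, 6))  # 0.14 0.14
```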
5.2 Expected values, covariance, correlation
• Example: (See slide 3)
23
5.2 Expected values, covariance, correlation
• The idea
What is the relation of the values of X − μX and Y − μY?

• CASE 1: When X − μX is positive (negative), Y − μY is also positive (negative), so the product (X − μX)(Y − μY) is always positive. Furthermore, Cov(X, Y) would be positive and large.

• CASE 2: When X − μX is negative (positive), Y − μY is positive (negative), so the product is always negative. Furthermore, Cov(X, Y) would be negative and large.

• CASE 3: When X − μX is negative (positive), Y − μY is either positive or negative (negative or positive), so the product can be negative or positive. Furthermore, Cov(X, Y) would be negative or positive and small.

24
5.2 Expected values, covariance, correlation
• The idea
What is the relation of the values of X − μX and Y − μY?

[Scatterplots: CASE 1, CASE 2, CASE 3]

25
5.2 Expected values, covariance, correlation
• Correlation
2 r.v.s X, Y.

ρX,Y = Corr(X, Y) = Cov(X, Y) / (σX · σY)

It is “normalized” covariance
Properties:
• If a and c are both positive or both negative, then Corr(aX + b, cY + d) = Corr(X, Y)

• If X, Y independent, then ρ = 0 (the reverse is not always true)

• −1 ≤ ρ ≤ 1, and ρ = 1 or −1 iff Y = aX + b for some numbers a ≠ 0, b

26
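A simulation sketch of the properties: the sample correlation of independently generated r.v.s is near 0, and an exact linear relation gives ρ = 1 (the simulated data are an illustrative assumption):

```python
import random

# Sample correlation: "normalized" covariance, Corr = Cov / (sd_X * sd_Y).
def corr(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    sx = (sum((x - mx) ** 2 for x in xs) / n) ** 0.5
    sy = (sum((y - my) ** 2 for y in ys) / n) ** 0.5
    return cov / (sx * sy)

random.seed(0)
xs = [random.gauss(0, 1) for _ in range(20000)]
ys = [random.gauss(0, 1) for _ in range(20000)]  # generated independently of xs
lin = [2 * x + 3 for x in xs]                    # exact linear function of xs

print(round(corr(xs, ys), 2))   # ~0: independence implies rho = 0
print(round(corr(xs, lin), 2))  # 1.0: perfect positive linear relation
```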
5.2 Expected values, covariance, correlation
• Example: (See slides 3 and 25)
27
5.3-4. Sampling distributions
• The sample mean X̄ is itself a r.v.!!!!
• X̄ is the r.v., x̄ is an observed value (e.g. the mean computed from one particular sample)
• NOTE: Before we gather the data, the observations X1, X2, ..., Xn are also r.v.s (uppercase)

28
5.3-4. Sampling distributions
• Let X1, X2, ..., Xn be a random sample from a distribution with mean μ and standard deviation σ. Then

X̄ = (X1 + X2 + ... + Xn) / n

is a r.v. (linear combination of r.v.s.) with:

1. E[X̄] = μ
2. Var(X̄) = σ²/n, so σX̄ = σ/√n

• Also, if T = X1 + ... + Xn, then E[T] = nμ and σT = σ√n

29
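These two facts can be sketched by simulation: draw many samples of size n, compute each sample mean, and compare the mean and standard deviation of the X̄ values with μ and σ/√n (the population parameters below are illustrative assumptions):

```python
import random

# Simulate the sampling distribution of the mean (illustrative parameters).
random.seed(1)
mu, sigma, n, reps = 50.0, 10.0, 25, 20000

means = []
for _ in range(reps):
    sample = [random.gauss(mu, sigma) for _ in range(n)]
    means.append(sum(sample) / n)

m = sum(means) / reps                                  # estimate of E[X-bar]
sd = (sum((x - m) ** 2 for x in means) / reps) ** 0.5  # estimate of sd(X-bar)
print(round(m, 1), round(sd, 1))  # ~50.0 and ~2.0 (sigma / sqrt(n) = 10 / 5)
```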
5.3-4. Sampling distributions
• The Sampling distribution

The probability distribution (pmf, pdf) of a sample statistic (e.g. X̄) is called a sampling distribution.

• As n increases, σX̄ = σ/√n decreases (the more data we have, the less variable the value of the sample mean)
• σX̄ is also called the standard error of the mean (or of the sample statistic in general)

30
5.3-4. Sampling distributions
• Example:

• Population mean entry-level salary for engineering degrees is μ, with standard deviation σ

• If I take a sample of size n, the sample mean will have E[X̄] = μ and σX̄ = σ/√n. If n increases, then σX̄ decreases

31
5.3-4. Sampling distributions
• The normal distribution (r.v. X̄)

• Let X1, ..., Xn be a random sample from a normal distribution with mean μ and standard deviation σ. Then X̄ is exactly normal, with mean μ and standard deviation σ/√n, for any n

• Since X̄ follows a normal distribution, we can compute probabilities about X̄!!!!

32
5.3-4. Sampling distributions
• Example: In an experiment rats try to navigate through a maze. Assume that the time (in minutes) needed is a normally distributed r.v. We have a sample of n rats. What is the probability that the average time of the n rats is less than a given number of minutes?

What would we have for 1 rat?

33
5.3-4. Sampling distributions
• THE CENTRAL LIMIT THEOREM
X1, ..., Xn a random sample from ANY distribution with mean μ and standard deviation σ.
If n is large (let's say n > 30), then X̄ is approximately normal with E[X̄] = μ and σX̄ = σ/√n
(also, the total T = X1 + ... + Xn is approximately normal with E[T] = nμ and σT = σ√n)

34
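A simulation sketch of the CLT: the draws come from an exponential distribution (skewed, far from normal), yet for n = 36 the sample mean behaves approximately normally. The rate λ = 0.5 is an assumption for illustration (the slide's rate is not shown):

```python
import random, math

# Sample means of n = 36 exponential draws (lam = 0.5 is an assumed rate).
random.seed(2)
lam, n, reps = 0.5, 36, 20000
mu = sigma = 1 / lam                    # exponential: mean = sd = 1/lam

means = [sum(random.expovariate(lam) for _ in range(n)) / n
         for _ in range(reps)]

m = sum(means) / reps
sd = (sum((x - m) ** 2 for x in means) / reps) ** 0.5
print(round(m, 1), round(sd, 2))        # ~2.0 and ~0.33 (= sigma / 6)

# Normal approximation to P(X-bar > 2.5) vs. the simulated frequency.
z = (2.5 - mu) / (sigma / math.sqrt(n))
approx = 0.5 * math.erfc(z / math.sqrt(2))   # 1 - Phi(z)
sim = sum(x > 2.5 for x in means) / reps
print(round(approx, 3), round(sim, 3))       # both ~0.07
```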
5.3-4. Sampling distributions
• Example: Waiting time at a bank follows an exponential distribution with rate λ
X: waiting time, with E[X] = 1/λ and σX = 1/λ

What is the probability the average waiting time of 36 customers is more than 3 minutes?

35
5.5. Distribution of linear combinations
• Let there be n r.v.s X1, ..., Xn and constants a1, ..., an, and let

Y = a1 X1 + a2 X2 + ... + an Xn

be a linear combination of the Xi's.

NOTE: 1) for a1 = ... = an = 1/n, Y is the sample mean X̄
2) the Xi's could have different distributions, with different means and variances

36
5.5. Distribution of linear combinations
• Proposition:
• X1, ..., Xn r.v.s with means μ1, ..., μn and variances σ1², ..., σn²

1. E[a1 X1 + ... + an Xn] = a1 μ1 + ... + an μn

2. If X1, ..., Xn are independent, then Var(a1 X1 + ... + an Xn) = a1² σ1² + ... + an² σn²

3. In general, Var(a1 X1 + ... + an Xn) = Σi Σj ai aj Cov(Xi, Xj)

37
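A quick simulation check of facts 1 and 2 for independent r.v.s (the coefficients and distribution parameters are illustrative assumptions):

```python
import random

# Y = a1*X1 + a2*X2 with X1, X2 independent (illustrative parameters).
random.seed(3)
a1, a2 = 3.0, 2.0
mu1, s1 = 10.0, 2.0
mu2, s2 = 5.0, 1.0

ys = [a1 * random.gauss(mu1, s1) + a2 * random.gauss(mu2, s2)
      for _ in range(50000)]

m = sum(ys) / len(ys)
v = sum((y - m) ** 2 for y in ys) / len(ys)

# E[Y] = a1*mu1 + a2*mu2 = 40; Var(Y) = a1^2*s1^2 + a2^2*s2^2 = 40.
print(round(m, 1), round(v, 1))  # ~40.0 and ~40.0
```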
5.5. Distribution of linear combinations
• Example: A gas station sells 3 grades of gas: Regular, Extra and Super, priced per gallon at p1, p2, p3.
X1, X2, X3: the amount of each grade sold during a day, in gallons
Assume that X1, X2, X3 are independent, with means μ1, μ2, μ3 and standard deviations σ1, σ2, σ3.
The total revenue for a day is: Y = p1 X1 + p2 X2 + p3 X3

Then E[Y] = p1 μ1 + p2 μ2 + p3 μ3 and Var(Y) = p1² σ1² + p2² σ2² + p3² σ3²

38
5.5. Distribution of linear combinations
• A special case: Y = X1 − X2 (difference of 2 r.v.s).
Then E[X1 − X2] = μ1 − μ2

If X1, X2 are independent, then Var(X1 − X2) = σ1² + σ2²

[Histograms of X1, X2, and X1 − X2]

NOTE:
• The variance of the difference is the SUM of the two variances, NOT their difference.
• Adding/subtracting r.v.s INCREASES variability (spread of the data).
39
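The variance-adds fact for a difference is easy to verify by simulation (the parameters are illustrative assumptions):

```python
import random

# X1 - X2 for independent r.v.s: the variances ADD (illustrative parameters).
random.seed(4)
mu1, s1 = 40.0, 5.0
mu2, s2 = 30.0, 5.0

diffs = [random.gauss(mu1, s1) - random.gauss(mu2, s2) for _ in range(50000)]
m = sum(diffs) / len(diffs)
v = sum((d - m) ** 2 for d in diffs) / len(diffs)

# E = mu1 - mu2 = 10, Var = s1^2 + s2^2 = 50: the spread increases.
print(round(m, 1), round(v))  # ~10.0 and ~50
```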
5.5. Distribution of linear combinations
• Example:
A car manufacturer equips a particular model with 6- or 4-cylinder engines.
X1, X2: the fuel efficiency for the 2 engine types, with means μ1, μ2 and standard deviations σ1, σ2
Then: E[X1 − X2] = μ1 − μ2 and, if independent, Var(X1 − X2) = σ1² + σ2²

40
5.5. Distribution of linear combinations
• Example (gas station, slide 38):
Assume X1, X2, X3 are normally distributed and independent.

What is the probability that the total revenue of a day will be more than a given amount?

NOTE: The central limit theorem holds for linear combinations also. (i.e., if there is a large number of r.v.s that are not normally distributed, their sum is still approximately normal)
41
