• Assuming that 1 fair coin was tossed twice, we find that the sample space for our experiment is
S = {HH, HT, TH, TT}
• Since the 4 sample points are all equally likely, it follows that
P(X = 0) = P(TT) = 1/4,  P(X = 1) = P(HT) + P(TH) = 1/2,  P(X = 2) = P(HH) = 1/4
• where X is the number of heads and a typical element, say TH, indicates that the first toss resulted in a tail followed by a head on the second toss. Now, these probabilities are just the relative frequencies of the given events in the long run. Therefore,
μ = E(X) = 0(1/4) + 1(1/2) + 2(1/4) = 1
• This result means that a person who tosses 2 coins over and over again will, on the average, get 1 head per toss.
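The calculation above can be reproduced with a short Python sketch (my addition, not part of the slides), using exact fractions to avoid rounding:

```python
from fractions import Fraction

# Distribution of X = number of heads in two fair coin tosses.
# Sample space {HH, HT, TH, TT}, each point with probability 1/4.
pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

# Expected value: E(X) = sum of x * P(X = x).
expected = sum(x * p for x, p in pmf.items())
print(expected)  # 1
```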
The method described above for calculating the expected number of heads per toss of 2 coins suggests that the mean, or expected value, of any discrete random variable X may be obtained by multiplying each value x of the random variable by its corresponding probability f(x) and summing the products:
μ = E(X) = Σ_x x·f(x)
Note that this way of calculating the expected value, or mean, is different from the way of calculating the sample mean described in an introductory course, where the sample mean is obtained directly from data.
Variance of a Random Variable
• The mean, or expected value, of a random variable X is of special
importance in statistics because it describes where the probability
distribution is centered.
• By itself, however, the mean does not give an adequate description of the
shape of the distribution. We also need to characterize the variability in
the distribution.
• The variance of a random variable X with mean μ = E(X) is
σ² = Var(X) = E[(X − μ)²]
OR, equivalently, expanding the square,
σ² = E(X² − 2μX + μ²) = E(X²) − 2μE(X) + μ² = E(X²) − μ²
Then either form may be used; the shortcut form σ² = E(X²) − μ² is usually easier to compute.
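To make the shortcut formula concrete, here is a minimal sketch (my addition), reusing the two-coin X from the earlier example:

```python
from fractions import Fraction

# X = number of heads in two fair coin tosses (earlier example).
pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

mu = sum(x * p for x, p in pmf.items())        # E(X) = 1
e_x2 = sum(x**2 * p for x, p in pmf.items())   # E(X^2) = 3/2
variance = e_x2 - mu**2                        # shortcut: E(X^2) - mu^2
print(variance)  # 1/2
```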
Expectation of Two-Dimensional Random Variables
Properties of expected value and variance of random variables
The covariance of X and Y is σ_XY = Cov(X, Y) = E[(X − μ_X)(Y − μ_Y)], OR equivalently
σ_XY = Cov(X, Y) = E(XY) − E(X)E(Y) = E(XY) − μ_X·μ_Y
Example
Consider the following joint probability mass function, where the column totals g(x) and row totals h(y) are the marginal distributions:

P(X = x, Y = y)    x = 0    x = 1    x = 2    h(y)
     y = 0          3/28     9/28     3/28    15/28
     y = 1          3/14     3/14       0      3/7
     y = 2          1/28       0        0      1/28
     g(x)           5/14    15/28     3/28      1
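The covariance of X and Y for the table above can be verified with a short Python sketch (my addition; variable names are illustrative), using exact fractions:

```python
from fractions import Fraction

F = Fraction
# Joint pmf from the table: joint[(x, y)] = P(X = x, Y = y)
joint = {
    (0, 0): F(3, 28), (1, 0): F(9, 28), (2, 0): F(3, 28),
    (0, 1): F(3, 14), (1, 1): F(3, 14), (2, 1): F(0),
    (0, 2): F(1, 28), (1, 2): F(0),    (2, 2): F(0),
}

e_xy = sum(x * y * p for (x, y), p in joint.items())  # E(XY) = 3/14
mu_x = sum(x * p for (x, y), p in joint.items())      # E(X)  = 3/4
mu_y = sum(y * p for (x, y), p in joint.items())      # E(Y)  = 1/2
cov = e_xy - mu_x * mu_y                              # sigma_XY
print(cov)  # -9/56
```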
The magnitude of σ_XY will depend on the units used to measure both X and Y. A scale-free measure of linear association is the correlation coefficient
ρ_XY = σ_XY / (σ_X · σ_Y)
It should be clear to the reader that 𝜌𝑋𝑌 is free of the units of X and Y. The correlation coefficient
satisfies −1 ≤ 𝜌𝑋𝑌 ≤ 1. It assumes a value of zero when 𝜎𝑋𝑌 = 0.
Note:
• If ρ_XY approaches −1, then X and Y are strongly negatively (linearly) related.
• If ρ_XY approaches +1, then X and Y are strongly positively (linearly) related.
• If ρ_XY equals zero, then X and Y are not linearly related. This does not imply that X and Y are independent; however, if X and Y are independent, then necessarily ρ_XY = 0.
Example: Consider the following joint probability mass function

P(X = x, Y = y)    x = 0    x = 1    x = 2    h(y)
     y = 0          3/28     9/28     3/28    15/28
     y = 1          3/14     3/14       0      3/7
     y = 2          1/28       0        0      1/28
     g(x)           5/14    15/28     3/28      1
σ_X² = E(X²) − (E(X))² = 27/28 − (3/4)² = 27/28 − 9/16 = 45/112

σ_Y² = E(Y²) − (E(Y))² = 4/7 − (1/2)² = 4/7 − 1/4 = 9/28

and σ_XY = E(XY) − μ_X·μ_Y = 3/14 − (3/4)(1/2) = −9/56. Therefore, the correlation between X and Y is

ρ_XY = σ_XY / (σ_X·σ_Y) = (−9/56) / √((45/112)(9/28)) = (−9/56) / ((9√5)/56) = −1/√5 ≈ −0.447
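The whole computation can be checked with a Python sketch (my addition, not from the slides), keeping the moments exact and converting to a float only for the square root:

```python
from fractions import Fraction
import math

F = Fraction
# Joint pmf of the example table: joint[(x, y)] = P(X = x, Y = y)
joint = {
    (0, 0): F(3, 28), (1, 0): F(9, 28), (2, 0): F(3, 28),
    (0, 1): F(3, 14), (1, 1): F(3, 14), (2, 1): F(0),
    (0, 2): F(1, 28), (1, 2): F(0),    (2, 2): F(0),
}

def expect(g):
    """E[g(X, Y)] over the joint pmf."""
    return sum(g(x, y) * p for (x, y), p in joint.items())

mu_x, mu_y = expect(lambda x, y: x), expect(lambda x, y: y)  # 3/4, 1/2
var_x = expect(lambda x, y: x * x) - mu_x**2                 # 45/112
var_y = expect(lambda x, y: y * y) - mu_y**2                 # 9/28
cov = expect(lambda x, y: x * y) - mu_x * mu_y               # -9/56
rho = float(cov) / math.sqrt(float(var_x) * float(var_y))    # -1/sqrt(5)
print(round(rho, 3))  # -0.447
```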
Exercise