
Chapter 3. Discrete Random Variables

3.1 Introduction and Discrete Random Variables


If two fair dice are tossed, a probability is assigned to each of the 36 possible pairs:
P({(i,j)}) = 1/36 for each pair (i,j), with i, j = 1, ..., 6.

If we are interested in the sum of the two dice, then it makes sense to "replace" the (36-element) sample space S of all pairs (i,j) with the set of all possible sums: {2, 3, ..., 12}.
Discrete random variables

Definition of random variable:

A random variable (r.v.) is a real-valued function whose domain is the sample space S. It is usually denoted by uppercase letters, such as X, Y or Z.
Discrete random variables

A random variable which can take on only a finite, or at most countably infinite, number of values (e.g. the integers) is said to be discrete.

A random variable which can take on an uncountably infinite set of values (e.g. an interval) is said to be continuous (this set of values may be bounded or not).
Discrete random variables

For a discrete random variable X, its probability mass function (pmf) gives the probability that X is equal to each value a:

p(a) = P(X = a).

Suppose X must assume one of the values x_1, x_2, ...; then p(x_i) >= 0 for each i, and the probabilities sum to 1: sum_i p(x_i) = 1.
Discrete random variables
Example: Independent trials, each consisting of the flip of a coin having probability p of coming up Heads, are performed until either a Heads occurs or a total of n flips is made. Let X be the r.v. denoting the number of times that the coin is flipped. The p.m.f. of X is:

P(X = k) = (1-p)^(k-1) p for k = 1, ..., n-1, and P(X = n) = (1-p)^(n-1)

(the last case covers both "first Heads on the nth flip" and "no Heads in all n flips").
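As a sanity check, here is a minimal Python sketch of this pmf; the function name and the parameter values n = 5, p = 0.3 are illustrative, not from the slides.

```python
def flips_pmf(n, p):
    """pmf of X = number of flips until the first Heads, capped at n flips."""
    pmf = {k: (1 - p) ** (k - 1) * p for k in range(1, n)}
    # k = n: either Heads on the nth flip or no Heads in all n flips
    pmf[n] = (1 - p) ** (n - 1)
    return pmf

pmf = flips_pmf(5, 0.3)
print(pmf[1])             # P(X = 1) = p = 0.3
print(sum(pmf.values()))  # the probabilities sum to 1 (up to rounding)
```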





Discrete random variables
Example: Three balls are chosen from an urn containing 3 white, 3 red, and 5 black balls. Suppose we win $1 for each white ball selected, and lose $1 for each red one. What is the probability that we win some money?
Solution: Let X = total winnings; X takes values in {-3, -2, -1, 0, 1, 2, 3}. Counting among the C(11,3) = 165 equally likely samples, P(X = 1) = 39/165, P(X = 2) = 15/165 and P(X = 3) = 1/165, so P(X >= 1) = 55/165 = 1/3.
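The counting can be verified by brute force over all C(11,3) = 165 equally likely draws; a short sketch:

```python
from itertools import combinations
from fractions import Fraction

# 3 white balls (+$1 each), 3 red (-$1 each), 5 black ($0)
values = [1, 1, 1, -1, -1, -1, 0, 0, 0, 0, 0]
draws = list(combinations(values, 3))        # all C(11,3) = 165 samples
wins = sum(1 for d in draws if sum(d) >= 1)  # samples with positive winnings
print(Fraction(wins, len(draws)))            # 1/3
```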







Discrete random variables
It is often instructive to present the probability mass function in a graphical format, plotting the mass against the values of the random variable.

Cumulative distribution function
Definition of Cumulative Distribution Function (cdf)
For a random variable X, the function F defined by

F(a) = P(X <= a), for each real a,

is called the cumulative distribution function (cdf). The pmf can be determined uniquely from the cdf and vice versa.

Example: Suppose that X takes the values 0, 1, 3 and 4 with respective probabilities 0.1, 0.3, 0.4 and 0.2. Sketch the distribution function.
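A minimal sketch of the cdf of this example as a step function (the function name F is just for illustration):

```python
# pmf of the example: X takes 0, 1, 3, 4 with the given probabilities
pmf = {0: 0.1, 1: 0.3, 3: 0.4, 4: 0.2}

def F(a):
    """cdf F(a) = P(X <= a): a right-continuous step function."""
    return sum(p for x, p in pmf.items() if x <= a)

print(F(-1))  # 0 (below the smallest value)
print(F(2))   # 0.1 + 0.3 = 0.4 (flat between the jumps at 1 and 3)
print(F(4))   # 1 (all the mass)
```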

Cumulative distribution function
Properties of the Cumulative Distribution Function
All probability questions about X can be answered in terms of the cdf F. For example,

P(a < X <= b) = F(b) - F(a) for all a < b.

Properties of cdf:
1. F is non-decreasing, that is, F(a) <= F(b) whenever a < b.
2. lim_{b -> infinity} F(b) = 1 and lim_{b -> -infinity} F(b) = 0.
3. F is right-continuous. That is, for any b and any decreasing sequence b_n converging to b, lim_n F(b_n) = F(b).
Cumulative distribution function

Remark: Conversely, any function that satisfies the 3 properties above is the cdf of some r.v. (but not necessarily a discrete one).
Cumulative distribution function

Example: The cumulative distribution function (cdf) of the random variable X is given by:

Expectation
6/10/2022

Expected Value
The expected value, or expectation, or (informally) mean, refers to the "average" value of a random variable (in a probabilistic sense). For a discrete random variable X with pmf p, the expected value of X is defined as

E[X] = sum_x x p(x),

where the sum runs over all values x with p(x) > 0.
Expectation
Example: Find E[X], where X is the outcome when we roll a fair die.
Solution: Since p(i) = 1/6 for i = 1, ..., 6,
E[X] = (1 + 2 + 3 + 4 + 5 + 6)/6 = 7/2.

Example: We say that I is the indicator variable for the event A if I = 1 when A occurs and I = 0 when A does not occur.
Since E[I] = 1 * P(A) + 0 * (1 - P(A)) = P(A), the expectation of the indicator variable of an event is equal to the probability that the event occurs.
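Both computations can be sketched in a few lines; the value P(A) = 0.25 below is an illustrative choice, not from the slides.

```python
from fractions import Fraction

# E[X] for a fair die: each face has probability 1/6
E_X = sum(x * Fraction(1, 6) for x in range(1, 7))
print(E_X)  # 7/2

# Indicator variable of an event A with (assumed) P(A) = 0.25:
# I = 1 with probability P(A), I = 0 otherwise, so E[I] = P(A)
p_A = 0.25
E_I = 1 * p_A + 0 * (1 - p_A)
print(E_I)  # 0.25
```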

Expectation

Expectation of a Function of a Random Variable

Example: Let X denote a random variable that takes on the values -1, 0, 1 with respective probabilities p(-1), p(0), p(1).

Find the expectation of X^2.
Solution: Let Y = X^2; the pmf of Y is given by
P(Y = 1) = p(-1) + p(1), P(Y = 0) = p(0).

Hence, E[Y] = 1 * (p(-1) + p(1)) + 0 * p(0) = p(-1) + p(1).
Expectation

Proposition: If X is a discrete random variable that takes on one of the values x_1, x_2, ... with respective probabilities p(x_1), p(x_2), ..., then for any real-valued function g,

E[g(X)] = sum_i g(x_i) p(x_i).

Applying this proposition to the previous example gives the same value of E[X^2] without first finding the pmf of Y.
Expectation

Corollary: If a and b are constants, then E[aX + b] = aE[X] + b.

Proof:
E[aX + b] = sum_i (a x_i + b) p(x_i) = a sum_i x_i p(x_i) + b sum_i p(x_i) = aE[X] + b.

Note: More generally (linearity property),
E[a_1 X_1 + ... + a_n X_n] = a_1 E[X_1] + ... + a_n E[X_n].
Variance

Variance
Another useful value summarizing the probability mass function of a r.v. is its "spread" or variance.
It is important e.g. in finance, where investors look for investments which not only have good expected returns, but are also not too risky.
Variance

Example: The following two random variables both have expected value 0, but very different spreads:
Y = 0 with probability 1;
Z = -100 with probability .5, Z = 100 with probability .5.

Definition: A commonly used measure of spread is the variance of a random variable, defined as
Var(X) = E[(X - mu)^2], where mu = E[X].
Variance

An alternative expression for Var(X):

Var(X) = E[(X - mu)^2] = E[X^2 - 2 mu X + mu^2] = E[X^2] - 2 mu E[X] + mu^2 = E[X^2] - mu^2.

That is, Var(X) = E[X^2] - (E[X])^2.

Example (continued): Compute the variances of Y and Z.
Since E[Y] = E[Z] = 0, Var(Y) = E[Y^2] = 0 and Var(Z) = E[Z^2] = (-100)^2 * .5 + 100^2 * .5 = 10,000.
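The shortcut formula Var(X) = E[X^2] - (E[X])^2 is easy to encode; a small sketch applied to Y and Z:

```python
def variance(pmf):
    """Var(X) = E[X^2] - (E[X])^2 for a pmf given as {value: probability}."""
    mean = sum(x * p for x, p in pmf.items())
    return sum(x**2 * p for x, p in pmf.items()) - mean**2

print(variance({0: 1.0}))               # Y: 0.0
print(variance({-100: 0.5, 100: 0.5}))  # Z: 10000.0
```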


Variance

Example: Compute Var(X), if X is the outcome when a fair die is rolled.

Solution: It was shown previously that E[X] = 7/2.

Also, E[X^2] = (1 + 4 + 9 + 16 + 25 + 36)/6 = 91/6.

Hence, Var(X) = 91/6 - (7/2)^2 = 35/12.
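The same arithmetic, done exactly with rational numbers:

```python
from fractions import Fraction

faces = range(1, 7)
p = Fraction(1, 6)                   # fair die
E_X = sum(x * p for x in faces)      # 7/2
E_X2 = sum(x**2 * p for x in faces)  # 91/6
print(E_X2 - E_X**2)                 # 35/12
```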

Variance
Remarks on Variance:
(1) A useful identity is that for any constants a and b,
Var(aX + b) = a^2 Var(X).

Proof: Let mu = E[X]; then E[aX + b] = a mu + b, so
Var(aX + b) = E[(aX + b - a mu - b)^2] = E[a^2 (X - mu)^2] = a^2 Var(X).

(2) The square root of Var(X) is called the standard deviation of X, and it is usually denoted by sd(X) or std(X).
(3) The variance of a constant is 0.
(4) Var(.) is not a linear function; you can see this from (1). Also, in general Var(X + Y) is not equal to Var(X) + Var(Y).
Variance
Example: The manager of a stockroom in a factory knows from his study of records that the daily demand for a certain tool has the following probability distribution:

Demand       0    1    2
Probability  0.1  0.5  0.4

Letting X denote the daily demand, find E(X) and Var(X).

Solution: E(X) = 0(0.1) + 1(0.5) + 2(0.4) = 1.3;
E(X^2) = 0(0.1) + 1(0.5) + 4(0.4) = 2.1, so Var(X) = 2.1 - 1.3^2 = 0.41.
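The table computation, checked exactly with rationals:

```python
from fractions import Fraction

# daily demand distribution from the table above
pmf = {0: Fraction(1, 10), 1: Fraction(5, 10), 2: Fraction(4, 10)}
E = sum(x * p for x, p in pmf.items())
Var = sum(x**2 * p for x, p in pmf.items()) - E**2
print(E)    # 13/10
print(Var)  # 41/100
```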

Variance
Example (continued): With the same demand distribution, suppose that it costs $10 each time the tool is used; find the mean and variance of the daily cost.

Solution: The daily cost is C = 10X, so by the properties above,
E(C) = 10 E(X) = 13 and Var(C) = 10^2 Var(X) = 41.


Moments

Definition: n-th moment of a random variable

The n-th moment of a random variable X is defined as the expectation E[X^n], where n is a positive integer. In particular, the expectation E[X] of X is the first moment of X, and E[X^2] is the second moment of X.

The n-th central moment (moment about the mean) of a random variable X is defined as the expectation E[(X - mu)^n], where mu = E[X].

end 6/10/2022
Special distributions: Bernoulli

3.2 Special discrete distributions

Bernoulli/Binomial
Consider a random experiment whose outcome can be classified as either a success or a failure. Let the random variable X be equal to 1 when the outcome is a success, and equal to 0 when the outcome is a failure. Then the pmf of X is determined by one parameter p:
P(X = 1) = p, P(X = 0) = 1 - p, 0 <= p <= 1.
Special distributions: Bernoulli

A random variable X that has a pmf as above is denoted by X ~ Ber(p), where p is called the parameter of the distribution.
Special distributions: Binomial

Binomial random variables:

Suppose that n independent trials are performed, each of which results in a success with probability p and a failure with probability 1-p. If X represents the number of successes which occur in the n trials, then X is said to be a Binomial random variable with parameters (n,p), denoted by X ~ Bin(n,p) (n = number of trials).

Thus, a Ber(p) is the same as a Bin(1,p).

The pmf of a Binomial random variable having parameters (n,p) is given by

P(X = i) = C(n,i) p^i (1-p)^(n-i), i = 0, 1, ..., n.
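A direct sketch of this pmf, using the example of 5 fair coin flips:

```python
from math import comb

def binom_pmf(i, n, p):
    """P(X = i) for X ~ Bin(n, p)."""
    return comb(n, i) * p**i * (1 - p)**(n - i)

# number of Heads in 5 fair coin flips
print(binom_pmf(2, 5, 0.5))                         # 10/32 = 0.3125
print(sum(binom_pmf(i, 5, 0.5) for i in range(6)))  # the pmf sums to 1
```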

Special distributions: Binomial

We can check that the probabilities sum to 1 using the binomial formula:

sum_{i=0}^{n} C(n,i) p^i (1-p)^(n-i) = (p + (1-p))^n = 1.
Special distributions: Binomial

Example: Five fair coins are flipped. If the outcomes are assumed to be independent, find the probability mass function of the number of Heads obtained.

Solution: (n = 5, p = 1/2)
P(X = i) = C(5,i) (1/2)^5, giving the pmf
1/32, 5/32, 10/32, 10/32, 5/32, 1/32 for i = 0, 1, 2, 3, 4, 5.






Special distributions: Binomial

Example: Suppose that a particular trait (such as eye color, or left-handedness) of a person is classified on the basis of one pair of genes, and suppose that d represents a dominant gene and r a recessive gene. Thus a person with dd genes is purely dominant, one with rr is purely recessive, and one with rd is hybrid.
The purely dominant and the hybrid are alike in appearance. Children receive 1 gene from each parent. If, with respect to a particular trait, 2 hybrid parents have a total of 4 children, what is the probability that exactly 3 of the 4 children have the outward appearance of the dominant gene?
Special distributions: Binomial

Solution: Each child receives one gene from each hybrid (rd) parent, so its genotype is dd, dr, rd or rr with probability 1/4 each, and it shows the dominant appearance with probability 3/4. The number of such children is Bin(4, 3/4), so the answer is
C(4,3) (3/4)^3 (1/4) = 27/64.
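The binomial computation behind this example (each child shows the dominant appearance with probability 3/4), done exactly:

```python
from math import comb
from fractions import Fraction

p = Fraction(3, 4)                  # P(dominant appearance) per child
prob = comb(4, 3) * p**3 * (1 - p)  # exactly 3 of the 4 children
print(prob)                         # 27/64
```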
Special distributions: Binomial

Properties of binomial random variables:

Computation of the mean and variance of a binomial r.v.:
If X is a Binomial random variable with parameters (n,p), then
E[X] = np and Var(X) = np(1-p).
Special distributions: Poisson

Poisson random variables

Definition: A random variable X taking on one of the values 0, 1, 2, ... is said to be a Poisson random variable with parameter lambda > 0 if

P(X = i) = e^(-lambda) lambda^i / i!, i = 0, 1, 2, ...

One can check that this indeed defines a distribution:
sum_{i=0}^{infinity} e^(-lambda) lambda^i / i! = e^(-lambda) e^(lambda) = 1.
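The normalization can also be seen numerically; the value lambda = 4 below is an illustrative choice.

```python
from math import exp, factorial

def pois_pmf(i, lam):
    """P(X = i) for X ~ Pois(lam)."""
    return exp(-lam) * lam**i / factorial(i)

# a long partial sum of the pmf is already very close to 1
print(sum(pois_pmf(i, 4.0) for i in range(100)))
```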


Special distributions: Poisson

Expectation and variance of a Poisson random variable

The mean and the variance of a Poisson r.v. are both equal to lambda.
Special distributions: Poisson

Poisson random variables for the number of events occurring in a time interval

One use of the Poisson distribution is to model the number of "events" occurring in a certain period of time, e.g. the number of claims to an insurer in a given year, or the number of times that a price exceeds a predetermined value.
Special distributions: Geometric

Geometric random variables

Suppose that independent trials are performed, each having a probability p, 0 < p < 1, of being a success. Let X be the number of trials required until the first success. The probability mass function of X is

P(X = n) = (1-p)^(n-1) p, n = 1, 2, ...
Special distributions: Geometric

Geometric Random Variable

Definition: A random variable which has the above pmf is called a Geometric random variable with parameter p (denoted by Geom(p) or simply G(p)).

(Let q = 1-p; then sum_{n=1}^{infinity} q^(n-1) p = p/(1-q) = 1: do you know how to compute this sum?)
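Numerically, the partial sums of the geometric series approach 1 quickly; a quick check with an illustrative p = 0.3:

```python
# sum of q^(n-1) * p over n = 1, 2, ... equals p / (1 - q) = 1
p = 0.3
q = 1 - p
partial = sum(q**(n - 1) * p for n in range(1, 200))
print(partial)  # very close to 1 (the remainder is q**199)
```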

Special distributions: Negative Binomial

Negative Binomial Distribution

Suppose that independent trials, each having a probability p, 0 < p < 1, of being a success, are performed until r successes occur. Let X be the random variable that denotes the number of trials required. The probability mass function of X is

P(X = n) = C(n-1, r-1) p^r (1-p)^(n-r), n = r, r+1, ...

Definition: A random variable whose pmf is given by the above is called a negative Binomial random variable with parameters (r, p).

Special distributions: Negative Binomial

Remarks:
(1) A geometric random variable is a negative Binomial random variable with parameters (1,p).

(2) A negative Binomial r.v. is a sum of r independent geometric r.v.'s with parameter p.

(3) The expectation and variance of a negative Binomial random variable are (no need to memorize):
E[X] = r/p, Var(X) = r(1-p)/p^2.

Special distributions: Hypergeometric

Hypergeometric Random Variables

Suppose that a sample of size n is to be chosen randomly (without replacement) from an urn containing N balls, of which K are white and N-K are black. Let X denote the number of white balls selected; then

P(X = i) = C(K,i) C(N-K, n-i) / C(N,n).

Definition: A random variable X whose pmf is given by the above is said to be a Hypergeometric random variable with parameters (n,N,K). We have: E[X] = nK/N.
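The formula E[X] = nK/N can be checked directly from the pmf; the parameter values N = 11, K = 5, n = 3 are illustrative.

```python
from math import comb
from fractions import Fraction

def hyper_pmf(i, n, N, K):
    """P(X = i) white balls in a sample of size n, drawn without replacement."""
    return Fraction(comb(K, i) * comb(N - K, n - i), comb(N, n))

N, K, n = 11, 5, 3  # illustrative parameter values
E = sum(i * hyper_pmf(i, n, N, K) for i in range(n + 1))
print(E)            # 15/11, which equals n*K/N
```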
Examples
Binomial stock price model
Suppose that currently the stock price is 20, and that in each period in the future it can go up or down by 10% (with probability p and 1-p, respectively). What is the price after 10 periods in terms of X, the number of periods when it goes up?

Solution: Each up-move multiplies the price by 1.1 and each down-move by 0.9, so the price after 10 periods is 20 * (1.1)^X * (0.9)^(10-X), where X ~ Bin(10, p).
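A sketch of the resulting price distribution; the choice p = 0.5 is illustrative, not from the slides. (With p = 0.5 the expected one-period factor is 0.5*1.1 + 0.5*0.9 = 1, so the expected price stays at 20.)

```python
from math import comb

p, n, s0 = 0.5, 10, 20.0  # p is an assumed value; s0 and n are from the example

# price after n periods is s0 * 1.1^x * 0.9^(n-x) when X = x, X ~ Bin(n, p)
prices = {x: s0 * 1.1**x * 0.9**(n - x) for x in range(n + 1)}
probs = {x: comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)}
expected = sum(prices[x] * probs[x] for x in range(n + 1))
print(expected)  # 20 for p = 0.5, up to rounding
```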

Examples

We call some event process a Poisson process (with rate parameter lambda) if the number of events occurring in any interval of length t is a Poisson random variable with parameter lambda*t.
Examples

Example: Suppose that the number of claims to an insurer is modeled as a Poisson process with rate lambda, with 1 week as the unit of time.

(a) Find the probability that at least 3 claims are received during the next 2 weeks.
(b) Find the probability distribution of the time, starting from now, until the next claim.
Examples

Example: The number of times that a person contracts a cold in a given year has distribution Pois(5). Suppose that a new drug can reduce it to Pois(3) for 75% of the population, but has no appreciable effect for the remaining 25%.
If a person tries the drug and has 2 colds in that year, how likely is it that the drug is beneficial to him/her?
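By Bayes' rule, P(beneficial | 2 colds) = 0.75 P(Pois(3) = 2) / (0.75 P(Pois(3) = 2) + 0.25 P(Pois(5) = 2)); a sketch of the computation:

```python
from math import exp, factorial

def pois_pmf(i, lam):
    return exp(-lam) * lam**i / factorial(i)

# Bayes' rule: P(drug is beneficial | 2 colds this year)
num = 0.75 * pois_pmf(2, 3)        # beneficial: Pois(3)
den = num + 0.25 * pois_pmf(2, 5)  # not beneficial: Pois(5)
print(num / den)                   # about 0.889
```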

Examples

Example: If independent trials, each resulting in a success with probability p, are performed, what is the probability of r successes occurring before m failures?

Solution: r successes occur before m failures if and only if the rth success occurs before or at the (r+m-1)-th trial. (And, obviously, the rth success cannot occur before the rth trial.) Hence, summing the negative Binomial pmf, the answer is

sum_{n=r}^{r+m-1} C(n-1, r-1) p^r (1-p)^(n-r).
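This sum over the negative Binomial pmf can be cross-checked against the equivalent event "at least r successes in the first r+m-1 trials"; a sketch with illustrative values r = 3, m = 4, p = 0.5:

```python
from math import comb

def r_before_m(r, m, p):
    """Sum the negative Binomial pmf over trial counts n = r, ..., r+m-1."""
    return sum(comb(n - 1, r - 1) * p**r * (1 - p)**(n - r)
               for n in range(r, r + m))

def binom_tail(r, m, p):
    """P(at least r successes in the first r+m-1 trials) -- the same event."""
    n = r + m - 1
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(r, n + 1))

print(r_before_m(3, 4, 0.5), binom_tail(3, 4, 0.5))  # both give the same value
```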
Examples

Example: An urn contains N white and M black balls. Balls are randomly selected, one at a time, until a black one is obtained. If we assume that each selected ball is replaced before the next one is drawn, what is the probability that:
a) exactly n draws are needed;
b) at least k draws are needed?
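Since each draw is replaced, the draws are independent and each is black with probability M/(N+M), so the number of draws is Geom(M/(N+M)). A sketch with illustrative counts N = 6, M = 4 (these values are not from the slides):

```python
from fractions import Fraction

N, M = 6, 4             # illustrative counts of white and black balls
p = Fraction(M, N + M)  # P(black) on each (replaced) draw

def exactly(n):
    """a) first black ball on draw n."""
    return (1 - p) ** (n - 1) * p

def at_least(k):
    """b) the first k-1 draws are all white."""
    return (1 - p) ** (k - 1)

print(exactly(2))   # 6/25
print(at_least(3))  # 9/25
```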

