
Lecture 13

Agenda
1. Geometric Random Variable

2. Negative Binomial Distribution

Geometric Random Variable


Consider an experiment which consists of repeating independent Bernoulli
trials until a success is obtained. Assume that the probability of success
in each independent trial is p.

Let X be the number of failures before the first success.

Range(X) = {0, 1, 2, 3, . . .}

P(X = x) = P(first x trials are failures and the (x + 1)-th trial is a success)
         = (1 − p)^x × p
         = p(1 − p)^x

Hence, summing the geometric series, we get the following equality:

\sum_{x=0}^{\infty} p(1 − p)^x = p \cdot \frac{1}{1 − (1 − p)} = 1
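As a quick numerical sanity check (my addition, not part of the original notes), the partial sums of the pmf can be computed in a few lines of Python; the value p = 0.3 is an arbitrary choice:

```python
# Check that the geometric pmf p(1-p)^x sums to 1 (a geometric series).
p = 0.3  # arbitrary success probability for illustration

pmf = [p * (1 - p) ** x for x in range(200)]
print(sum(pmf))  # ~1.0, matching the identity above
```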

Distribution function
Let’s calculate the cumulative distribution function. For x ≥ 0,

F(x) = P(X ≤ x)
     = P(you get a success at or before the (x + 1)-th trial)
     = 1 − P(first (x + 1) trials are all failures)
     = 1 − (1 − p)^{x+1}
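For instance, with p = 0.2 the chance of a success within the first five trials is F(4) = 1 − (0.8)^5 ≈ 0.672.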

Expectation and Variance

E(X) = \sum_{x=0}^{\infty} x p(1 − p)^x
     = pq + 2pq^2 + 3pq^3 + 4pq^4 + ...,  where q = 1 − p

Multiplying both sides by q,

qE(X) = pq^2 + 2pq^3 + 3pq^4 + ...

Hence, subtracting the two equations, we get

E(X) − qE(X) = pq + (2pq^2 − pq^2) + (3pq^3 − 2pq^3) + (4pq^4 − 3pq^4) + ...

(1 − q)E(X) = pq + pq^2 + pq^3 + pq^4 + ...

pE(X) = \frac{pq}{1 − q} = \frac{p(1 − p)}{p} = 1 − p

E(X) = \frac{1 − p}{p}

Arguing along the same lines, V(X) = (1 − p)/p^2.
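Here is a small simulation sketch (an illustration I am adding; the parameter value is arbitrary) that corroborates both formulas:

```python
import random

p = 0.25  # arbitrary success probability for the check

def sample_X(p):
    """Count failures before the first success in Bernoulli(p) trials."""
    failures = 0
    while random.random() >= p:  # failure occurs with probability 1 - p
        failures += 1
    return failures

n = 200_000
xs = [sample_X(p) for _ in range(n)]
mean = sum(xs) / n
var = sum((x - mean) ** 2 for x in xs) / n

print(mean, (1 - p) / p)       # both ~3.0
print(var, (1 - p) / p ** 2)   # both ~12.0
```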

Example
I am new to basketball and so I am practising shooting the ball through
the basket. If the probability of success at each throw is 0.2,
how many times would I fail on average before a success occurs?

Let X denote the number of failures before I get a success. Now I assume
that the individual throws are independent of each other, which might not
be exactly true in real life. Under that assumption, X ∼ Geometric(0.2).
So the average number of times I would fail before a success is
E(X) = (1 − 0.2)/0.2 = 4.

Memoryless Property of the Geometric Distribution


Theorem 1. Let i, j be positive integers and let X ∼ Geometric(p). Then

P(X ≥ i + j | X ≥ i) = P(X ≥ j)

Proof. First we notice that, for x > 0,

P(X ≥ x) = 1 − P(X ≤ x − 1)
         = 1 − (1 − (1 − p)^{(x−1)+1})
         = (1 − p)^x

Note that the formula holds for x = 0 also.

P(X ≥ i + j | X ≥ i) = \frac{P({X ≥ i + j} ∩ {X ≥ i})}{P(X ≥ i)}
                     = \frac{P(X ≥ i + j)}{P(X ≥ i)}
                     = \frac{(1 − p)^{i+j}}{(1 − p)^i}
                     = (1 − p)^j
                     = P(X ≥ j)

In other words, if I told you that there have been i failures initially, the
chance of at least j more failures before the first success is exactly the
same as if you had started the experiment afresh and the information about
the initial i failures had not been given to you.
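To make this concrete, here is a tiny numerical check (my own illustration, with arbitrarily chosen p, i and j) using the tail formula P(X ≥ x) = (1 − p)^x from the proof:

```python
p, i, j = 0.3, 4, 2  # arbitrary parameters for the check

def tail(x):
    """P(X >= x) = (1 - p)^x, as derived in the proof above."""
    return (1 - p) ** x

lhs = tail(i + j) / tail(i)  # P(X >= i + j | X >= i)
rhs = tail(j)                # P(X >= j)
print(lhs, rhs)              # both equal (1 - p)^j = 0.49
```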

CAUTION The textbook defines the Geometric random variable in a slightly
different way. You don’t have to worry about this, but if you are following
the textbook then you may remember that where we say X ∼ Geometric(p),
the textbook would say Y ∼ Geometric(p), with Y = X + 1. It’s a matter of
definition.
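The same off-by-one appears in software libraries. For instance, scipy.stats.geom follows the textbook convention (the number of the trial on which the first success occurs), so our X corresponds to its variable shifted down by one; a sketch, assuming scipy is available:

```python
from scipy.stats import geom

p = 0.2
# scipy's geom counts the trial of the first success: support {1, 2, ...}.
# Our X counts failures before the first success, i.e. X = Y - 1.
print(geom.pmf(3, p))      # textbook P(Y = 3) = (1 - p)^2 * p
print(p * (1 - p) ** 2)    # our P(X = 2) -- the same number
print(geom.mean(p) - 1)    # E(X) = E(Y) - 1 = (1 - p)/p = 4.0
```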

Negative Binomial Distribution


The Geometric random variable corresponds to the number of failures before
the first success in a sequence of independent Bernoulli trials. The Negative
Binomial Random Variable corresponds to the number of failures before
the r-th success, for some positive integer r. So let the Bernoulli trials have
probability of success p.

X = number of failures before the r-th success
X is our Negative Binomial random variable with parameters r and p.
We observe that Range(X) = {0, 1, 2, 3, . . .}.

Pmf
For x ≥ 0,

P(X = x) = P(there will be x failures before the r-th success)
         = P(in the first x + r − 1 trials there will be x failures and r − 1 successes,
             and the (x + r)-th trial is a success)
         = \binom{x + r − 1}{x} p^{r−1} (1 − p)^x × p
         = \binom{x + r − 1}{x} p^r (1 − p)^x
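Conveniently, scipy.stats.nbinom uses this same convention (number of failures before the r-th success), so the pmf formula above can be checked directly; the values of r, p and x below are arbitrary:

```python
from math import comb
from scipy.stats import nbinom

r, p, x = 3, 0.4, 5  # arbitrary values for the check

analytic = comb(x + r - 1, x) * p**r * (1 - p)**x
print(analytic)             # the pmf derived above
print(nbinom.pmf(x, r, p))  # same value from scipy
```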

Homework: We shall follow our definition of the Geometric Distribution.

• For X ∼ Geometric(p), find E(X(X − 1)) using methods similar to those
  used for E(X); then deduce the formula for V(X).

• 3.72, 3.78

• For 3.76, the expression should be P(Y ≥ y_0) ≥ 0.1

• For 3.77, you have to show P(Y = an even integer) = p/(1 − q^2)
