
Gamma Random Variable

A random variable that has the pdf described by the following f_X(x) is called a Gamma Random Variable with parameters λ and t (where λ > 0 and t > 0).

$$f_X(x) = \begin{cases} \dfrac{\lambda e^{-\lambda x}\,(\lambda x)^{t-1}}{\Gamma(t)} & x > 0 \\[4pt] 0 & x < 0 \end{cases}$$


Recall

The gamma function Γ(t) is defined as

$$\Gamma(t) = \int_0^{\infty} y^{t-1} e^{-y}\, dy \qquad (t > 0)$$

Integration by parts gives the recursion

$$\Gamma(t) = (t-1)\,\Gamma(t-1), \qquad \Gamma(1) = \int_0^{\infty} e^{-y}\, dy = 1$$

When t takes integer values, say n, Γ(n) = (n − 1)!
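The defining integral and the factorial identity can be sanity-checked numerically. The sketch below is my addition (not from the notes); it approximates the integral with a midpoint Riemann sum, using only the Python standard library:

```python
import math

def gamma_numeric(t, upper=60.0, steps=200_000):
    """Approximate Gamma(t) = int_0^inf y^(t-1) e^(-y) dy by a midpoint
    Riemann sum on [0, upper]; the integrand is negligible beyond y = 60."""
    h = upper / steps
    total = 0.0
    for i in range(steps):
        y = (i + 0.5) * h
        total += y ** (t - 1) * math.exp(-y)
    return total * h

# Gamma(n) = (n - 1)! for integer n
for n in (2, 3, 5):
    assert abs(gamma_numeric(n) - math.factorial(n - 1)) < 1e-3

# the recursion Gamma(t) = (t - 1) Gamma(t - 1) also holds for non-integer t
assert abs(gamma_numeric(3.5) - 2.5 * gamma_numeric(2.5)) < 1e-3
```

(Python's `math.gamma` computes the same function directly; the explicit sum is only to mirror the definition above.)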

A random variable that has a gamma distribution with integer values of t (say, n) is referred to as an Erlang (or n-Erlang) random variable with parameters λ and n.

$$f_X(x) = \begin{cases} \dfrac{\lambda e^{-\lambda x}\,(\lambda x)^{n-1}}{(n-1)!} & x > 0 \\[4pt] 0 & x < 0 \end{cases}$$
Gamma Random Variable (Contd.)
Recall a Gamma random variable has the pdf f_X(x) (where λ > 0 and t > 0).

$$f_X(x) = \begin{cases} \dfrac{\lambda e^{-\lambda x}\,(\lambda x)^{t-1}}{\Gamma(t)} & x > 0 \\[4pt] 0 & x < 0 \end{cases}$$

In the following diagram α is the same as t and λ = 1/4.


[Figure: Gamma pdfs f(x) for λ = 1/4 and several values of t. Source: Hogg, Tanis and Zimmermann]


Erlang Random Variable: Relation to Poisson
Let D_n be the time one has to wait for the nth occurrence of an event in a Poisson process with occurrence rate λ (where λ > 0).

If the number of occurrences of the event in time t, say N(t), is greater than or equal to n, then surely D_n must be less than or equal to t.

$$F_{D_n}(t) = P(D_n \le t) = P(N(t) \ge n) = \sum_{k=n}^{\infty} \frac{(\lambda t)^k e^{-\lambda t}}{k!}$$

f_{D_n}(t) can be obtained by differentiating the above. Hence,


$$f_{D_n}(t) = \sum_{k=n}^{\infty} \frac{k\lambda(\lambda t)^{k-1} e^{-\lambda t}}{k!} \;-\; \sum_{k=n}^{\infty} \frac{\lambda(\lambda t)^{k} e^{-\lambda t}}{k!}$$

$$= \frac{\lambda(\lambda t)^{n-1} e^{-\lambda t}}{(n-1)!} + \frac{\lambda(\lambda t)^{n} e^{-\lambda t}}{n!} + \cdots \;-\; \frac{\lambda(\lambda t)^{n} e^{-\lambda t}}{n!} - \frac{\lambda(\lambda t)^{n+1} e^{-\lambda t}}{(n+1)!} - \cdots$$

$$= \frac{\lambda e^{-\lambda t}\,(\lambda t)^{n-1}}{(n-1)!}$$
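The telescoping cancellation above can also be verified numerically. This is a sketch of my own (not from the notes): differentiate the Poisson tail P(N(t) ≥ n) by a central difference and compare with the Erlang pdf:

```python
import math

def poisson_tail(lam, t, n, kmax=100):
    """P(N(t) >= n) = sum_{k=n}^{inf} (lam t)^k e^(-lam t) / k!, truncated
    at kmax (the terms decay very fast for the values used here)."""
    mu = lam * t
    term = math.exp(-mu)          # k = 0 term
    total = term if n == 0 else 0.0
    for k in range(1, kmax):
        term *= mu / k            # now the k-th term
        if k >= n:
            total += term
    return total

def erlang_pdf(lam, t, n):
    """f_{D_n}(t) = lam e^(-lam t) (lam t)^(n-1) / (n-1)!"""
    return lam * math.exp(-lam * t) * (lam * t) ** (n - 1) / math.factorial(n - 1)

lam, n, t, h = 0.5, 3, 4.0, 1e-5  # arbitrary test values
deriv = (poisson_tail(lam, t + h, n) - poisson_tail(lam, t - h, n)) / (2 * h)
assert abs(deriv - erlang_pdf(lam, t, n)) < 1e-6
```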
Gamma Distribution: Relation to χ²_n Distribution
Recall a Gamma random variable has the pdf f_X(x) (where λ > 0 and t > 0).

$$f_X(x) = \begin{cases} \dfrac{\lambda e^{-\lambda x}\,(\lambda x)^{t-1}}{\Gamma(t)} & x > 0 \\[4pt] 0 & x < 0 \end{cases}$$

A Gamma random variable with λ = 1/2 and t = n/2 (where n is a positive integer) is said to have a chi-squared distribution with n degrees of freedom, or χ²_n.

The χ²_n distribution is an important distribution in statistical applications. We will see why later.
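As a quick illustration (my sketch, not from the notes): a Gamma random variable has mean t/λ, so a χ²_n variable should have mean (n/2)/(1/2) = n. The code below checks, for n = 4, that the Gamma pdf with λ = 1/2 and t = n/2 integrates to 1 and has mean n:

```python
import math

def chi2_pdf(x, n):
    """Gamma pdf with lam = 1/2 and t = n/2, i.e. the chi-squared_n density."""
    lam, t = 0.5, n / 2
    return lam * math.exp(-lam * x) * (lam * x) ** (t - 1) / math.gamma(t)

# midpoint-rule integration on (0, 100]; the chi^2_4 tail beyond 100 is negligible
n, upper, steps = 4, 100.0, 100_000
h = upper / steps
xs = [(i + 0.5) * h for i in range(steps)]
mass = sum(chi2_pdf(x, n) for x in xs) * h
mean = sum(x * chi2_pdf(x, n) for x in xs) * h
assert abs(mass - 1.0) < 1e-6   # the density integrates to 1
assert abs(mean - n) < 1e-3     # and has mean n
```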
Example 13.1
A roadside vendor selling puffed rice and fritters for people returning home from office always comes with 10 packets, sits till all the 10 are sold and then leaves. Given that customers arrive according to the Poisson distribution with arrival rate of 30 per hour and each customer buys exactly one packet, what is the probability that the vendor has to sit for (i) more than 10 minutes to sell the first two packets, and (ii) more than 1 hour before he/she can leave?
(a) λ = 1/2 customers per minute; n = 2; hence,

$$f_{D_2}(t) = \frac{\tfrac{1}{2} e^{-\tfrac{1}{2}t} \left(\tfrac{1}{2}t\right)^{2-1}}{(2-1)!} = \frac{t\, e^{-t/2}}{4}$$

So P(D_2 > 10) can be obtained as

$$\int_{10}^{\infty} \frac{t\, e^{-t/2}}{4}\, dt = 6e^{-5} \approx 0.04$$
(b) λ = 1/2 customers per minute; n = 10; hence,

$$f_{D_{10}}(t) = \frac{\tfrac{1}{2} e^{-\tfrac{1}{2}t} \left(\tfrac{1}{2}t\right)^{10-1}}{(10-1)!} = \frac{t^{9} e^{-t/2}}{2^{10}\,(9!)}$$
So P(D_10 > 60) can be obtained as

$$\frac{1}{2^{10}\,(9!)} \int_{60}^{\infty} t^{9} e^{-t/2}\, dt$$
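The two integrals can be cross-checked without integration by parts: by the Erlang-Poisson relation, P(D_n > t) = P(N(t) ≤ n − 1), a finite Poisson sum. A sketch (my addition, not part of the example):

```python
import math

def p_wait_exceeds(lam, n, t):
    """P(D_n > t) = P(fewer than n arrivals in [0, t]) for a Poisson
    process with rate lam."""
    mu = lam * t
    term, total = math.exp(-mu), 0.0
    for k in range(n):            # k = 0, 1, ..., n-1
        total += term
        term *= mu / (k + 1)
    return total

# (i)  lam = 1/2 per minute, n = 2,  t = 10 minutes
p1 = p_wait_exceeds(0.5, 2, 10)
assert abs(p1 - 6 * math.exp(-5)) < 1e-12   # equals 6 e^{-5}
# (ii) lam = 1/2 per minute, n = 10, t = 60 minutes
p2 = p_wait_exceeds(0.5, 10, 60)
assert 0 < p2 < 1e-4   # the vendor almost surely leaves within the hour
```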
Normal Random Variable
A random variable that has the pdf described by the following f_X(x) is called a Normal Random Variable with parameters µ and σ².

$$f_X(x) = \frac{1}{(2\pi)^{\frac{1}{2}}\sigma}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}} \qquad -\infty < x < \infty$$

This distribution is seen to arise frequently in real life, possibly because it can be shown that when many independent and identically distributed random variables are added, the sum tends to have a normal distribution (this is a loose statement of the Central Limit Theorem).

[Figure: the standard normal pdf, with peak value 1/√(2π) ≈ 0.399, and the corresponding cdf, which equals 0.5 at x = µ]
Normal Random Variable (Contd.)
F_X(x) has to be evaluated numerically. If working out problems by hand one uses tables to determine the area under the normal probability density function to the left of x. We will come to that soon.

But before that, a few properties:

The normal distribution is symmetric about µ.

If X is a normal random variable with parameters µ and σ² then the random variable Y = αX + β is distributed normally with parameters αµ + β and α²σ².

Taking α > 0,

$$F_Y(a) = P(Y \le a) = P(\alpha X + \beta \le a) = P\left(X \le \frac{a-\beta}{\alpha}\right) = F_X\left(\frac{a-\beta}{\alpha}\right)$$

$$= \int_{-\infty}^{\frac{a-\beta}{\alpha}} \frac{1}{(2\pi)^{\frac{1}{2}}\sigma}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}\, dx$$

Let y = αx + β; then the upper limit of the integral becomes y = a, x = (y − β)/α, and the above integral can be rewritten as:

$$F_Y(a) = \int_{-\infty}^{a} \frac{1}{(2\pi)^{\frac{1}{2}}\alpha\sigma}\, e^{-\frac{(y-(\beta+\alpha\mu))^2}{2\alpha^2\sigma^2}}\, dy \quad \text{or,} \quad f_Y(y) = \frac{1}{(2\pi)^{\frac{1}{2}}\alpha\sigma}\, e^{-\frac{(y-(\beta+\alpha\mu))^2}{2\alpha^2\sigma^2}}$$
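The change-of-variables result can be verified numerically. A sketch (my own, not from the notes; the parameter values are arbitrary, with α > 0): integrate both densities with a midpoint rule and confirm F_Y(a) = F_X((a − β)/α):

```python
import math

def normal_pdf(x, mu, sigma):
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (math.sqrt(2 * math.pi) * sigma)

def normal_cdf(x, mu, sigma, lo=-200.0, steps=200_000):
    """Midpoint-rule approximation of the N(mu, sigma^2) cdf at x."""
    h = (x - lo) / steps
    return sum(normal_pdf(lo + (i + 0.5) * h, mu, sigma) for i in range(steps)) * h

mu, sigma, alpha, beta, a = 1.0, 2.0, 3.0, -1.0, 4.0   # arbitrary test values
lhs = normal_cdf(a, alpha * mu + beta, alpha * sigma)  # F_Y(a) with Y ~ N(alpha*mu + beta, alpha^2 sigma^2)
rhs = normal_cdf((a - beta) / alpha, mu, sigma)        # F_X((a - beta) / alpha)
assert abs(lhs - rhs) < 1e-6
```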
Normal Random Variable (Contd.)
If X is a normal random variable with parameters µ and σ² then the random variable Y = αX + β is distributed normally with parameters αµ + β and α²σ².

It follows that, if α = 1/σ and β = −µ/σ, or alternatively Z = (X − µ)/σ, then Z is distributed normally with parameters 0 and 1. Such a random variable is said to have a standard normal distribution.

It is customary to denote F_Z(x) as Φ(x). That is,

$$F_Z(x) = \Phi(x) = \frac{1}{(2\pi)^{\frac{1}{2}}} \int_{-\infty}^{x} e^{-\frac{z^2}{2}}\, dz$$

Note,

$$\Phi(-x) = 1 - \Phi(x)$$

$$F_X(a) = P(X \le a) = P\left(\frac{X-\mu}{\sigma} \le \frac{a-\mu}{\sigma}\right) = P\left(Z \le \frac{a-\mu}{\sigma}\right) = \Phi\left(\frac{a-\mu}{\sigma}\right)$$
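In code, Φ needs no table: it can be written in terms of the error function as Φ(x) = (1 + erf(x/√2))/2, which the Python standard library provides. A sketch (my addition) that also checks the symmetry property:

```python
import math

def phi(x):
    """Standard normal cdf: Phi(x) = (1 + erf(x / sqrt(2))) / 2."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# symmetry: Phi(-x) = 1 - Phi(x)
for x in (0.5, 1.0, 1.96, 3.0):
    assert abs(phi(-x) - (1.0 - phi(x))) < 1e-12

# a familiar table entry
assert abs(phi(1.96) - 0.975) < 1e-3
```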
Normal and Binomial Distributions
If X is the number of successes in n trials (with probability of success = p) then it can be shown that as n → ∞,

$$P\left(a \le \frac{X - np}{\sqrt{np(1-p)}} \le b\right) \to \Phi(b) - \Phi(a)$$

Note, in this case X is a binomial random variable with parameters n and p.

For all practical purposes this is a good approximation when np(1 − p) > 10. (Recall, earlier we had seen the Poisson approximation of the binomial probabilities that worked well when n is large and np is moderate.)
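A quick numerical illustration (my sketch, not from the notes; the ±½ continuity correction is a standard refinement that the slide does not state): compare the exact binomial probability with the normal approximation for n = 100 and p = 1/2, where np(1 − p) = 25 > 10:

```python
import math

def phi(x):
    """Standard normal cdf via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def binom_prob(n, p, lo, hi):
    """Exact P(lo <= X <= hi) for X ~ Binomial(n, p)."""
    return sum(math.comb(n, k) * p ** k * (1 - p) ** (n - k)
               for k in range(lo, hi + 1))

n, p = 100, 0.5
mu, sd = n * p, math.sqrt(n * p * (1 - p))     # mu = 50, sd = 5
exact = binom_prob(n, p, 45, 55)
# De Moivre-Laplace approximation with a continuity correction of 1/2
approx = phi((55.5 - mu) / sd) - phi((44.5 - mu) / sd)
assert abs(exact - approx) < 0.005
```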

You might also like