A random variable that has the pdf described by the following f_X(x) is called a
Gamma Random Variable with parameters λ and t (where λ > 0 and t > 0).

f_X(x) = \begin{cases} \dfrac{\lambda e^{-\lambda x} (\lambda x)^{t-1}}{\Gamma(t)}, & x > 0 \\ 0, & x < 0 \end{cases}
Recall,

\Gamma(t) = \int_0^{\infty} y^{t-1} e^{-y}\, dy \qquad (t > 0)

\Gamma(t) = (t-1)\,\Gamma(t-1)

\Gamma(1) = \int_0^{\infty} e^{-y}\, dy = 1
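As a quick numerical check of the definitions above, the following sketch (helper name `gamma_pdf` is illustrative, not from the notes) evaluates the Gamma pdf with Python's built-in `math.gamma`, verifies the recurrence Γ(t) = (t − 1)Γ(t − 1), and confirms that the density integrates to approximately 1:

```python
import math

def gamma_pdf(x, lam, t):
    """Gamma pdf: lam * exp(-lam*x) * (lam*x)**(t-1) / Gamma(t) for x > 0, else 0."""
    if x <= 0:
        return 0.0
    return lam * math.exp(-lam * x) * (lam * x) ** (t - 1) / math.gamma(t)

# Check the recurrence Gamma(t) = (t - 1) * Gamma(t - 1) for a few values of t.
for t in (2.0, 3.5, 7.0):
    assert math.isclose(math.gamma(t), (t - 1) * math.gamma(t - 1))

# The pdf should integrate to 1; a crude Riemann sum over (0, 40) suffices here.
lam, t = 2.0, 3.0
dx = 0.001
total = sum(gamma_pdf(k * dx, lam, t) * dx for k in range(1, 40000))
```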
A random variable that has a gamma distribution with integer values of t (say, n) is
referred to as an Erlang (or n-Erlang) random variable with parameters λ and n.

f_X(x) = \begin{cases} \dfrac{\lambda e^{-\lambda x} (\lambda x)^{n-1}}{(n-1)!}, & x > 0 \\ 0, & x < 0 \end{cases}
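Since Γ(n) = (n − 1)! for positive integers n, the Erlang pdf is exactly the Gamma pdf at integer t. A short sketch confirming this (the helper names are illustrative):

```python
import math

def gamma_pdf(x, lam, t):
    if x <= 0:
        return 0.0
    return lam * math.exp(-lam * x) * (lam * x) ** (t - 1) / math.gamma(t)

def erlang_pdf(x, lam, n):
    """Erlang pdf: same as the Gamma pdf but with (n-1)! in place of Gamma(t)."""
    if x <= 0:
        return 0.0
    return lam * math.exp(-lam * x) * (lam * x) ** (n - 1) / math.factorial(n - 1)

# For integer t = n the two densities agree, because Gamma(n) = (n - 1)!.
for x in (0.5, 1.0, 2.5):
    assert math.isclose(erlang_pdf(x, 1.5, 4), gamma_pdf(x, 1.5, 4))
```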
Gamma Random Variable (Contd.)

Recall a Gamma random variable has the pdf f_X(x) (where λ > 0 and t > 0).

f_X(x) = \begin{cases} \dfrac{\lambda e^{-\lambda x} (\lambda x)^{t-1}}{\Gamma(t)}, & x > 0 \\ 0, & x < 0 \end{cases}

[Figure: plots of the Gamma pdf over x from 0 to 30 for different parameter values.]
Let D_n denote the arrival time of the n-th event of a Poisson process with rate λ, and N(t) the number of events up to time t. Then

F_n(t) = P(D_n \le t) = P(N(t) \ge n) = \sum_{k=n}^{\infty} \frac{(\lambda t)^k e^{-\lambda t}}{k!}

Differentiating term by term,

f_n(t) = \sum_{k=n}^{\infty} \left[ \frac{\lambda (\lambda t)^{k-1} e^{-\lambda t}}{(k-1)!} - \frac{\lambda (\lambda t)^{k} e^{-\lambda t}}{k!} \right]

= \frac{\lambda (\lambda t)^{n-1} e^{-\lambda t}}{(n-1)!} - \frac{\lambda (\lambda t)^{n} e^{-\lambda t}}{n!} + \frac{\lambda (\lambda t)^{n} e^{-\lambda t}}{n!} - \frac{\lambda (\lambda t)^{n+1} e^{-\lambda t}}{(n+1)!} + \cdots

= \frac{\lambda e^{-\lambda t} (\lambda t)^{n-1}}{(n-1)!}

so D_n is an n-Erlang random variable with parameters λ and n.
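The identity F_n(t) = P(N(t) ≥ n) can be checked numerically: the Poisson tail sum should agree with a numerical integral of the Erlang pdf from 0 to t. A minimal sketch (parameter values chosen for illustration):

```python
import math

lam, n, t = 2.0, 3, 1.5

# Poisson-tail form: P(N(t) >= n) = sum_{k>=n} (lam*t)^k e^{-lam*t} / k!
mu = lam * t
tail = 1.0 - sum(mu ** k * math.exp(-mu) / math.factorial(k) for k in range(n))

# Numerical integral of the n-Erlang pdf over (0, t).
def erlang_pdf(x):
    return lam * math.exp(-lam * x) * (lam * x) ** (n - 1) / math.factorial(n - 1)

dx = 1e-4
cdf = sum(erlang_pdf(k * dx) * dx for k in range(1, int(t / dx) + 1))

# The two routes to F_n(t) should match to within the integration error.
assert abs(tail - cdf) < 1e-3
```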
Gamma Distribution: Relation to the χ²_n Distribution

Recall a Gamma random variable has the pdf f_X(x) (where λ > 0 and t > 0).

f_X(x) = \begin{cases} \dfrac{\lambda e^{-\lambda x} (\lambda x)^{t-1}}{\Gamma(t)}, & x > 0 \\ 0, & x < 0 \end{cases}

A Gamma random variable with λ = 1/2 and t = n/2 (where n is a positive integer) is said
to have a chi-squared distribution with n degrees of freedom, denoted χ²_n.
This distribution is seen to arise frequently in real life, possibly because it can be
shown that when many independent and identically distributed random variables are
added, the sum tends to have a normal distribution (this is a loose statement of
the Central Limit Theorem).
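A quick sanity check of the χ²_n definition (a sketch; the helper names are illustrative): for n = 2 the parameters become λ = 1/2, t = 1, so the χ²_2 density should reduce to the exponential density (1/2)e^{−x/2}.

```python
import math

def gamma_pdf(x, lam, t):
    if x <= 0:
        return 0.0
    return lam * math.exp(-lam * x) * (lam * x) ** (t - 1) / math.gamma(t)

def chi2_pdf(x, n):
    """Chi-squared with n degrees of freedom = Gamma with lam = 1/2, t = n/2."""
    return gamma_pdf(x, 0.5, n / 2.0)

# For n = 2 the chi-squared density is (1/2) e^{-x/2}, an Exponential(1/2).
for x in (0.5, 1.0, 3.0):
    assert math.isclose(chi2_pdf(x, 2), 0.5 * math.exp(-x / 2.0))
```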
[Figure: plot of the normal probability density function, centered at µ, with standard-normal peak height ≈ 0.399.]
Normal Random Variable (Contd.)

F_X(x) has to be evaluated numerically. If working out problems by hand, one uses
tables to determine the area under the normal probability density function to the left
of x. We will come to that soon.

But before that, a few properties. Let X be a normal random variable with mean µ and
variance σ², and let Y = αX + β (with α > 0). Then

F_Y(a) = P(Y \le a) = P(\alpha X + \beta \le a) = P\left(X \le \frac{a - \beta}{\alpha}\right) = \int_{-\infty}^{(a-\beta)/\alpha} \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-(x-\mu)^2 / 2\sigma^2}\, dx

Let y = αx + β; then the upper limit of the integral becomes a, x = (y − β)/α,
dx = dy/α, and the above integral can be rewritten as:

F_Y(a) = \int_{-\infty}^{a} \frac{1}{\sqrt{2\pi}\,\alpha\sigma}\, e^{-(y-(\beta+\alpha\mu))^2 / 2\alpha^2\sigma^2}\, dy

That is, Y = αX + β is itself normal, with mean αµ + β and variance α²σ².
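The conclusion of the derivation can be checked by simulation: transform normal samples by Y = αX + β and compare the sample mean and standard deviation of Y against the claimed αµ + β and ασ. A minimal Monte Carlo sketch (parameter values are arbitrary):

```python
import math
import random

random.seed(0)
mu, sigma = 1.0, 2.0
alpha, beta = 3.0, -1.0

# Draw X ~ N(mu, sigma^2), form Y = alpha*X + beta, and estimate Y's moments.
ys = [alpha * random.gauss(mu, sigma) + beta for _ in range(200_000)]
mean = sum(ys) / len(ys)
var = sum((y - mean) ** 2 for y in ys) / len(ys)

# Expect mean close to alpha*mu + beta = 2 and std close to alpha*sigma = 6.
assert abs(mean - (alpha * mu + beta)) < 0.1
assert abs(math.sqrt(var) - alpha * sigma) < 0.1
```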
Normal Random Variable (Contd.)

Note, where Φ denotes the standard normal cdf,

\Phi(-x) = 1 - \Phi(x)
Normal and Binomial Distributions

If X is the number of successes in n trials (with probability of success = p) then it
can be shown that, as n → ∞,

P\left( a \le \frac{X - np}{\sqrt{np(1-p)}} \le b \right) \to \Phi(b) - \Phi(a)

Note, in this case X is a binomial random variable with parameters n and p.

For all practical purposes this is a good approximation when np(1 − p) > 10. (Recall,
earlier we had seen the Poisson approximation of the binomial probabilities, which worked well when n is large
and np is moderate.)
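A sketch of the approximation in action, using `math.erf` to compute Φ. The ±0.5 continuity correction is a standard refinement not stated above; the example values (n = 100, p = 0.5, so np(1 − p) = 25 > 10) are chosen for illustration:

```python
import math

def phi(x):
    """Standard normal cdf, via the error function: Phi(x) = (1 + erf(x/sqrt(2)))/2."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

n, p = 100, 0.5
mu = n * p
sd = math.sqrt(n * p * (1 - p))

# Exact P(45 <= X <= 55) for X ~ Binomial(100, 0.5).
exact = sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(45, 56))

# Normal approximation with a continuity correction of +-0.5.
approx = phi((55.5 - mu) / sd) - phi((44.5 - mu) / sd)

# With np(1-p) = 25 the two should agree closely.
assert abs(exact - approx) < 0.01
```

Note also that `phi` satisfies the symmetry Φ(−x) = 1 − Φ(x) from the previous slide.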