Probability distributions.
(1) Binomial distribution
Binomial Experiment:
1) It consists of n trials.
2) Each trial results in one of two possible outcomes, say success ("S") or failure ("F").
3) The probability of getting a certain outcome, say "S", remains the same from trial to trial, say P("S") = p.
4) The trials are independent; that is, the outcomes of previous trials do not affect the outcomes of the upcoming trials.
f(x) = P(X = x) = \binom{n}{x} p^x (1-p)^{n-x}, \quad x = 0, 1, \ldots, n

Here \binom{n}{x} = \frac{n!}{x!\,(n-x)!}
Eg. An exam consists of 10 multiple choice questions. Each question has 4 possible
choices. Only 1 is correct. Jeff did not study for the exam. So he just guesses at the
right answer for each question (pure guess, not an educated guess). What is his
chance of passing the exam? That is, to make at least 6 correct answers.
Answer: Yes, this is a binomial experiment with n = 10, p = 0.25, and "S" = choosing the right answer on a question.

P(\text{pass}) = P(X \ge 6) = \sum_{x=6}^{10} \binom{10}{x} (0.25)^x (0.75)^{10-x} \approx 0.0197

So Jeff has less than a 2% chance of passing the exam by pure guessing.
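As a quick numerical check, the passing probability can be computed by summing the binomial pmf directly (a sketch in Python; `math.comb` gives the binomial coefficient):

```python
from math import comb

# P(pass) = P(X >= 6) for X ~ B(n = 10, p = 0.25): sum the pmf
# f(x) = C(n, x) * p**x * (1 - p)**(n - x) over x = 6, ..., 10.
n, p = 10, 0.25
p_pass = sum(comb(n, x) * p**x * (1 - p)**(n - x) for x in range(6, n + 1))
print(round(p_pass, 4))  # ≈ 0.0197
```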
Bernoulli Distribution

X ~ Bernoulli(p). It can take on two possible values, say success (S, coded as 1) or failure (F, coded as 0), with probability p and (1 - p) respectively.

That is,

P(X = S) = p, \quad P(X = F) = 1 - p

f(x) = P(X = x) = p^x (1-p)^{1-x}, \quad x = 0, 1

A binomial random variable can be written as the sum of n Bernoulli(p) random variables; that is, the X_i's are independently and identically distributed (i.i.d.).
The cumulative distribution function (c.d.f.):

If X is continuous:

F(x) = P(X \le x) = \int_{-\infty}^{x} f(t)\,dt

If X is discrete:

F(x) = P(X \le x) = \sum_{t \le x} P(X = t)

For continuous X, the p.d.f. is recovered by differentiation:

f(x) = [F(x)]' = \frac{d}{dx} F(x)

P(a \le X \le b) = \int_{a}^{b} f(x)\,dx

Discrete random variable (for which the p.d.f. is also called the probability mass function, p.m.f.):

f(x) = P(X = x)

P(a \le X \le b) = \sum_{a \le x \le b} P(X = x)
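The discrete relations can be sanity-checked numerically: the c.d.f. is a running sum of the pmf, and an interval probability can be computed either by summing the pmf or by differencing the c.d.f. A small sketch using the binomial pmf as the example:

```python
from math import comb

n, p = 10, 0.25
pmf = lambda x: comb(n, x) * p**x * (1 - p)**(n - x)

# Discrete c.d.f.: F(x) = P(X <= x) = sum of the pmf over t <= x.
F = lambda x: sum(pmf(t) for t in range(0, x + 1))

# P(a <= X <= b) two ways: direct pmf sum, and F(b) - F(a - 1).
a, b = 2, 5
direct = sum(pmf(x) for x in range(a, b + 1))
via_cdf = F(b) - F(a - 1)
```

Both computations agree, and F(n) = 1 confirms the pmf sums to one over all possible values.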
2. Mathematical Expectation.
Continuous random variable: E[g(X)] = \int_{-\infty}^{\infty} g(x) f(x)\,dx

Discrete random variable: E[g(X)] = \sum_{\text{all } x} g(x) P(X = x)
Special case:

1) (population) Mean: \mu = E(X) = \int_{-\infty}^{\infty} x f(x)\,dx

2) (population) Variance:

\sigma^2 = E[(X - \mu)^2] = \int_{-\infty}^{\infty} (x - \mu)^2 f(x)\,dx = E(X^2) - [E(X)]^2

*** The above mean and variance formulas are for continuous X. Replace the integral with the summation (over all possible values of X) if X is discrete.

Derivation of the shortcut formula:

Var(X) = E[(X - \mu)^2] = E(X^2 - 2\mu X + \mu^2) = E(X^2) - 2\mu E(X) + \mu^2 = E(X^2) - \mu^2 = E(X^2) - [E(X)]^2
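The shortcut formula can be verified numerically for any discrete pmf: computing the variance from its definition E[(X - mu)^2] and from E(X^2) - [E(X)]^2 gives the same answer. A minimal sketch (the values and probabilities below are made up for illustration):

```python
# A small discrete pmf, chosen arbitrarily for illustration.
xs = [0, 1, 2, 3]
probs = [0.1, 0.2, 0.3, 0.4]

mu = sum(x * p for x, p in zip(xs, probs))                  # E(X)
ex2 = sum(x**2 * p for x, p in zip(xs, probs))              # E(X^2)
var_def = sum((x - mu)**2 * p for x, p in zip(xs, probs))   # E[(X - mu)^2]
var_short = ex2 - mu**2                                     # shortcut formula
```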
The moment generating function (m.g.f.) of X is M_X(t) = E(e^{tX}), where t is a parameter.

First population moment: E(X) = \frac{d}{dt} M_X(t) \Big|_{t=0}

Second population moment: E(X^2) = \frac{d^2}{dt^2} M_X(t) \Big|_{t=0}
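These derivative-at-zero formulas can be illustrated with finite differences. Below, the m.g.f. of a Bernoulli(p) variable, M(t) = pe^t + (1 - p) (a standard fact, derived from its pmf), is differentiated numerically at t = 0; both moments of a Bernoulli equal p:

```python
from math import exp

# M(t) = p*e^t + (1 - p) is the Bernoulli(p) m.g.f.
p = 0.3
M = lambda t: p * exp(t) + (1 - p)

h = 1e-4
first_moment = (M(h) - M(-h)) / (2 * h)            # central difference ≈ M'(0) = E(X) = p
second_moment = (M(h) - 2 * M(0) + M(-h)) / h**2   # ≈ M''(0) = E(X^2) = p
```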
For the normal distribution, X ~ N(\mu, \sigma^2),

f(x) = \frac{1}{\sqrt{2\pi}\,\sigma} e^{-\frac{(x-\mu)^2}{2\sigma^2}}, \quad -\infty < x < \infty

M_X(t) = \int_{-\infty}^{\infty} e^{tx} f(x)\,dx = e^{\mu t + \frac{\sigma^2 t^2}{2}}
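The closed form of the normal m.g.f. can be checked by Monte Carlo: the sample average of e^{tX} over many draws from N(mu, sigma^2) should approach exp(mu*t + sigma^2*t^2/2). A sketch (sample size and tolerance are illustrative):

```python
import numpy as np

# Monte Carlo check of the normal m.g.f. at a fixed t.
rng = np.random.default_rng(0)
mu, sigma, t = 1.0, 2.0, 0.5
x = rng.normal(mu, sigma, size=500_000)
mc = np.mean(np.exp(t * x))                         # sample estimate of E[e^{tX}]
closed_form = np.exp(mu * t + sigma**2 * t**2 / 2)  # the stated formula
```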
If X_1 and X_2 are independent, then

f_{X_1, X_2}(x_1, x_2) = f_{X_1}(x_1) f_{X_2}(x_2), thus

M_{X_1 + X_2}(t) = E(e^{t(X_1 + X_2)}) = \iint e^{t(x_1 + x_2)} f_{X_1, X_2}(x_1, x_2)\,dx_1\,dx_2
= \int e^{tx_1} f_{X_1}(x_1)\,dx_1 \int e^{tx_2} f_{X_2}(x_2)\,dx_2 = M_{X_1}(t)\,M_{X_2}(t)
Note: One could use this property to identify the probability distribution
based on the moment generating function.
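For discrete variables the same factorization can be checked exactly by enumerating the joint pmf. A sketch with two independent Bernoulli variables (parameters chosen arbitrarily):

```python
from math import exp

# X1 ~ Bernoulli(p1), X2 ~ Bernoulli(p2), independent.
p1, p2, t = 0.3, 0.6, 0.7

M1 = p1 * exp(t) + (1 - p1)   # m.g.f. of X1
M2 = p2 * exp(t) + (1 - p2)   # m.g.f. of X2

# E[e^{t(X1 + X2)}] computed directly from the joint pmf,
# which factors because X1 and X2 are independent.
M_sum = sum(
    exp(t * (x1 + x2)) * (p1 if x1 else 1 - p1) * (p2 if x2 else 1 - p2)
    for x1 in (0, 1) for x2 in (0, 1)
)
```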
m.g.f. of X ~ B(n, p):

M_X(t) = E(e^{tX}) = \sum_{x=0}^{n} e^{tx} \binom{n}{x} p^x (1-p)^{n-x}
= \sum_{x=0}^{n} \binom{n}{x} (pe^t)^x (1-p)^{n-x}
= [pe^t + (1-p)]^n
2. In addition, we have (*make sure you know how to derive these expectations):

E(X) = np

Var(X) = np(1-p)
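These two facts are easy to confirm by direct summation over the pmf, replacing the integrals in the mean and variance formulas with sums (a sketch; n and p are arbitrary):

```python
from math import comb

# Check E(X) = np and Var(X) = np(1 - p) for X ~ B(n, p) by pmf summation.
n, p = 10, 0.25
pmf = [comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)]

mean = sum(x * f for x, f in enumerate(pmf))            # E(X)
var = sum(x**2 * f for x, f in enumerate(pmf)) - mean**2  # E(X^2) - [E(X)]^2
```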
Exercise:
1. Please show that the sum of n independent Bernoulli random variables, each with success probability p, follows B(n, p). Please show the entire derivation for full credit.
Solution:
Let X = \sum_{i=1}^{n} X_i, where the X_i's are i.i.d. Bernoulli(p), each with m.g.f. M_{X_i}(t) = pe^t + (1-p).

Then, by independence,

M_X(t) = E(e^{tX}) = E(e^{t \sum X_i}) = M_{X_1}(t)\,M_{X_2}(t) \cdots M_{X_n}(t) = [M_{X_1}(t)]^n = [pe^t + (1-p)]^n

Since the above m.g.f. is the same as the m.g.f. for B(n, p), and the m.g.f. uniquely determines the distribution, we have X ~ B(n, p), which completes the proof.
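The same fact can be illustrated empirically: simulate many sums of n i.i.d. Bernoulli(p) draws and compare the observed frequencies with the B(n, p) pmf. A sketch (sample size and tolerance are illustrative):

```python
import numpy as np
from math import comb

rng = np.random.default_rng(42)
n, p, reps = 10, 0.25, 200_000

# Each row holds n i.i.d. Bernoulli(p) draws; the row sums should be ~ B(n, p).
sums = rng.binomial(1, p, size=(reps, n)).sum(axis=1)

# Largest gap between the empirical frequencies and the B(n, p) pmf.
max_err = max(
    abs(np.mean(sums == x) - comb(n, x) * p**x * (1 - p)**(n - x))
    for x in range(n + 1)
)
```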