Chapter 3
Discrete Random Variables and Probability Distributions

Instructor: Ivo Dinov
http://www.stat.ucla.edu/~dinov/
3.1 Random Variables
3.2 Probability Distributions for Discrete Random Variables
The cumulative distribution function (cdf): F(x) = P(X ≤ x) = Σ_{y: y ≤ x} p(y)
Proposition
For any two numbers a and b with a ≤ b,
P(a ≤ X ≤ b) = F(b) − F(a−),
where "a−" represents the largest possible X value that is strictly less than a.
Note: for integer-valued X,
P(a ≤ X ≤ b) = F(b) − F(a − 1)
Example:
x          −3    −2    −1    0     1     2     3
P(X = x)   0.13  0.15  0.17  0.20  0.15  0.11  0.09

Find:
a. P(X ≤ 0) = 0.13 + 0.15 + 0.17 + 0.20 = 0.65
b. P(−2 ≤ X ≤ 1) = 0.15 + 0.17 + 0.20 + 0.15 = 0.67
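The two probabilities above can be checked in a few lines of Python (a sketch; the x values −3, …, 3 are inferred from the stated answers):

```python
# pmf for the table above; keys are the assumed x values -3..3
pmf = {-3: 0.13, -2: 0.15, -1: 0.17, 0: 0.20, 1: 0.15, 2: 0.11, 3: 0.09}

assert abs(sum(pmf.values()) - 1.0) < 1e-9  # a valid pmf sums to 1

p_a = sum(p for x, p in pmf.items() if x <= 0)        # P(X <= 0)
p_b = sum(p for x, p in pmf.items() if -2 <= x <= 1)  # P(-2 <= X <= 1)
print(round(p_a, 2), round(p_b, 2))  # 0.65 0.67
```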
3.3 Expected Values of Discrete Random Variables
Example:
x       0     1     2     3
p(x)    1/8   5/8   1/8   1/8

E(X) = Σ_x x·P(x) = 0·(1/8) + 1·(5/8) + 2·(1/8) + 3·(1/8) = 10/8 = 1.25
E(X) = μ_X = Σ_{x∈D} x·p(x)
Example:
x          0     1     2     3     4     5     6
P(X = x)   0.08  0.28  0.38  0.16  0.06  0.03  0.01

E(X) = x1·p1 + x2·p2 + … + xn·pn
     = 0(0.08) + 1(0.28) + 2(0.38) + 3(0.16) + 4(0.06) + 5(0.03) + 6(0.01)
     = 1.97
About 2 credit cards
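The expected-value sum above is easy to verify (a sketch using the credit-card pmf as given):

```python
# pmf for the number of credit cards a student carries, x = 0..6
pmf = {0: 0.08, 1: 0.28, 2: 0.38, 3: 0.16, 4: 0.06, 5: 0.03, 6: 0.01}

ev = sum(x * p for x, p in pmf.items())  # E(X) = sum of x * p(x)
print(round(ev, 2))  # 1.97
```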
Example:
Value x      12    18    20    22    24    25
Frequency    1     2     4     1     2     3
Probability  .08   .15   .31   .08   .15   .23

μ = Σ x·p(x) = 21

Shortcut formula:
V(X) = σ² = [Σ_{x∈D} x²·p(x)] − μ² = E(X²) − [E(X)]²

For this example: V(X) = 13.25, so σ = √13.25 ≈ 3.64
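The shortcut formula can be checked numerically (a sketch; it uses the slide's rounded probabilities, frequency/13 rounded to two decimals, which is what reproduces V(X) = 13.25):

```python
values = [12, 18, 20, 22, 24, 25]
freqs  = [1, 2, 4, 1, 2, 3]
n = sum(freqs)                              # 13 observations in total
probs = [round(f / n, 2) for f in freqs]    # rounded probabilities, as on the slide

mu  = sum(x * p for x, p in zip(values, probs))       # E(X)
ex2 = sum(x * x * p for x, p in zip(values, probs))   # E(X^2)
var = ex2 - mu ** 2          # shortcut: V(X) = E(X^2) - [E(X)]^2
print(round(mu, 2), round(var, 2))  # 20.97 13.25
```

With the exact fractions f/13 instead of the rounded values, μ is exactly 21 and the variance comes out slightly lower (≈13.08).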
V(aX + b) = a²·σ_X²  and  SD(aX + b) = |a|·σ_X

This leads to the following:
V(aX) = a²·σ_X²,  SD(aX) = |a|·σ_X,  and  V(X + b) = σ_X²
Rules of Variance
1. σ_X = √(σ_X²) = √V(X)
2. V(X) = p1·(x1 − μ)² + p2·(x2 − μ)² + … + pn·(xn − μ)²

E(aX + b) = a·E(X) + b = a·μ_X + b
E(aX + b) = a·E(X) + b
SD(aX + b) = |a|·SD(X)

Proof:
E(aX + b) = Σ_{x=0}^{n} (a·x + b)·P(X = x)
          = a·Σ_{x=0}^{n} x·P(X = x) + b·Σ_{x=0}^{n} P(X = x)
          = a·E(X) + b·1 = a·E(X) + b.
E.g., say the rules for the game of chance we saw before change, and the new pay-off is {$0, $1.50, $3}, with probabilities {0.6, 0.3, 0.1}, as before. What is the expected return of the new game? Remember, the old expectation was equal to the entrance fee of $1.50, and the game was fair!

Y = 3(X − 1)/2 maps the old pay-offs {$1, $2, $3} to the new ones {$0, $1.50, $3}, so
E(Y) = (3/2)·E(X) − 3/2 = 3/4 = $0.75,
and the game became clearly biased. Note how easy it is to compute E(Y).
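A quick numerical check of the linearity rule for this game (a sketch; X is the old pay-off, Y = 3(X − 1)/2 the new one):

```python
payoffs = [1.0, 2.0, 3.0]   # old pay-offs in dollars
probs   = [0.6, 0.3, 0.1]

ex = sum(x * p for x, p in zip(payoffs, probs))               # E(X) = 1.5
ey = sum(3 * (x - 1) / 2 * p for x, p in zip(payoffs, probs)) # direct E(Y)
ey_rule = 3 / 2 * ex - 3 / 2                                  # (3/2)E(X) - 3/2

print(round(ex, 2), round(ey, 2), round(ey_rule, 2))  # 1.5 0.75 0.75
```

Both routes give E(Y) = $0.75, as the rule E(aX + b) = a·E(X) + b promises.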
• Variances:
Independent variables {X1, X2, X3, …, X10}: variances add up,
Var(X1 + X2 + X3 + … + X10) = Var(X1) + Var(X2) + Var(X3) + … + Var(X10)
Dependent variables {X1, X2}: the variance is contingent on the variable dependences.
E.g., if X2 = 2·X1 + 5, then Var(X1 + X2) = Var(3·X1 + 5) = 9·Var(X1).

3.4 The Binomial Probability Distribution
Binomial Experiment
An experiment for which the following
four conditions are satisfied is called a
binomial experiment.
1. The experiment consists of a
sequence of n trials, where n is fixed
in advance of the experiment.
Binomial Experiment
Suppose each trial of an experiment can
result in S or F, but the sampling is
without replacement from a population of
size N. If the sample size n is at most 5%
of the population size, the experiment can
be analyzed as though it were exactly a
binomial experiment.
Computation of a Binomial pmf

b(x; n, p) = C(n, x)·p^x·(1 − p)^(n−x),  x = 0, 1, 2, …, n
b(x; n, p) = 0 otherwise

Example: b(1; 4, 1/4) = C(4, 1)·(1/4)¹·(3/4)³ ≈ 0.422
The binomial cdf: B(x; n, p) = P(X ≤ x) = Σ_{y=0}^{x} b(y; n, p)
Stat 110A, UCLA, Ivo Dinov
Example: n = 5, p = 1/4:
μ = n·p = 5·(1/4) = 1.25
V(X) = n·p·q = 5·(1/4)·(3/4) = 0.9375
Example: each of n = 8 students passes with probability p = 0.82, independently.
b. none pass:
P(X = 0) = C(8, 0)·(0.82)⁰·(0.18)⁸ ≈ 0.0000011
c. at least 6 pass:
P(X ≥ 6) = C(8, 6)·(0.82)⁶·(0.18)² + C(8, 7)·(0.82)⁷·(0.18)¹ + C(8, 8)·(0.82)⁸·(0.18)⁰ ≈ 0.839
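The binomial computations above can be reproduced directly from the pmf formula (a sketch; only `math.comb` from the standard library is needed):

```python
from math import comb

def binom_pmf(x, n, p):
    # b(x; n, p) = C(n, x) p^x (1 - p)^(n - x)
    return comb(n, x) * p**x * (1 - p)**(n - x)

# n = 8 trials, success ("pass") probability p = 0.82
p_none = binom_pmf(0, 8, 0.82)                         # ~1.1e-6
p_at_least_6 = sum(binom_pmf(x, 8, 0.82) for x in (6, 7, 8))
print(round(p_at_least_6, 3))  # 0.839
```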
3.5 Hypergeometric and Negative Binomial Distributions
Hypergeometric Distribution
If X is the number of S's in a completely random sample of size n drawn from a population consisting of M S's and (N − M) F's, then the probability distribution of X, called the hypergeometric distribution, is given by

P(X = x) = h(x; n, M, N) = [C(M, x)·C(N − M, n − x)] / C(N, n),
for max(0, n − N + M) ≤ x ≤ min(n, M)

Hypergeometric Mean and Variance
E(X) = n·(M/N)
V(X) = [(N − n)/(N − 1)]·n·(M/N)·(1 − M/N)
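A minimal sketch of the hypergeometric pmf and its mean; the population numbers (N = 20, M = 12, n = 5) are illustrative choices, not from the slides:

```python
from math import comb

def hypergeom_pmf(x, n, M, N):
    # h(x; n, M, N) = C(M, x) C(N - M, n - x) / C(N, n)
    return comb(M, x) * comb(N - M, n - x) / comb(N, n)

N, M, n = 20, 12, 5                 # illustrative: 12 S's in a population of 20
mean_formula = n * M / N            # E(X) = n * (M/N)
mean_from_pmf = sum(x * hypergeom_pmf(x, n, M, N)
                    for x in range(max(0, n - N + M), min(n, M) + 1))
print(round(mean_formula, 4), round(mean_from_pmf, 4))  # 3.0 3.0
```

Summing x·h(x; n, M, N) over the support reproduces the closed-form mean n·(M/N).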
nb(x; r, p) = C(x + r − 1, r − 1)·p^r·(1 − p)^x,  x = 0, 1, 2, …

Negative Binomial Mean and Variance
E(X) = r·(1 − p)/p
V(X) = r·(1 − p)/p²
As N → ∞ with M/N → p, HyperGeom(x; N, n, M) approaches Bin(x; n, p).
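The convergence of the hypergeometric pmf to the binomial pmf can be watched numerically (a sketch; p = 0.3, n = 5, and the growing N values are illustrative choices):

```python
from math import comb

def hyper(x, n, M, N):
    # hypergeometric pmf h(x; n, M, N)
    return comb(M, x) * comb(N - M, n - x) / comb(N, n)

def binom(x, n, p):
    # binomial pmf b(x; n, p)
    return comb(n, x) * p**x * (1 - p)**(n - x)

# Fix p = M/N = 0.3 and n = 5; grow the population size N.
for N in (20, 200, 2000):
    M = int(0.3 * N)
    print(N, round(hyper(2, 5, M, N), 4), round(binom(2, 5, 0.3), 4))
```

As N grows, the hypergeometric column approaches the fixed binomial value, matching the limit statement above.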
Geometric, Hypergeometric, Negative Binomial

• Negative binomial pmf [X ~ NegBin(r, p); if r = 1, this is Geometric(p)]:
P(X = x) = (1 − p)^x·p  (geometric case, r = 1)
X counts the number of failures (x) before the r-th success ("negative", since the number of successes (r) is fixed and the number of trials is random):
P(X = x) = C(x + r − 1, r − 1)·p^r·(1 − p)^x
E(X) = r·(1 − p)/p;  Var(X) = r·(1 − p)/p²
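A sketch checking the negative binomial mean against its pmf; the parameters r = 3, p = 0.4 are illustrative, not from the slides:

```python
from math import comb

def nb_pmf(x, r, p):
    # P(X = x) = C(x + r - 1, r - 1) p^r (1 - p)^x,  x = 0, 1, 2, ...
    return comb(x + r - 1, r - 1) * p**r * (1 - p)**x

r, p = 3, 0.4
# Truncate the infinite sum; the tail beyond x = 500 is negligible here.
mean = sum(x * nb_pmf(x, r, p) for x in range(500))
print(round(mean, 4), round(r * (1 - p) / p, 4))  # 4.5 4.5
```

With r = 1 the pmf collapses to the geometric pmf (1 − p)^x·p, as stated above.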
3.6 The Poisson Probability Distribution
Poisson Distribution Mean and Variance
If X has a Poisson distribution with parameter λ, then
E(X) = V(X) = λ.
Poisson Process
3 Assumptions:
1. There exists a parameter α > 0 such that for any short time interval of length Δt, the probability that exactly one event is received is α·Δt + o(Δt).
Poisson Distribution
P_k(t) = e^(−αt)·(αt)^k / k!, so that the number of pulses (events) during a time interval of length t is a Poisson rv with parameter λ = αt. The expected number of pulses (events) during any such time interval is αt, so the expected number during a unit time interval is α.
[Figure: nuclear-medicine imaging example (¹⁵O tracer); image source: http://www.nucmed.buffalo.edu]
E(Y) = Σ_{k=0}^{∞} k·(λ^k·e^(−λ))/k!
     = λ·e^(−λ)·Σ_{k=1}^{∞} λ^(k−1)/(k − 1)!
     = λ·e^(−λ)·Σ_{k=0}^{∞} λ^k/k!
     = λ·e^(−λ)·e^λ = λ
σ_Y² = Var(Y) = Σ_{k=0}^{∞} (k − λ)²·(λ^k·e^(−λ))/k! = … = λ
As n → ∞ and p_n → 0 with n·p_n → λ:
b(y; n, p_n) → e^(−λ)·λ^y/y!
• Thus, Binomial(n, p_n) → Poisson(λ)
The approximation is adequate when n ≥ 100, p ≤ 0.01, and λ = n·p ≤ 20.
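The quality of the Poisson approximation under this rule of thumb is easy to inspect (a sketch; n = 100, p = 0.01 are chosen to just satisfy the conditions, giving λ = 1):

```python
from math import comb, exp, factorial

n, p = 100, 0.01          # satisfies n >= 100, p <= 0.01; lam = n*p = 1 <= 20
lam = n * p

# Compare the exact binomial pmf with the Poisson pmf for small y.
for y in range(4):
    b = comb(n, y) * p**y * (1 - p)**(n - y)      # exact binomial
    pois = exp(-lam) * lam**y / factorial(y)      # Poisson approximation
    print(y, round(b, 4), round(pois, 4))
```

Each printed pair of values agrees to about two decimal places, as the rule of thumb suggests.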
Example: λ = 2.5.
P(Z ≤ 2) = Σ_{z=0}^{2} e^(−2.5)·2.5^z/z!
         = (2.5⁰/0!)·e^(−2.5) + (2.5¹/1!)·e^(−2.5) + (2.5²/2!)·e^(−2.5) ≈ 0.544
P(Z > 2) = 1 − P(Z ≤ 2) = 1 − 0.544 = 0.456
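The λ = 2.5 tail computation above can be verified directly (a sketch using only the standard library):

```python
from math import exp, factorial

lam = 2.5
# P(Z <= 2): sum the Poisson pmf over z = 0, 1, 2
p_le_2 = sum(exp(-lam) * lam**z / factorial(z) for z in range(3))
print(round(p_le_2, 3), round(1 - p_le_2, 3))  # 0.544 0.456
```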