Random Variable
A random variable Y is said to be discrete if it can assume only a finite
or countably infinite number of distinct values. The probability that Y
takes on the value y, P(Y = y), is defined as the sum of the probabilities
of all sample points in S that are assigned the value y. We will sometimes
denote P (Y = y) by p(y). The probability distribution for a discrete
variable Y can be represented by a formula, a table, or a graph that
provides p(y) = P(Y = y) for all y with non-zero probability.
Expected Value
It is defined to be E(Y) = Σ_y y p(y) = µ. The expected value has the
following properties:
E(c) = c.
Proof: E(c) = Σ_y c p(y) = c Σ_y p(y) = c × 1 = c.
E(cg(Y)) = cE(g(Y)).
Proof: E(cg(Y)) = Σ_y c g(y) p(y) = c Σ_y g(y) p(y) = cE(g(Y)).
E(Σ_{i=1}^k g_i(Y)) = Σ_{i=1}^k E(g_i(Y)).
Proof: let k = 2; then
E(g_1(Y) + g_2(Y)) = Σ_y (g_1(y) + g_2(y)) p(y) = Σ_y g_1(y) p(y) + Σ_y g_2(y) p(y) = E(g_1(Y)) + E(g_2(Y)).
Variance
It is defined to be Var(Y) = E(Y − µ)^2 = E(Y^2) − µ^2.
Proof: Var(Y) = E(Y − µ)^2 = E(Y^2 − 2Yµ + µ^2) = E(Y^2) − 2µE(Y) + E(µ^2) = E(Y^2) − 2µ^2 + µ^2 = E(Y^2) − µ^2.
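The definition of the variance and the shortcut formula E(Y^2) − µ^2 can be checked numerically. A minimal sketch in Python, assuming an illustrative pmf (the values below are not from the text):

```python
# Verify E(Y), and that the definitional variance matches E(Y^2) - mu^2,
# for an assumed example pmf (values chosen purely for illustration).
pmf = {0: 0.2, 1: 0.5, 2: 0.3}  # p(y) for each value y; probabilities sum to 1

def expect(g, pmf):
    """E[g(Y)] = sum over y of g(y) * p(y)."""
    return sum(g(y) * p for y, p in pmf.items())

mu = expect(lambda y: y, pmf)               # E(Y) = 0*0.2 + 1*0.5 + 2*0.3
var = expect(lambda y: (y - mu) ** 2, pmf)  # definition: E(Y - mu)^2
shortcut = expect(lambda y: y ** 2, pmf) - mu ** 2  # E(Y^2) - mu^2
print(mu, var, shortcut)
```

The two variance computations agree to floating-point precision, as the proof above guarantees.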
A Binomial Variable
A random variable Y is said to have a binomial distribution based on n
trials with success probability p if and only if

p(y) = C(n, y) p^y q^{n−y},  y = 0, 1, 2, …, n,  where q = 1 − p.

From the moment generating function, E(Y) = M′(0) = np. To find the variance, we find E(Y^2) using the moment generating function, which gives Var(Y) = E(Y^2) − (E(Y))^2 = npq.
p(3) = C(10, 3)(0.4)^3(0.6)^7 ≈ 0.215.
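This pmf evaluation can be reproduced with Python's standard library (`math.comb` is the binomial coefficient); the helper name `binom_pmf` is ours:

```python
from math import comb

def binom_pmf(y, n, p):
    """Binomial pmf: C(n, y) * p^y * (1 - p)^(n - y)."""
    return comb(n, y) * p**y * (1 - p)**(n - y)

# p(3) for n = 10 trials with success probability p = 0.4
p3 = binom_pmf(3, 10, 0.4)
print(round(p3, 4))
```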
(b) What is the probability that at most two will favour Jones?
Sol:
P(Y ≤ 2) = p(0) + p(1) + p(2).
(c) What is the probability that at least three will favour Jones?
Sol:
P (Y ≥ 3) = 1 − P (Y ≤ 2)
Maram Salem & Noha Youssef (AUC) 9 / 32
(d) What is the probability that none will favour Jones?
Sol:
P(Y = 0) = (0.6)^10 ≈ 0.006.
(e) How many voters would you expect to favour Jones?
Sol:
E(Y) = np = 10 ∗ 0.4 = 4 voters.
(f) Find the standard deviation.
Sol:
sd = √(var(Y)) = √(npq) = √(10 ∗ 0.4 ∗ 0.6) = √2.4 ≈ 1.549.
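Parts (b) through (f) can all be checked numerically with the same n = 10, p = 0.4 from the example; a sketch (helper names are ours):

```python
from math import comb, sqrt

n, p = 10, 0.4  # 10 sampled voters, P(favour Jones) = 0.4

def pmf(y):
    """Binomial pmf C(n, y) p^y q^(n-y)."""
    return comb(n, y) * p**y * (1 - p)**(n - y)

at_most_two = sum(pmf(y) for y in range(3))  # (b) P(Y <= 2)
at_least_three = 1 - at_most_two             # (c) P(Y >= 3) = 1 - P(Y <= 2)
none_favour = pmf(0)                         # (d) P(Y = 0) = 0.6^10
mean = n * p                                 # (e) E(Y) = np
sd = sqrt(n * p * (1 - p))                   # (f) sqrt(npq) = sqrt(2.4)
print(at_most_two, at_least_three, none_favour, mean, sd)
```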
P(Y < 2) = p(0) + p(1) = C(5, 0)(1/3)^0(2/3)^5 + C(5, 1)(1/3)^1(2/3)^4 ≈ 0.461.
A Geometric Variable
A random variable Y is said to have a geometric probability distribution if
and only if
p(y) = q^{y−1} p,  y = 1, 2, 3, …,  0 ≤ p ≤ 1.
P (Y ≥ 5) = 1 − P (Y ≤ 4)
= 1 − (0.3 + 0.3 ∗ 0.7 + 0.3 ∗ 0.7^2 + 0.3 ∗ 0.7^3)
= 1 − 0.7599 = 0.2401.
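This tail probability can be verified numerically; as a cross-check, P(Y ≥ 5) also equals q^4, since the event says exactly that the first four trials all fail:

```python
p, q = 0.3, 0.7  # success and failure probabilities from the example

# Geometric pmf: p(y) = q^(y-1) * p for y = 1, 2, 3, ...
tail = 1 - sum(q**(y - 1) * p for y in range(1, 5))  # P(Y >= 5) = 1 - P(Y <= 4)
closed_form = q**4  # first four trials all fail
print(tail, closed_form)
```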
Example 3
(c) How many interviews would you expect to do to find the first
applicant with advanced training in programming? What is the
standard deviation?
Sol:
E(Y) = 1/p = 1/0.3 ≈ 3.33 and sd = √(q/p^2) = √(0.7/0.3^2) ≈ 2.79.
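A quick numerical check of the geometric mean and standard deviation formulas with p = 0.3 from the example:

```python
from math import sqrt

p = 0.3       # probability an applicant has advanced training
q = 1 - p

mean = 1 / p          # E(Y) = 1/p: expected number of interviews
sd = sqrt(q / p**2)   # sd = sqrt(q/p^2) = sqrt(q)/p
print(mean, sd)
```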
These results also follow from the moment generating function of the negative binomial distribution, M(t) = (pe^t)^r [1 − (1 − p)e^t]^{−r}. Differentiating,

M′(t) = (1 − p) r e^t (pe^t)^r [1 − (1 − p)e^t]^{−r−1} + p r e^t (pe^t)^{r−1} [1 − (1 − p)e^t]^{−r}.

At t = 0 this becomes ((1 − p)r + pr)/p, so

M′(0) = E(X) = r/p.

For the variance, σ^2 = E(X^2) − (E[X])^2, where E(X^2) is obtained from the second derivative

M″(t) = r (pe^t)^r (−r − 1)[1 − (1 − p)e^t]^{−r−2} (−(1 − p)e^t) + r^2 p e^t (pe^t)^{r−1} [1 − (1 − p)e^t]^{−r−1} + ⋯
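The identity M′(0) = E(X) = r/p can be sanity-checked by differentiating the mgf numerically; a sketch assuming illustrative values r = 3 and p = 0.4 (not from the text):

```python
from math import exp

def M(t, r=3, p=0.4):
    """Negative binomial mgf: (p e^t)^r * (1 - (1-p) e^t)^(-r)."""
    return (p * exp(t))**r * (1 - (1 - p) * exp(t))**(-r)

# Central difference approximates M'(0), which should equal r/p = 7.5 here
h = 1e-6
deriv_at_0 = (M(h) - M(-h)) / (2 * h)
print(deriv_at_0)
```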
A Hypergeometric Variable
A random variable Y is said to have a hypergeometric probability
distribution if and only if
p(y) = C(r, y) C(N − r, n − y) / C(N, n),

where y is an integer 0, 1, 2, …, n, subject to the restrictions y ≤ r and n − y ≤ N − r; that is, y = 0, 1, 2, …, min(r, n).
The hypergeometric distribution has mean E(Y) = nr/N and variance var(Y) = n (r/N) ((N − r)/N) ((N − n)/(N − 1)); its moment generating function does not exist in closed form.
P(Y > 1) = 1 − P(Y ≤ 1) = 1 − p(0) − p(1) = 1 − C(4, 0)C(16, 5)/C(20, 5) − C(4, 1)C(16, 4)/C(20, 5) ≈ 0.2487.
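The same computation with `math.comb`, reading N = 20, r = 4, n = 5 from the example:

```python
from math import comb

N, r, n = 20, 4, 5  # population size, "special" items, sample size

def hyper_pmf(y):
    """Hypergeometric pmf: C(r, y) C(N-r, n-y) / C(N, n)."""
    return comb(r, y) * comb(N - r, n - y) / comb(N, n)

p_more_than_one = 1 - hyper_pmf(0) - hyper_pmf(1)  # P(Y > 1)
print(round(p_more_than_one, 4))
```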
A Poisson Variable
A random variable Y is said to have a Poisson probability distribution if
and only if
p(y) = (λ^y / y!) e^{−λ},  y = 0, 1, 2, …,  λ > 0.
The expected value of the Poisson distribution is E(Y ) = λ and the
variance is var(Y ) = λ.
The moment generating function of the Poisson distribution is e^{λ(e^t − 1)}.
To get the mean from the moment generating function, we first differentiate:

M′(t) = e^{λ(e^t − 1)} · λe^t,

and then set t = 0:

M′(t)|_{t=0} = 1 · λ = λ.
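The step above can be sanity-checked by numerical differentiation of the mgf; λ = 2.5 below is an assumed illustrative rate:

```python
from math import exp

lam = 2.5  # assumed illustrative rate

def M(t):
    """Poisson mgf: e^{lambda (e^t - 1)}."""
    return exp(lam * (exp(t) - 1))

# Central difference approximates M'(0), which should equal lambda
h = 1e-6
mean_from_mgf = (M(h) - M(-h)) / (2 * h)
print(mean_from_mgf)
```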
As a rule of thumb: n > 20 and p < 0.05.
For situations in which n is large and p is very small, the Poisson
distribution can be used to approximate the binomial distribution. The
larger the n and the smaller the p, the better the approximation.
We break the right-hand side into three parts. The first part is

lim_{n→∞} (1/n^y) · n!/((n − y)! y!) = lim_{n→∞} n(n − 1) ⋯ (n − y + 1) / (n^y y!).

The numerator contains y factors, each of which, divided by n, has limit 1, so this part equals 1 × 1/y! = 1/y!.

The second part is

lim_{n→∞} (1 − λ/n)^n = e^{−λ}.

The third part, lim_{n→∞} (1 − λ/n)^{−y} = 1, follows because y is held fixed.
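The quality of the approximation can be inspected directly; n = 100 and p = 0.02 below are assumed illustrative values satisfying the rule of thumb:

```python
from math import comb, exp, factorial

# Compare binomial(n, p) with Poisson(lambda = np) for large n, small p.
n, p = 100, 0.02   # assumed illustrative values (n > 20, p < 0.05)
lam = n * p

def binom_pmf(y):
    return comb(n, y) * p**y * (1 - p)**(n - y)

def poisson_pmf(y):
    return lam**y * exp(-lam) / factorial(y)

# Largest pointwise gap over the most probable values of Y
max_gap = max(abs(binom_pmf(y) - poisson_pmf(y)) for y in range(11))
print(max_gap)
```

The gap shrinks further as n grows and p shrinks with np held fixed, as the limit argument predicts.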
P (Y ≥ 2) = 1 − P (Y ≤ 1).
(3) What is the probability that exactly five customers arrive?
Sol:
P (Y = 5).
If the moment generating function of Y is given as e^{3(e^t − 1)}, find P(Y < 2).
Sol:
λ = 3, and P(Y < 2) = P(Y ≤ 1) = p(0) + p(1) = e^{−3} + 3e^{−3} = 4e^{−3} ≈ 0.199.
By the uniqueness of the mgf, the distribution is Poisson with λ = 3.
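A sketch of this computation, with λ = 3 read off the given mgf:

```python
from math import exp

lam = 3  # identified from the mgf e^{3(e^t - 1)} by uniqueness

p0 = exp(-lam)           # p(0) = e^{-3}
p1 = lam * exp(-lam)     # p(1) = 3 e^{-3}
p_less_than_2 = p0 + p1  # P(Y < 2) = P(Y <= 1) = 4 e^{-3}
print(round(p_less_than_2, 4))
```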